There is no support for s3->s3 multipart copies at this time.

.. _ref_s3transfer_usage:

Usage
=====

The simplest way to use this module is:

.. code-block:: python

    client = boto3.client('s3', 'us-west-2')
    transfer = S3Transfer(client)
    # Upload /tmp/myfile to s3://bucket/key
    transfer.upload_file('/tmp/myfile', 'bucket', 'key')

    # Download s3://bucket/key to /tmp/myfile
    transfer.download_file('bucket', 'key', '/tmp/myfile')

The ``upload_file`` and ``download_file`` methods also accept an optional
``extra_args`` dictionary of keyword arguments, which is forwarded through
to the corresponding client operation.  Here are a few examples using
``upload_file``::

    # Making the object public
    transfer.upload_file('/tmp/myfile', 'bucket', 'key',
                         extra_args={'ACL': 'public-read'})

    # Setting metadata
    transfer.upload_file('/tmp/myfile', 'bucket', 'key',
                         extra_args={'Metadata': {'a': 'b', 'c': 'd'}})

    # Setting content type
    transfer.upload_file('/tmp/myfile.json', 'bucket', 'key',
                         extra_args={'ContentType': "application/json"})

The ``S3Transfer`` class also supports progress callbacks so you can
provide transfer progress to users.  Both the ``upload_file`` and
``download_file`` methods take an optional ``callback`` parameter.  Here's
an example of how to print a simple progress percentage to the user:

.. code-block:: python

    import os
    import sys
    import threading


    class ProgressPercentage(object):
        def __init__(self, filename):
            self._filename = filename
            self._size = float(os.path.getsize(filename))
            self._seen_so_far = 0
            self._lock = threading.Lock()

        def __call__(self, bytes_amount):
            # To simplify we'll assume this is hooked up
            # to a single filename.
            with self._lock:
                self._seen_so_far += bytes_amount
                percentage = (self._seen_so_far / self._size) * 100
                sys.stdout.write(
                    "\r%s  %s / %s  (%.2f%%)" % (
                        self._filename, self._seen_so_far,
                        self._size, percentage))
                sys.stdout.flush()


    transfer = S3Transfer(boto3.client('s3', 'us-west-2'))
    # Upload /tmp/myfile to s3://bucket/key and print upload progress.
    transfer.upload_file('/tmp/myfile', 'bucket', 'key',
                         callback=ProgressPercentage('/tmp/myfile'))

You can also provide a ``TransferConfig`` object to the ``S3Transfer``
object to get more fine-grained control over the transfer.  For example:

.. code-block:: python

    client = boto3.client('s3', 'us-west-2')
    config = TransferConfig(
        multipart_threshold=8 * 1024 * 1024,
        max_concurrency=10,
        num_download_attempts=10,
    )
    transfer = S3Transfer(client, config)
    transfer.upload_file('/tmp/foo', 'bucket', 'key')
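``download_file`` accepts the same ``callback`` parameter, but for a
download the total size has to come from S3 rather than the local
filesystem.  The variation below is a sketch, not part of the library: the
``DownloadProgressPercentage`` class is hypothetical and simply pairs the
documented ``callback`` hook with a ``head_object`` size lookup.

.. code-block:: python

    import sys
    import threading

    import boto3


    class DownloadProgressPercentage(object):
        def __init__(self, client, bucket, key):
            self._label = '%s/%s' % (bucket, key)
            # For a download the object's size must be fetched from S3.
            self._size = float(
                client.head_object(Bucket=bucket, Key=key)['ContentLength'])
            self._seen_so_far = 0
            self._lock = threading.Lock()

        def __call__(self, bytes_amount):
            with self._lock:
                self._seen_so_far += bytes_amount
                percentage = (self._seen_so_far / self._size) * 100
                sys.stdout.write(
                    "\r%s  %s / %s  (%.2f%%)" % (
                        self._label, self._seen_so_far,
                        self._size, percentage))
                sys.stdout.flush()


    client = boto3.client('s3', 'us-west-2')
    transfer = S3Transfer(client)
    # Download s3://bucket/key to /tmp/myfile and print download progress.
    transfer.download_file(
        'bucket', 'key', '/tmp/myfile',
        callback=DownloadProgressPercentage(client, 'bucket', 'key'))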
Module internals
================

The rest of the module implements ``S3Transfer`` and its helpers.  It sets
``__author__`` to ``'Amazon Web Services'`` and ``__version__`` to
``'0.5.0'`` and installs a ``NullHandler`` on its logger.
``random_file_extension()`` produces a random hex suffix for temporary
download files, ``disable_upload_callbacks()`` and
``enable_upload_callbacks()`` toggle progress callbacks on a request body
for the ``PutObject`` and ``UploadPart`` operations, and
``QueueShutdownError`` is raised by the I/O queue described below.

``ReadFileChunk`` wraps an open file object and exposes just the
``chunk_size`` bytes starting at ``start_byte`` as its own file-like
object.  Trying to read past the end of the chunk behaves as if you had
reached the end of the file, ``full_file_size`` records the entire content
length of the underlying file, and an optional ``callback(amount_read)``
is invoked whenever data is read from the chunk.  The ``from_filename()``
classmethod builds a chunk straight from a filename, ``enable_callback()``
and ``disable_callback()`` control whether reads report progress, and
``read()``, ``seek()``, ``tell()`` and ``__len__()`` all work relative to
``start_byte`` so the chunk can be handed to the client as the body of a
single upload part.
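For a sense of how ``ReadFileChunk`` is driven, here is a short sketch.
It assumes the ``from_filename()`` signature described above and a file at
``/tmp/myfile``; the class is an internal helper, so treat this as
illustration rather than a supported public API.

.. code-block:: python

    from s3transfer import ReadFileChunk

    # Expose the first 1024 bytes of a local file as a file-like object.
    chunk = ReadFileChunk.from_filename(
        '/tmp/myfile',
        start_byte=0,
        chunk_size=1024,
        callback=lambda amount: print('read %d bytes' % amount),
    )

    print(len(chunk))    # chunk_size, capped at the actual file size
    data = chunk.read()  # reads at most chunk_size bytes, firing the callback
    chunk.close()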
``StreamReaderProgress`` wraps a streaming response body: ``read()``
delegates to the wrapped stream and reports the number of bytes returned
to an optional callback.  ``OSUtils`` collects the filesystem operations
the transfer classes rely on: ``get_file_size()``,
``open_file_chunk_reader()``, ``open()``, ``remove_file()`` (a no-op if
the file does not exist) and ``rename_file()``.

``MultipartUploader`` drives parallel multipart uploads.  It creates the
multipart upload, splits the file into ``multipart_chunksize``-sized
``ReadFileChunk`` parts, uploads the parts on a thread pool of
``max_concurrency`` workers, and completes the upload with the collected
ETags and part numbers.  Only the ``extra_args`` keys that ``UploadPart``
itself accepts (``SSECustomerKey``, ``SSECustomerAlgorithm``,
``SSECustomerKeyMD5``, ``RequestPayer``) are forwarded to the per-part
calls.  If any part fails, the multipart upload is aborted and an
``S3UploadFailedError`` is raised.

``ShutdownQueue`` is a queue implementation that can be shut down: it adds
a ``trigger_shutdown()`` method that makes all subsequent calls to
``put()`` fail with ``QueueShutdownError``.  It purposefully deviates from
``queue.Queue`` and is not meant to be a drop-in replacement for it.
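That behaviour is small enough to show in isolation.  The class below is a
simplified, self-contained stand-in rather than the module's own
implementation (which also logs the shutdown), so it can be run as is.

.. code-block:: python

    import queue
    import threading


    class QueueShutdownError(Exception):
        pass


    class ShutdownQueue(queue.Queue):
        """Simplified stand-in: put() fails once trigger_shutdown() runs."""

        def _init(self, maxsize):
            self._shutdown = False
            self._shutdown_lock = threading.Lock()
            queue.Queue._init(self, maxsize)

        def trigger_shutdown(self):
            with self._shutdown_lock:
                self._shutdown = True

        def put(self, item):
            with self._shutdown_lock:
                if self._shutdown:
                    raise QueueShutdownError(
                        "Cannot put item to queue when queue has been "
                        "shutdown.")
            return queue.Queue.put(self, item)


    q = ShutdownQueue(maxsize=10)
    q.put((0, b'first chunk'))       # accepted
    q.trigger_shutdown()
    try:
        q.put((1, b'late chunk'))    # rejected: the queue is shut down
    except QueueShutdownError as exc:
        print('rejected:', exc)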
``MultipartDownloader`` does the reverse for large downloads.  It runs two
futures: one splits the object into ``multipart_chunksize``-sized parts
and downloads them on ``max_concurrency`` threads, each part issuing a
``get_object`` call with a computed ``Range`` header, wrapping the
streaming body in ``StreamReaderProgress`` and pushing ``(offset, data)``
tuples onto a ``ShutdownQueue`` bounded by ``max_io_queue``; the other
future consumes that queue, seeking and writing into the target file until
it receives a shutdown sentinel.  Each ranged request is retried up to
``num_download_attempts`` times on ``socket.timeout``, ``socket.error``,
``ReadTimeoutError`` and ``IncompleteReadError`` before
``RetriesExceededError`` is raised, and a failure in the writer thread
shuts the queue down so the downloaders stop producing.
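The ``Range`` header arithmetic is worth seeing on its own.  The helper
below mirrors the logic of ``MultipartDownloader._calculate_range_param``
as described above; the standalone function name and the example numbers
are mine, for illustration only.

.. code-block:: python

    def calculate_range_param(part_size, part_index, num_parts):
        # Each part covers [start, start + part_size - 1]; the last part
        # leaves the end open so it runs to the end of the object.
        start_range = part_index * part_size
        if part_index == num_parts - 1:
            end_range = ''
        else:
            end_range = start_range + part_size - 1
        return 'bytes=%s-%s' % (start_range, end_range)


    # A 20 MB object downloaded in 8 MB parts needs three ranged requests:
    part_size = 8 * 1024 * 1024
    print(calculate_range_param(part_size, 0, 3))  # bytes=0-8388607
    print(calculate_range_param(part_size, 1, 3))  # bytes=8388608-16777215
    print(calculate_range_param(part_size, 2, 3))  # bytes=16777216-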
``TransferConfig`` is a plain container for the tuning knobs used above:
``multipart_threshold``, ``max_concurrency``, ``num_download_attempts``,
``multipart_chunksize`` and ``max_io_queue``.

``S3Transfer`` ties it all together.  It validates ``extra_args`` against
the whitelists in ``ALLOWED_UPLOAD_ARGS`` and ``ALLOWED_DOWNLOAD_ARGS``
(upload keys such as ``ACL``, ``Metadata``, ``ContentType`` and the
server-side-encryption settings; download keys such as ``VersionId`` and
the SSE-C settings) and raises a ``ValueError`` for anything else.  Both
``upload_file()`` and ``download_file()`` note in their docstrings that
variants have also been injected into the S3 client, ``Bucket`` and
``Object``, so you don't have to use ``S3Transfer.upload_file()`` or
``S3Transfer.download_file()`` directly.

``upload_file()`` registers ``request-created.s3`` handlers that switch
the body's progress callback off and back on while each request is being
created, then either issues a single ``put_object`` or hands the file to
``MultipartUploader``, depending on whether its size reaches
``multipart_threshold``.

``download_file()`` first asks ``head_object`` for the object's
``ContentLength``, downloads to a temporary file named with a random hex
extension, and renames it into place only on success, removing the partial
file otherwise.  Objects at or above ``multipart_threshold`` go through
``MultipartDownloader``; smaller objects use a single ``get_object`` whose
streaming body is copied to disk through ``StreamReaderProgress``,
retrying up to ``num_download_attempts`` times on ``socket.timeout``,
``socket.error``, ``ReadTimeoutError`` (from ``urllib3``) and
``IncompleteReadError`` (from ``botocore``), and raising
``s3transfer.exceptions.RetriesExceededError`` once the attempts are used
up.
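Because botocore cannot retry a streaming download on its own, the retry
loop around ``get_object`` is the part most worth seeing in one piece.
The function below is a self-contained paraphrase of the logic described
above, not the method itself; the real code also goes through ``OSUtils``,
a temporary file and the progress callback.

.. code-block:: python

    import socket

    from botocore.exceptions import IncompleteReadError
    from s3transfer.exceptions import RetriesExceededError
    from urllib3.exceptions import ReadTimeoutError


    def download_with_retries(client, bucket, key, filename,
                              num_download_attempts=5):
        # Streaming GETs cannot be resumed, so the whole request is
        # re-issued whenever the body dies mid-read.
        last_exception = None
        for _ in range(num_download_attempts):
            try:
                response = client.get_object(Bucket=bucket, Key=key)
                with open(filename, 'wb') as f:
                    for chunk in iter(
                            lambda: response['Body'].read(8192), b''):
                        f.write(chunk)
                return
            except (socket.timeout, socket.error,
                    ReadTimeoutError, IncompleteReadError) as e:
                last_exception = e
        raise RetriesExceededError(last_exception)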