2015-06-19 129 views

I successfully authenticate with AWS and upload a file using the 'put_object' method on a Bucket object. Now I want to use the multipart API to do this for large files. I found the accepted answers to these questions: How to save S3 object to a file using boto3 and Python Boto3 AWS multipart upload syntax.

But when I try to implement it, I get "unknown method" errors. What am I doing wrong? My code is below. Thanks!

## Get an AWS Session
self.awsSession = Session(aws_access_key_id=accessKey,
                          aws_secret_access_key=secretKey,
                          aws_session_token=session_token,
                          region_name=region_type)

...

# Upload the file to S3
s3 = self.awsSession.resource('s3')
s3.Bucket('prodbucket').put_object(Key=fileToUpload, Body=data)  # WORKS
#s3.Bucket('prodbucket').upload_file(dataFileName, 'prodbucket', fileToUpload) # DOESNT WORK
#s3.upload_file(dataFileName, 'prodbucket', fileToUpload) # DOESNT WORK

Have you seen the new high-level file-upload interface added to boto3? See https://boto3.readthedocs.org/en/latest/reference/customizations/s3.html#module-boto3.s3.transfer for details; it makes multipart uploads much easier. – garnaat

Answers


The upload_file method has not yet been ported to the bucket resource. For now you need to use the client object directly:

client = self.awsSession.client('s3') 
client.upload_file(...) 

Thanks, this worked! – PhilBot


The Libcloud S3 wrapper transparently handles splitting your file into parts and uploading them.

Use the upload_object_via_stream method to do so:

from libcloud.storage.types import Provider
from libcloud.storage.providers import get_driver

# Path to a very large file you want to upload
FILE_PATH = '/home/user/myfile.tar.gz'

cls = get_driver(Provider.S3)
driver = cls('api key', 'api secret key')

container = driver.get_container(container_name='my-backups-12345')

# This method blocks until all the parts have been uploaded.
extra = {'content_type': 'application/octet-stream'}

with open(FILE_PATH, 'rb') as iterator:
    obj = driver.upload_object_via_stream(iterator=iterator,
                                          container=container,
                                          object_name='backup.tar.gz',
                                          extra=extra)

For the official documentation on S3 multipart uploads, see the AWS Official Blog.