
Answers

3

Run the following code in a separate, empty cell:

%%storage read --object <path-to-gcs-bucket>/my_pickle_file.pkl --variable test_pickle_var 

Then run the following code:

import pickle
from io import BytesIO

pickle.load(BytesIO(test_pickle_var))

I used the code below to upload a pandas DataFrame to Google Cloud Storage as a pickled file and to read it back:

from datalab.context import Context 
import datalab.storage as storage 
import pandas as pd 
from io import BytesIO 
import pickle 

df = pd.DataFrame(data=[[1,2,3],[4,5,6]], columns=['a','b','c'])

# Create a local pickle file 
df.to_pickle('my_pickle_file.pkl') 

# Create a bucket in GCS 
sample_bucket_name = Context.default().project_id + '-datalab-example' 
sample_bucket_path = 'gs://' + sample_bucket_name 
sample_bucket = storage.Bucket(sample_bucket_name) 
if not sample_bucket.exists(): 
    sample_bucket.create() 

# Write pickle to GCS 
sample_item = sample_bucket.item('my_pickle_file.pkl') 
with open('my_pickle_file.pkl', 'rb') as f: 
    sample_item.write_to(bytearray(f.read()), 'application/octet-stream') 

# Read Method 1 - Read pickle from GCS using %storage read (note single % for line magic) 
path_to_pickle_in_gcs = sample_bucket_path + '/my_pickle_file.pkl' 
%storage read --object $path_to_pickle_in_gcs --variable remote_pickle_1 
df_method1 = pickle.load(BytesIO(remote_pickle_1)) 
print(df_method1) 

# Read Alternate Method 2 - Read pickle from GCS using storage.Bucket.item().read_from() 
remote_pickle_2 = sample_bucket.item('my_pickle_file.pkl').read_from() 
df_method2 = pickle.load(BytesIO(remote_pickle_2)) 
print(df_method2) 
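If you are working outside of Cloud Datalab, the same round trip can be done with the standalone google-cloud-storage client instead of datalab.storage. This is a minimal sketch, assuming the google-cloud-storage package is installed and credentials are configured; the bucket name my-example-bucket is a hypothetical placeholder:

import pickle
from io import BytesIO
import pandas as pd
from google.cloud import storage

df = pd.DataFrame(data=[[1,2,3],[4,5,6]], columns=['a','b','c'])

# Point at a (hypothetical) bucket and object
client = storage.Client()
blob = client.bucket('my-example-bucket').blob('my_pickle_file.pkl')

# Write: serialize the DataFrame to bytes and upload them
blob.upload_from_string(pickle.dumps(df), content_type='application/octet-stream')

# Read: download the bytes and unpickle them
# (download_as_bytes() needs a recent client version; older versions use download_as_string())
df_roundtrip = pickle.load(BytesIO(blob.download_as_bytes()))
print(df_roundtrip)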

Note: there is a known issue where the %storage command does not work if it is the first line in a cell. Put a comment or a line of Python code on the first line.
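For example, a cell laid out like this avoids the issue (a minimal illustration of the workaround above, reusing the variable names from the earlier code):

# comment on the first line so that %storage is not the first line of the cell
%storage read --object $path_to_pickle_in_gcs --variable remote_pickle_1
df_method1 = pickle.load(BytesIO(remote_pickle_1))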

+1

Thanks. I tried using %%storage with pickle.load. Somehow it does not work for me. Does it work for you? An alternative would also be fine, i.e. a working workaround. –

+0

I am not sure the problem is with pickle itself. When I try to read the data by Python means, everything works fine, and I do use BytesIO. However, when I try to use the storage command, nothing happens. –

+1

Could you try the provided sample code (StringIO) to confirm that it works on your end? Please share a code snippet that does not behave as expected, to help with troubleshooting. –