
I have a seemingly simple scenario: I am using Python Dataflow to query data from BigQuery, and Google Cloud Dataflow raises an AssertionError while reading rows through the BigQuery reader in Python.

When the BigQuery query returns zero rows, I get an AssertionError; the script and the error are shown below. I would like to know whether this is a bug, or whether there is a recommended way to handle zero rows from the BigQuery reader in Python Dataflow.

Dataflow script:

import apache_beam as beam
from apache_beam.io import WriteToText
from apache_beam.options.pipeline_options import PipelineOptions, SetupOptions
from apache_beam.typehints import Any, Dict

pipeline_options = PipelineOptions(pipeline_args)
pipeline_options.view_as(SetupOptions).save_main_session = True
p = beam.Pipeline(options=pipeline_options)
BIGQUERY_ROW_TYPE = Dict[str, Any]

# construct a BigQuery SQL query string
query_sql = Query().build_sql()
lines = p \
     | 'read from bigquery' >> beam.io.Read(beam.io.BigQuerySource(query=query_sql, validate=True)).with_output_types(BIGQUERY_ROW_TYPE) \
     | 'write to test' >> WriteToText(known_args.output)

result = p.run()
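
(For context: pipeline_args and known_args in the script come from the usual Beam argument-parsing pattern, and Query().build_sql() is a project-specific helper that returns the SQL string. Below is a minimal sketch of how those two variables are typically produced; the --output flag name is an assumption, not taken from the original script.)

import argparse

# Sketch of the standard Beam argument-parsing boilerplate that yields
# known_args and pipeline_args; the --output flag is an assumption here.
parser = argparse.ArgumentParser()
parser.add_argument('--output', required=True, help='Output path for WriteToText')
known_args, pipeline_args = parser.parse_known_args()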

The error I see when the query returns zero rows:

(98b5a6e4c0cd002e): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 581, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 166, in execute
    op.start()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/native_operations.py", line 48, in start
    for value in reader:
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/nativefileio.py", line 186, in __iter__
    for eof, record, delta_offset in self.read_records():
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/nativeavroio.py", line 102, in read_records
    assert block.num_records() > 0
AssertionError

2017-06-27 (13:55:58) Workflow failed. Causes: (7390b72dc5ceedb6): S04:read from bigquery+write to test/Write/WriteImpl/Wr...
(bb74ab934e658b06): Workflow failed. Causes: (7390b72dc5ceedb6): S04:read
    from bigquery+write to test/Write/WriteImpl/WriteBundles/Do+write to
    test/Write/WriteImpl/Pair+write to
    test/Write/WriteImpl/WindowInto(WindowIntoFn)+write to
    test/Write/WriteImpl/GroupByKey/Reify+write to
    test/Write/WriteImpl/GroupByKey/Write failed.
Do you have an actual query you can share? – Pablo

I believe this is a bug in the Python Dataflow worker. The Avro specification does not require a block to have a non-zero record count, but the worker incorrectly enforces that requirement. We are working on a fix. – jkff

@jkff any news on this? – Goranek

Answer

This was a bug and it has been fixed (per @jkff). The fix will ship in the next Dataflow release, which should be out in roughly 3-5 weeks.
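
Until that release is available, one possible stop-gap (a sketch, not part of the original answer) is to check whether the query returns any rows using a recent google-cloud-bigquery client and only launch the Dataflow job when it does; query_sql and p below refer to the objects built in the question's script:

import logging
from google.cloud import bigquery

def query_returns_rows(sql):
    # Run the query synchronously and report whether it produced any rows.
    client = bigquery.Client()
    rows = client.query(sql).result()  # blocks until the query job completes
    return rows.total_rows > 0

if query_returns_rows(query_sql):
    result = p.run()  # the BigQuery source will see at least one row
else:
    logging.info('Query returned zero rows; skipping the Dataflow job.')

Note that this runs the query twice (once for the check, once inside Dataflow), so it doubles the query cost; a cheaper variant would wrap the SQL in a SELECT COUNT(*) for the pre-check.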
