2016-11-25

Spark Streaming - error when reading from Kinesis

I am new to Apache Spark Streaming and am trying to get Spark to read values from a Kinesis stream. Here is my Python script:

import settings
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kinesis import KinesisUtils, InitialPositionInStream

# local[2]: one core for the Kinesis receiver, one for batch processing
spark_context = SparkContext(master="local[2]", appName=settings.KINESIS_APP_NAME)

streaming_context = StreamingContext(sparkContext=spark_context, batchDuration=settings.BATCH_DURATION)

# attach a receiver to the Kinesis stream, starting from the latest record
kinesis_good_stream = KinesisUtils.createStream(
    ssc=streaming_context, kinesisAppName=settings.KINESIS_APP_NAME,
    streamName=settings.KINESIS_GOOD_STREAM, endpointUrl=settings.KINESIS_ENDPOINT,
    awsAccessKeyId=settings.AWS_ACCESS_KEY, awsSecretKey=settings.AWS_SECRET_KEY,
    checkpointInterval=settings.KINESIS_CHECKPOINT_INTERVAL, regionName=settings.KINESIS_REGION,
    initialPositionInStream=InitialPositionInStream.LATEST)

# word count over each batch
counts = kinesis_good_stream.flatMap(lambda line: line.split(" ")) \
    .map(lambda word: (word, 1)) \
    .reduceByKey(lambda a, b: a + b)
counts.pprint()

streaming_context.start()
streaming_context.awaitTermination()
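The transformation chain above is a standard word count. Stripped of Spark, the same flatMap/map/reduceByKey steps can be sketched in plain Python (the sample lines here are made up for illustration):

```python
from collections import defaultdict

# two hypothetical lines as they might arrive in one batch
lines = ["hello world", "hello kinesis"]

# flatMap: split every line into words, flattening into one list
words = [word for line in lines for word in line.split(" ")]

# map: pair each word with a count of 1
pairs = [(word, 1) for word in words]

# reduceByKey: sum the counts per word
counts = defaultdict(int)
for word, n in pairs:
    counts[word] += n

print(dict(counts))  # → {'hello': 2, 'world': 1, 'kinesis': 1}
```

Spark performs the same steps, but distributed across partitions and once per batch interval.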

The settings file:

# Kinesis Configuration 
KINESIS_REGION = 'ap-southeast-1' 
KINESIS_ENDPOINT = 'kinesis.ap-southeast-1.amazonaws.com' 
KINESIS_GOOD_STREAM = 'GoodStream' 
KINESIS_BAD_STREAM = 'BadStream' 
KINESIS_CHECKPOINT_INTERVAL = 2000 
KINESIS_APP_NAME = 'test-spark' 

# Spark context 
BATCH_DURATION = 2 

# AWS Credential 
AWS_ACCESS_KEY = '' 
AWS_SECRET_KEY = '' 

I run the script with this command:

spark-submit --jars spark-streaming-kinesis-asl-assembly.jar kinesis.py 

From my Django project:

INFO:snowplow_tracker.emitters:GET request finished with status code: 200 
INFO:snowplow_tracker.emitters:POST request finished with status code: 200 

From my collector, I can see that writes to Kinesis succeed:

08:00:19.720 [pool-1-thread-9] INFO c.s.s.c.s.sinks.KinesisSink - Successfully wrote 2 out of 2 records 

And from my Spark Streaming job:

------------------------------------------- 
Time: 2016-11-25 07:59:25 
------------------------------------------- 

16/11/25 07:59:30 ERROR Executor: Exception in task 0.0 in stage 345.0 (TID 173) 
java.lang.NoSuchMethodError: org.apache.spark.storage.BlockManager.get(Lorg/apache/spark/storage/BlockId;)Lscala/Option; 
at org.apache.spark.streaming.kinesis.KinesisBackedBlockRDD.getBlockFromBlockManager$1(KinesisBackedBlockRDD.scala:104) 
at org.apache.spark.streaming.kinesis.KinesisBackedBlockRDD.compute(KinesisBackedBlockRDD.scala:117) 
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319) 
at org.apache.spark.rdd.RDD.iterator(RDD.scala:283) 
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63) 
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319) 
at org.apache.spark.rdd.RDD.iterator(RDD.scala:283) 
at org.apache.spark.api.python.PairwiseRDD.compute(PythonRDD.scala:390) 
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319) 
at org.apache.spark.rdd.RDD.iterator(RDD.scala:283) 
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79) 
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47) 
at org.apache.spark.scheduler.Task.run(Task.scala:86) 
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274) 
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
at java.lang.Thread.run(Thread.java:745) 

For my Kinesis stream I use 1 shard, and the Spark context is set up with 2 cores.

Could you post your sbt/maven build file so that we can see which versions you are on? Particularly the aws libraries and the Spark version – ImDarrenG

Sorry, just noticed you are using pyspark, my bad – ImDarrenG

I am using pyspark from Spark 2.0.2 –

Answer


Managed to fix the error. I was running Spark 2.0.2, but I was using spark-streaming-kinesis-asl-assembly_2.10-2.0.0.jar, which caused the java.lang.NoSuchMethodError.
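The general fix is to keep the assembly jar's Spark and Scala versions in step with the installed Spark. For Spark 2.0.2 (built against Scala 2.11), the submit command would look like the following sketch; the jar name and Maven coordinate follow the standard Apache artifact naming and are not taken from the post:

```shell
# version-matched assembly jar (Scala 2.11 build for Spark 2.0.2)
spark-submit --jars spark-streaming-kinesis-asl-assembly_2.11-2.0.2.jar kinesis.py

# or let spark-submit resolve the matching artifact from Maven Central
spark-submit --packages org.apache.spark:spark-streaming-kinesis-asl-assembly_2.11:2.0.2 kinesis.py
```

A mismatched jar compiles against a different `BlockManager` API, which only surfaces at runtime as the NoSuchMethodError seen above.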
