
I'm using Spark 2.2 with Scala 2.11.8 on a CDH 5.10 cluster. Everything was working fine, but I suddenly started getting this java.lang.LinkageError from my Spark Streaming driver code:

    Exception in thread "main" java.lang.LinkageError: loader constraint violation: when resolving method
      "org.apache.spark.streaming.StreamingContext$.getOrCreate(Ljava/lang/String;Lscala/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/StreamingContext;"
      the class loader (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current class, com/hp/hawkeye/driver/StreamingDriver$,
      and the class loader (instance of sun/misc/Launcher$AppClassLoader)
      for the method's defining class, org/apache/spark/streaming/StreamingContext$,
      have different Class objects for the type scala/Function0 used in the signature
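
For reference, the driver hits the getOrCreate overload named in the trace. A minimal sketch of that call shape (the object name StreamingDriver comes from the trace; the checkpoint path, batch interval, and createContext body here are illustrative, not my actual code):

    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object StreamingDriver {
      // Illustrative checkpoint location, not the real one.
      val checkpointDir = "hdfs:///tmp/streaming-checkpoint"

      def createContext(): StreamingContext =
        new StreamingContext(new SparkConf().setAppName("StreamingDriver"), Seconds(10))

      def main(args: Array[String]): Unit = {
        // This is the overload from the error:
        // getOrCreate(String, Function0[StreamingContext], Configuration, Boolean).
        // The () => ... literal compiles to scala.Function0 -- the type the two
        // class loaders disagree about.
        val ssc = StreamingContext.getOrCreate(
          checkpointDir, () => createContext(), new Configuration(), createOnError = false)
        ssc.start()
        ssc.awaitTermination()
      }
    }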

Any ideas how I can fix this?

Answer


Figured out the solution: there was a class loader conflict, caused by dependency jars that had been placed on the cluster by hand. Clearing the local sbt and Ivy caches helped:

    rm -rf ~/.sbt
    rm -rf ~/.ivy2/cache

Then I restarted IDEA. After that, spark-submit on the cluster worked fine. But placing an extra dependency jar (spark-avro-assembly-4.0.0-snapshot) under lib brought the problem back. Somehow that jar, built to make spark-avro 3.2 work with Spark 2.2, reintroduces the conflict.
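
To avoid hand-placing jars at all, one option is to let the build resolve spark-avro as a managed dependency and keep Spark itself "provided", so the cluster's class loader supplies a single copy of the Scala library. A minimal build.sbt sketch (versions taken from the setup described above; note this uses the published com.databricks spark-avro 3.2.0 artifact, not my snapshot assembly):

    // build.sbt (sketch)
    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      // "provided": spark-submit's class loader supplies Spark and the Scala
      // library, so the application jar does not bundle a second copy.
      "org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided",
      "com.databricks"   %% "spark-avro"      % "3.2.0"
    )

Alternatively, spark-submit --packages com.databricks:spark-avro_2.11:3.2.0 resolves the same artifact at submit time instead of loading it from lib.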