
I am running Spark version 2.1.0 and the exception below is thrown. The job still returns a result, but it raises java.lang.ClassNotFoundException: de.unkrig.jdisasm.Disassembler:

java.lang.ClassNotFoundException: de.unkrig.jdisasm.Disassembler 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
    at java.lang.Class.forName0(Native Method) 
    at java.lang.Class.forName(Class.java:264) 
    at org.codehaus.janino.SimpleCompiler.disassembleToStdout(SimpleCompiler.java:430) 
    at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:404) 
    at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:311) 
    at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:229) 
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:196) 
    at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:91) 
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:935) 
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:998) 
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:995) 
    at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599) 
    at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379) 
    at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342) 
    at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257) 
    at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000) 
    at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004) 
    at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874) 
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:890) 
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:357) 
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114) 
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114) 
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135) 
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) 
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132) 
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113) 
    at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:225) 
    at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:272) 
    at org.apache.spark.sql.Dataset$$anonfun$org$apache$spark$sql$Dataset$$execute$1$1.apply(Dataset.scala:2371) 
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57) 
    at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2765) 
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$execute$1(Dataset.scala:2370) 
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collect(Dataset.scala:2377) 
    at org.apache.spark.sql.Dataset$$anonfun$count$1.apply(Dataset.scala:2405) 
    at org.apache.spark.sql.Dataset$$anonfun$count$1.apply(Dataset.scala:2404) 
    at org.apache.spark.sql.Dataset.withCallback(Dataset.scala:2778) 
    at org.apache.spark.sql.Dataset.count(Dataset.scala:2404) 

I am simply reading a text file and printing its line count:

val rdd = sparkLocal.read.text("/data/logs/file.log") 
println(rdd.count) 

I still get the correct result, though.
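For context, sparkLocal is never defined in the snippet above; presumably it is a SparkSession. A minimal self-contained sketch of the same program, with the builder settings as assumptions, would be:

import org.apache.spark.sql.SparkSession

// Assumption: the question does not show how sparkLocal is created,
// so this uses a plain local-mode SparkSession.
val sparkLocal = SparkSession.builder()
  .appName("LogFileCount")
  .master("local[*]")
  .getOrCreate()

// Note: read.text returns a DataFrame (Dataset[Row]), not an RDD,
// despite the variable name used in the question.
val logs = sparkLocal.read.text("/data/logs/file.log")
println(logs.count)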

Here is my build.sbt:

libraryDependencies ++= {
  val akkaVersion = "2.4.10"
  val sparkVersion = "2.1.0"

  Seq(
    "com.typesafe.akka" %% "akka-actor"      % akkaVersion,
    "org.apache.spark"  %% "spark-core"      % sparkVersion,
    "org.apache.spark"  %% "spark-sql"       % sparkVersion,
    "org.apache.spark"  %% "spark-hive"      % sparkVersion,
    "com.typesafe.akka" %% "akka-slf4j"      % akkaVersion,
    "org.apache.spark"  %% "spark-streaming" % sparkVersion
  )
}

Could someone please help me?


You are missing something somewhere. If all you were doing is that, you would not get this error. Are you running this from the spark-shell? Did you build it and run spark-submit? What does your build.sbt look like... something is missing here. –


I have updated the question. – Muhunthan

Answer


I had the same problem; it appears to be caused by this issue of the Janino compiler. The stack trace shows Janino trying to load an optional disassembler class (de.unkrig.jdisasm.Disassembler) while compiling Spark's generated code, which would explain why the job still produces a result despite the exception.

That issue has since been closed, and according to the change log the fix should be included in version 3.0.7, released on 2017-03-22, which is already on Maven.

I added an explicit

libraryDependencies += "org.codehaus.janino" % "janino" % "3.0.7" 

to my build.sbt, and so far it seems to have solved the problem.
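For completeness, a sketch of how the pin sits in build.sbt. Janino and its companion commons-compiler artifact are released in lockstep, so pinning both to 3.0.7 could be worth trying if the error persists; the commons-compiler line is a suggestion of mine, not part of the original answer:

// Spark 2.1.0 pulls janino in transitively; declaring 3.0.7 directly
// lets sbt's default ("latest revision") conflict resolution evict the older copy.
libraryDependencies += "org.codehaus.janino" % "janino" % "3.0.7"

// Suggested addition: keep commons-compiler at the matching version,
// since the two artifacts are meant to be used together.
libraryDependencies += "org.codehaus.janino" % "commons-compiler" % "3.0.7"

You can check which versions were evicted from the classpath by running sbt's evicted task.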
