
When I create a Spark context using Scala, the trace below shows up: a problem with Akka actors while initializing the Spark context.

[sparkDriver-akka.actor.default-dispatcher-3] ERROR akka.actor.ActorSystemImpl - Uncaught fatal error from thread [sparkDriver-akka.remote.default-remote-dispatcher-5] shutting down ActorSystem [sparkDriver] 

    java.lang.NoSuchMethodError: org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V 
    at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:283) 
    at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:240) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78) 
    at scala.util.Try$.apply(Try.scala:161) 
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73) 
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84) 
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84) 
    at scala.util.Success.flatMap(Try.scala:200) 
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84) 
    at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:692) 
    at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:684) 
    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722) 
    at scala.collection.Iterator$class.foreach(Iterator.scala:727) 
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) 
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) 
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) 
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721) 
    at akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:684) 
    at akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:492) 
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465) 
    at akka.remote.EndpointManager.aroundReceive(Remoting.scala:395) 
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516) 
    at akka.actor.ActorCell.invoke(ActorCell.scala:487) 
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238) 
    at akka.dispatch.Mailbox.run(Mailbox.scala:220) 
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393) 
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) 
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 

I read something about a Netty version conflict (the missing method is a Netty 3.x constructor, which suggests two different Netty 3.x versions on the classpath), but I can't solve the problem.

These are my dependencies:

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.2.2" 
    exclude ("com.esotericsoftware.minlog", "minlog") 
    exclude("org.eclipse.jetty.orbit", "javax.transaction") 
    exclude("org.eclipse.jetty.orbit", "javax.mail.glassfish") 
    exclude ("commons-beanutils", "commons-beanutils-core") 
    exclude ("commons-digester", "commons-digester") 
    exclude ("org.slf4j", "jcl-over-slf4j"), 
    "org.apache.spark" %% "spark-streaming" % "1.2.2", 
    "org.apache.spark" %% "spark-streaming-flume" % "1.2.2" exclude ("org.mortbay.jetty", "servlet-api"), 
    "org.apache.spark" %% "spark-mllib" % "1.2.2", 
    "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.0" withSources() withJavadoc(), 
    "org.scalatest" % "scalatest_2.10" % "2.2.1" % "test", 
    "org.cassandraunit" % "cassandra-unit" % "2.1.3.1" % "test", 
    "org.apache.cassandra" % "cassandra-all" % "2.1.3", 
    "com.bitmonlab.nrich" % "spark-jobserver-api" % "0.5.0" 
) 

Sorry I can't give more detail, but I'm completely lost on this topic.

If anyone knows what is going on here...

UPDATE

I am just initializing a Spark context with Cassandra support:

    val sparkConf = new SparkConf()
      .setAppName("QueryExample")
      .setMaster("local[*]")
      .set("spark.cassandra.connection.host", seeds)
      .set("spark.cassandra.connection.rpc.port", "9171")
      .set("spark.cassandra.connection.native.port", "9142")
    val sc = new SparkContext(sparkConf)

You could at least tell us what you're doing! Share the code sample that might be causing the error. – eliasah


I am just initializing a Spark context with Cassandra support: val sparkConf = new SparkConf().setAppName("QueryExample").setMaster("local[*]").set("spark.cassandra.connection.host", seeds).set("spark.cassandra.connection.rpc.port", "9171").set("spark.cassandra.connection.native.port", "9142") val sc = new SparkContext(sparkConf) –


You need to update your question with your comment! – eliasah

Answers


It was a dependency problem. An avro-tools jar was being pulled into the project and was causing the error. Thanks everyone.
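
For anyone trying to track down the same clash, here is a minimal diagnostic sketch (plain JDK reflection, nothing Spark-specific) that reports which jar the class from the stack trace is actually loaded from; run it on the same classpath as the failing application, e.g. in a Scala REPL:

    // Ask the classloader where the Netty class named in the trace comes from.
    val source = Class.forName("org.jboss.netty.channel.socket.nio.NioWorkerPool")
      .getProtectionDomain.getCodeSource
    // Prints the winning jar's path; getCodeSource may be null for bootstrap classes.
    println(if (source != null) source.getLocation else "bootstrap classpath")

If the printed path points at an old Netty jar, or at a fat jar (avro-tools is one) that bundles Netty classes, that is the artifact to exclude.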


How did you solve this... not sure what you mean by the avro-tools jar being pulled into the project... thx – hba


I had a similar problem, but with Maven instead of sbt. Since I include avro-ipc as one of my dependencies, I needed to exclude netty from it, so it looks like this:

<dependency> 
    <groupId>org.apache.avro</groupId> 
    <artifactId>avro-ipc</artifactId> 
    <version>${avro.version}</version> 
    <exclusions> 
     <exclusion> 
      <groupId>io.netty</groupId> 
      <artifactId>netty</artifactId> 
     </exclusion> 
    </exclusions> 
</dependency>
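
For an sbt build like the one in the question, a rough equivalent of this exclusion, plus pinning the Netty 3.x line to one version, might look like the sketch below; the avro-ipc version and the 3.8.0.Final override are assumptions, to be checked against a dependency report (for example mvn dependency:tree -Dincludes=io.netty:netty on the Maven side).

    // Sketch only: drop the stale Netty 3.x jar that avro-ipc drags in
    // (the 1.7.7 version number here is an assumption).
    libraryDependencies += "org.apache.avro" % "avro-ipc" % "1.7.7" exclude("io.netty", "netty")

    // And/or force every transitive request for Netty 3.x to one version;
    // 3.8.0.Final is an assumption, verify it against your own report.
    dependencyOverrides += "io.netty" % "netty" % "3.8.0.Final"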