
Apache Spark: "local class incompatible" error when starting SparkContext, following the self-contained application example at http://spark.apache.org/docs/1.2.0/quick-start.html#self-contained-applications

Spark version I'm running the example from the Spark website on: Spark 1.4.0

sbt version: 0.13.8

Then I run the command "sbt run" and get the error "java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible".

The application fails at "val sc = new SparkContext(conf)", when I try to start the SparkContext. I searched around and saw this post, but I'm not using hadoop-client.

Could you take a look? My guess is a version problem in build.sbt. Thank you very much.

Update: I have tried submitting a Python application and it works fine, which means the Spark cluster itself is OK.
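
For reference, one quick way to double-check which version a cluster is actually running (a small sketch; SparkContext exposes the running version as sc.version):

// In spark-shell on the cluster, which creates a SparkContext named `sc`
// automatically, print the version the cluster is actually running:
println(sc.version) // e.g. "1.4.0"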

The Scala code is below:

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "YOUR_SPARK_HOME/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

build.sbt is below:

name := "Simple Project" 

version := "1.0" 

scalaVersion := "2.10.4" 

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" 

The error message is as follows:

15/08/27 05:23:38 ERROR Remoting: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = -7685200927816255400 
java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = -7685200927816255400 
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617) 
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622) 
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) 
    at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136) 
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) 
    at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136) 
    at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104) 
    at scala.util.Try$.apply(Try.scala:161) 
    at akka.serialization.Serialization.deserialize(Serialization.scala:98) 
    at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:63) 
    at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104) 
    at scala.util.Try$.apply(Try.scala:161) 
    at akka.serialization.Serialization.deserialize(Serialization.scala:98) 
    at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23) 
    at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58) 
    at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58) 
    at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76) 
    at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937) 
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465) 
    at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415) 
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516) 
    at akka.actor.ActorCell.invoke(ActorCell.scala:487) 
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238) 
    at akka.dispatch.Mailbox.run(Mailbox.scala:220) 
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393) 
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) 
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 
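
For context, this exception comes from plain Java serialization: the driver and the cluster each load their own compiled copy of org.apache.spark.deploy.ApplicationDescription, and when the two copies carry different serialVersionUID values the receiving JVM refuses to deserialize the message. A minimal sketch of the mechanism (hypothetical Payload class for illustration, not Spark code):

import java.io._

// Java serialization stamps every class with a serialVersionUID.
// If the JVM reading the stream holds a different compiled version of the
// class (different UID), readObject throws java.io.InvalidClassException
// with "local class incompatible", exactly the error above.
@SerialVersionUID(1L)
class Payload(val data: String) extends Serializable

object UidDemo {
  def main(args: Array[String]): Unit = {
    val bytes = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bytes)
    oos.writeObject(new Payload("hello"))
    oos.close()
    // Deserializing with the same class version succeeds; a JVM holding a
    // Payload compiled with @SerialVersionUID(2L) would fail here instead.
    val ois = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
    println(ois.readObject().asInstanceOf[Payload].data) // prints "hello"
  }
}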

Answer


You say you are running a Spark 1.4.0 cluster, but your build.sbt is building against 1.2.0. Change this line in your build.sbt:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" 
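
For completeness, here is the whole build.sbt after the fix; it is the same file as in the question with only the Spark version bumped (Spark 1.4.0 publishes artifacts for Scala 2.10, so scalaVersion 2.10.4 still matches):

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"

The %% operator appends the Scala binary version to the artifact name, so this resolves spark-core_2.10 at 1.4.0; the key point is that the dependency version matches the version the cluster runs.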

Oh, you're right, thank you so much... shame on me... haha – keypoint


No problem, no point banging your head against the wall ;-) –