2016-07-25
  1. I installed Spark on a Linux machine. The version is spark-1.6.2-bin-hadoop2.6.tgz.
  2. Then I started Spark with ./sbin/start-all.sh.
  3. I ran the example JavaWordCount.java from Eclipse, but it always fails. Can anyone help? It cannot connect to the Spark master: InvalidClassException: org.apache.spark.rpc.RpcEndpointRef; local class incompatible.

  4. The Spark master's banner reports: Welcome to version 1.6.2, using Scala version 2.10.5 (Java HotSpot(TM) Server VM, Java 1.8.0_101). The Spark version on the Eclipse side was shown in a screenshot (image not available).
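For context, here is a minimal driver in the spirit of the bundled JavaWordCount example, assuming the master URL from the log below and a hypothetical input path (the exact code being run is not shown in the question):

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    import scala.Tuple2;

    public class JavaWordCount {
        public static void main(String[] args) {
            // Master URL as it appears in the log below; "hostname" is a placeholder.
            SparkConf conf = new SparkConf()
                    .setAppName("JavaWordCount")
                    .setMaster("spark://hostname:7077");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Hypothetical input path; replace with a real file.
            JavaRDD<String> lines = sc.textFile("input.txt");

            // Classic word count: split into words, pair each with 1, sum per word.
            // Spark 1.6's Java flatMap expects an Iterable, so Arrays.asList works here.
            JavaPairRDD<String, Integer> counts = lines
                    .flatMap(line -> Arrays.asList(line.split(" ")))
                    .mapToPair(word -> new Tuple2<>(word, 1))
                    .reduceByKey((a, b) -> a + b);

            for (Tuple2<String, Integer> entry : counts.collect()) {
                System.out.println(entry._1() + ": " + entry._2());
            }
            sc.stop();
        }
    }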

The exception is as follows:

16/07/25 12:01:20 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://hostname:7077... 
16/07/25 12:01:20 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master hostname:7077 
org.apache.spark.SparkException: Exception thrown in awaitResult 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75) 
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162) 
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83) 
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88) 
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96) 
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:109) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.RpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -1223633663228316618, local class serialVersionUID = 18257903091306170 
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616) 
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630) 
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521) 
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630) 
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353) 
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373) 
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76) 
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109) 
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:258) 
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) 
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:310) 
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:257) 
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) 
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:256) 
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:588) 
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570) 
    at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:149) 
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102) 
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104) 
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51) 
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) 
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) 
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266) 
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) 
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) 
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 

'local class incompatible:' means you are using two different versions of Spark at the two ends. Use the same version. This is really a Spark bug, but using consistent versions will fix it. – EJP
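A quick way to see which Spark version the Eclipse-side classpath actually carries is to ask the client library itself; a minimal sketch, run against a local master so it works even while the standalone master is unreachable:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SparkVersionCheck {
        public static void main(String[] args) {
            // local[*] avoids touching the standalone master entirely.
            SparkConf conf = new SparkConf()
                    .setAppName("SparkVersionCheck")
                    .setMaster("local[*]");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                // Should print 1.6.2 if the client jars match the master.
                System.out.println("Client-side Spark version: " + sc.version());
            }
        }
    }

If this prints anything other than 1.6.2, the jars on the Eclipse build path are the mismatched end.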


@EJP Could you give me more details? Which version should I use? On the Spark machine, running ./spark-shell shows: Welcome to version 1.6.2, using Scala version 2.10.5 (Java HotSpot(TM) Server VM, Java 1.8.0_101). – michelle


You should use the latest version you can get your hands on. – EJP

Answer


This problem was caused by a version mismatch. I installed Hadoop and used spark-assembly-1.6.2-hadoop2.6.0.jar on the client side, and it now works fine.
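For anyone hitting the same error with a Maven-based Eclipse project rather than a hand-copied assembly jar, the equivalent fix is to pin the client dependency to the cluster's exact version; a sketch of the coordinates for Spark 1.6.2 built against Scala 2.10:

    <!-- Client-side Spark must match the 1.6.2 master exactly. -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.2</version>
    </dependency>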
