2017-08-04 73 views
-2

I have been using Spark with Python for about two months, and now I want to set up some Spark Streaming scripts, so I need YARN (or Mesos) for my Spark application. However, when I run the command `spark-submit --master yarn file.py`, it always fails with this error:


17/08/04 10:36:24 ERROR client.TransportClient: Failed to send RPC 8506915915091728278 to /192.168.11.164:55857: java.nio.channels.ClosedChannelException 
java.nio.channels.ClosedChannelException 
17/08/04 10:36:24 WARN netty.NettyRpcEndpointRef: Error sending message [message = RequestExecutors(0,0,Map())] in 1 attempts 
org.apache.spark.SparkException: Exception thrown in awaitResult 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75) 
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167) 
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83) 
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:102) 
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:78) 
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply$mcV$sp(YarnSchedulerBackend.scala:271) 
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply(YarnSchedulerBackend.scala:271) 
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply(YarnSchedulerBackend.scala:271) 
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) 
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.io.IOException: Failed to send RPC 8506915915091728278 to /192.168.11.164:55857: java.nio.channels.ClosedChannelException 
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:239) 
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:226) 
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:680) 
    at io.netty.util.concurrent.DefaultPromise$LateListeners.run(DefaultPromise.java:845) 
    at io.netty.util.concurrent.DefaultPromise$LateListenerNotifier.run(DefaultPromise.java:873) 
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357) 
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357) 
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) 
    ... 1 more 
Caused by: java.nio.channels.ClosedChannelException 
+0

The Spark version is 2.0 and the Python version is 3.5.3 –

+0

Could you please post your spark-submit command? –

+0

`[root@master addnewpaper]# spark-submit --master yarn demo1.py` –

Answer

0

Failed to send RPC 8506915915091728278 to /192.168.11.164:55857: java.nio.channels.ClosedChannelException

Do you have connectivity between the machine you are submitting the job from and the YARN cluster? This looks like a connection problem.
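One way to rule out a network problem is to check reachability from the submitting machine to the host and port that appear in the error message (the address below is taken from the log above; the port is an ephemeral executor port, so it is only meaningful while the application is running):

```shell
# Basic reachability check to the host from the error message
ping -c 3 192.168.11.164

# Can we open a TCP connection to the RPC port from the log?
# (nc/netcat must be installed; the port is ephemeral and only
# open while the executor is alive)
nc -zv 192.168.11.164 55857
```

If these checks succeed, the channel is more likely being closed by the remote side, for example because YARN killed the container, rather than by the network.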

+0

My operations engineer said there is a problem on one of the nodes –

+0

I asked them about the node, and they told me that if I use that much memory, it will make the node die –
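If the node is dying because the job uses too much memory, one thing to try is capping the driver and executor memory explicitly on the `spark-submit` command line so the YARN containers stay within the node's limits. A minimal sketch, with illustrative values that would need tuning to the actual cluster:

```shell
# Cap memory and parallelism so YARN containers fit on the node;
# the sizes below are illustrative, not recommendations
spark-submit \
  --master yarn \
  --driver-memory 1g \
  --executor-memory 1g \
  --executor-cores 1 \
  --num-executors 2 \
  demo1.py
```

When containers are still killed despite small heaps, increasing `spark.yarn.executor.memoryOverhead` (off-heap headroom per container) is a common next step.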