2016-11-30

I am trying to connect from my local system to a Spark master (a remote cluster node) using a Java program. I am connecting with the following API:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf().setAppName("WorkCountApp").setMaster("spark://masterIP:7077");
JavaSparkContext sc = new JavaSparkContext(conf);

My program tries to connect to the master but fails after a while. Below is the stack trace:

16/11/30 17:40:26 INFO AppClient$ClientActor: Connecting to master akka.tcp://[email protected]:7077/user/Master...
16/11/30 17:40:46 ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
16/11/30 17:40:46 WARN SparkDeploySchedulerBackend: Application ID is not initialized yet.
16/11/30 17:40:46 INFO SparkUI: Stopped Spark web UI at http://172.31.11.1:4040
16/11/30 17:40:46 INFO DAGScheduler: Stopping DAGScheduler
16/11/30 17:40:46 INFO SparkDeploySchedulerBackend: Shutting down all executors
16/11/30 17:40:46 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
16/11/30 17:40:46 ERROR OneForOneStrategy:
java.lang.NullPointerException
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)

Please help me resolve this.

Answer


A connection can fail for many reasons. In this case, however, it looks like no workers have been started for this Spark master.

On the remote machine, you need to start the Spark master as well as the Spark workers (slaves).
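As a sketch, assuming a standard Spark standalone deployment of that era (the `start-master.sh` and `start-slave.sh` scripts shipped in Spark's `sbin` directory, and `SPARK_HOME` pointing at the installation), starting the cluster might look like:

```shell
# On the master machine: start the standalone master.
# It listens for workers and drivers on spark://<masterIP>:7077
# and serves a status web UI on port 8080 by default.
$SPARK_HOME/sbin/start-master.sh

# On each worker machine: start a worker and register it with the master.
# Use the same spark:// URL that the driver passes to setMaster(...).
$SPARK_HOME/sbin/start-slave.sh spark://masterIP:7077
```

Before resubmitting the application, check the master's web UI (http://masterIP:8080) and confirm that at least one worker is listed as ALIVE; a master with no registered workers cannot grant executors to the driver.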
