I wrote a small class that I deploy on a Unix machine, and I can't figure out why I'm getting this error. I checked my SPARK_HOME and added all the required options, as shown in the class below. I intended to use it to monitor the Spark application it eventually launches. spark-submit works perfectly, so I know the environment setup is not the problem. SparkLauncher never starts the application.
package com.james.SparkLauncher2;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.apache.log4j.Logger;
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class SparkLauncher2 {
    static final Logger LOGGER = Logger.getLogger(SparkLauncher2.class);

    public static void main(String[] args) {
        try {
            LOGGER.info("In main of SparkLauncher2");
            Map<String, String> env = new HashMap<>();
            env.put("SPARK_HOME", "/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/lib/spark");
            env.put(" SPARK_LIBRARY_PATH", "/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/lib/spark/lib");
            System.out.println("Environments setup correctly");
            // pass in environment variables
            SparkAppHandle sparkLauncher = new SparkLauncher(env)
                .setAppResource("/home/james/fe.jar")
                // This conf file works well with spark-submit, so it shouldn't be the source of the issue
                .setPropertiesFile("/etc/spark/conf/spark-defaults.conf")
                .setMainClass("com.james.SparkLauncher2.SparkLauncher2")
                .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                .setDeployMode("client")
                .setVerbose(true)
                .setConf("spark.yarn.keytab ", "/home/james/my.keytab")
                .setConf("spark.yarn.principal", "somestring")
                .setConf("spark.app.name ", "SparkLauncher2") // add class name, for example HbaseTest
                .setConf("spark.jars", "/home/james/flume-test.jar,/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/bin/test")
                // call listener class to see if there is any state change
                .startApplication(new MyListener());
            sparkLauncher.stop();
            //handle.stop();
        } catch (IOException e) {
            e.printStackTrace();
        }
        // this exception is what gets thrown
        catch (Exception e) {
            LOGGER.info("General exception");
            e.printStackTrace();
        }
    }
}
I intended this class mainly to check for state changes, but no state transitions are ever logged:
class MyListener implements SparkAppHandle.Listener {
    @Override
    public void stateChanged(SparkAppHandle handle) {
        System.out.println("state changed " + handle.getState());
    }

    @Override
    public void infoChanged(SparkAppHandle handle) {
        System.out.println("info changed " + handle.getState());
    }
}
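For what it's worth, a common pattern with `startApplication(...)` is to block the launcher JVM on a `CountDownLatch` that the listener releases once the handle reaches a final state, rather than calling `stop()` immediately after launching. This is only a sketch (the `LatchedLaunch` class name is mine, and it reuses the jar path and main class from my code above); `SparkAppHandle.State.isFinal()` covers FINISHED/FAILED/KILLED:

```java
import java.util.concurrent.CountDownLatch;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LatchedLaunch {
    public static void main(String[] args) throws Exception {
        // final so the anonymous listener can capture it (required on Java 7)
        final CountDownLatch done = new CountDownLatch(1);

        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/home/james/fe.jar")                      // paths from the question
                .setMainClass("com.james.SparkLauncher2.SparkLauncher2")
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle h) {
                        System.out.println("state changed " + h.getState());
                        if (h.getState().isFinal()) {
                            done.countDown();                              // release the main thread
                        }
                    }
                    @Override
                    public void infoChanged(SparkAppHandle h) { }
                });

        // Wait for a terminal state instead of stopping a not-yet-connected handle
        done.await();
        System.out.println("final state: " + handle.getState());
    }
}
```

If the main thread exits (or calls `stop()`) before the child process connects back, the handle is still in UNKNOWN state, which matches the `Application is still not connected` message below.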
Here are the directories I checked, and all of them seem correct despite the exception. I even wrote an alternative version where everything is hard-coded into the setConf methods. Clearly no Spark job ever starts; I don't see any job in the UI either. The CommandBuilder class documentation is not clear on how this exception gets thrown. For context, this is Java 7 and Spark 1.6.
java.lang.IllegalStateException: Application is still not connected.
at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:249)
at org.apache.spark.launcher.ChildProcAppHandle.stop(ChildProcAppHandle.java:74)
at com.james.SparkLauncher2.SparkLauncher2.main(SparkLauncher2.java:43)
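Since no job shows up in the UI, it may help to see what spark-submit itself prints when the child process is spawned. Spark 1.6's launcher has no output-redirect helpers, but `SparkLauncher.launch()` returns the raw `Process`, whose streams can be drained by hand. A diagnostic sketch (the `DrainLaunch` class name is mine; paths are the ones from the question):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.spark.launcher.SparkLauncher;

public class DrainLaunch {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setAppResource("/home/james/fe.jar")
                .setMainClass("com.james.SparkLauncher2.SparkLauncher2")
                .setVerbose(true)
                .launch();   // raw child process instead of startApplication()

        // Drain stderr so spark-submit's own error messages become visible
        BufferedReader err = new BufferedReader(new InputStreamReader(spark.getErrorStream()));
        String line;
        while ((line = err.readLine()) != null) {
            System.err.println("[spark-submit] " + line);
        }
        System.out.println("exit code: " + spark.waitFor());
    }
}
```

A nonzero exit code or an error on stderr (for example, a bad config key or an unreadable keytab) would explain why the application never connects back to the launcher.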