I have been experimenting with Spark using spark-shell, and all of my data is in SQL. Now I am getting a ClassNotFoundException from spark-jobserver.
I used to include external JARs with the --jars flag, like this:

```shell
/bin/spark-shell --jars /path/to/mysql-connector-java-5.1.23-bin.jar --master spark://sparkmaster.com:7077
```

I also added the JAR to the classpath by editing the bin/compute-classpath.sh file, and everything ran successfully with this configuration.
Now, when I run a standalone job through the jobserver, I get the following error message:
```json
result: {
  "message" : "com.mysql.jdbc.Driver",
  "errorClass" : "java.lang.ClassNotFoundException",
  "stack" : [.......]
}
```
I have already included the JAR file in my local.conf file, as shown below:

```
context-settings {
  .....
  dependent-jar-uris = ["file:///absolute/path/to/the/jarfile"]
  ......
}
```
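For reference, here is a minimal sketch of how a spark-jobserver context-settings block is commonly laid out in HOCON. The JAR path, core count, and memory values are placeholders for illustration, not values taken from my setup:

```hocon
# local.conf (sketch) – settings applied to every context the jobserver creates
spark {
  context-settings {
    num-cpu-cores = 2          # placeholder: cores per context
    memory-per-node = 512m     # placeholder: executor memory

    # JARs listed here should be added to the context's classpath,
    # so the MySQL JDBC driver would be resolvable at job run time
    dependent-jar-uris = ["file:///absolute/path/to/mysql-connector-java-5.1.23-bin.jar"]
  }
}
```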