2015-10-08

I am using Spark 1.4.1. I can use spark-submit without any problem, but when I run ~/spark/bin/spark-shell, the Spark shell fails to start.

I get the error below. I have already configured SPARK_HOME and JAVA_HOME. However, everything worked fine with 1.2:

15/10/08 02:40:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 

Failed to initialize compiler: object scala.runtime in compiler mirror not found. 
** Note that as of 2.8 scala does not assume use of the java classpath. 
** For the old behavior pass -usejavacp to scala, or if using a Settings 
** object programatically, settings.usejavacp.value = true. 

Failed to initialize compiler: object scala.runtime in compiler mirror not found. 
** Note that as of 2.8 scala does not assume use of the java classpath. 
** For the old behavior pass -usejavacp to scala, or if using a Settings 
** object programatically, settings.usejavacp.value = true. 
Exception in thread "main" java.lang.AssertionError: assertion failed: null 
     at scala.Predef$.assert(Predef.scala:179) 
     at org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:247) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) 
     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) 
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945) 
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059) 
     at org.apache.spark.repl.Main$.main(Main.scala:31) 
     at org.apache.spark.repl.Main.main(Main.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 

Where did you set 'SPARK_HOME'? In your .bashrc? Because the error you are getting comes from SPARK_HOME not being set, since 'spark-shell' tries to derive it via 'dirname'. – user1314742
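
For reference, the bin/spark-shell script in Spark 1.x resolves SPARK_HOME from its own location on disk, roughly like this (a simplified sketch, not the verbatim script):

# derive SPARK_HOME as the parent of the directory containing this script
export SPARK_HOME="$(cd "$(dirname "$0")/.."; pwd)"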


How should I set my SPARK_HOME? Should it be set as export SPARK_HOME=/usr/local/Cellar/apache-spark/2.2.0/bin? – lordlabakdas
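
As a general rule, SPARK_HOME should point at the root of the Spark installation (the directory that contains bin/), not at bin/ itself. A hedged sketch reusing the Homebrew path from the comment above (the exact layout of a Homebrew install may differ, e.g. it may nest the distribution under libexec/):

# SPARK_HOME is the directory that CONTAINS bin/, not bin/ itself
export SPARK_HOME=/usr/local/Cellar/apache-spark/2.2.0
export PATH=$SPARK_HOME/bin:$PATH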


I don't think the problem is SPARK_HOME. An incorrect SPARK_HOME would cause the spark-shell script to fail to find spark-submit. However, I see the same error on my machine even when I make sure SPARK_HOME is set and invoke "spark-submit --class org.apache.spark.repl.Main" directly. –
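
This commenter's test is a reasonable reproduction because spark-shell in Spark 1.x is essentially a thin wrapper over spark-submit; simplified, it forwards to something like the following (a sketch, not the verbatim script):

# what bin/spark-shell effectively runs under the hood
"$SPARK_HOME"/bin/spark-submit --class org.apache.spark.repl.Main "$@"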

Answers


Have you installed Scala and SBT, and did Spark build OK?
The log says it did not find the main class.
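
A quick way to sanity-check that toolchain, assuming all three tools are meant to be on PATH:

# verify each tool is visible on PATH and print its version
command -v java  && java -version
command -v scala && scala -version
command -v sbt   && sbt about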


Do you think this is caused by sbt and scala not being on the PATH? – worldterminator


I ran into the same problem when running Spark, but I found it was because I had not configured Scala properly. Make sure you have Java, Scala, and SBT installed and Spark built:

Edit your .bashrc file:

vim .bashrc

Set your environment variables:

# Java (note: PATH needs $JAVA_HOME/bin, not $JAVA_HOME itself)
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export PATH=$JAVA_HOME/bin:$PATH

# Scala
export SCALA_HOME=/usr/local/src/scala/scala-2.11.5
export PATH=$SCALA_HOME/bin:$PATH

# Spark
export SPARK_HOME=/usr/local/src/apache/spark.2.0.0/spark
export PATH=$SPARK_HOME/bin:$PATH

Source the settings:

. .bashrc

Check Scala:

scala -version

Make sure the REPL starts:

scala

If your REPL starts, try restarting your spark-shell:

./path/to/spark/bin/spark-shell

You should get the Spark REPL.


You can try running

spark-shell -usejavacp 

It did not work for me, but it did work for someone in the description of Spark Issue 18778.
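
If the bare -usejavacp flag is not forwarded to the REPL on your version, an alternative aimed at the same Scala setting is to pass the scala.usejavacp system property through the driver JVM options. This is a hedged sketch, not a verified fix:

# set the Scala REPL's usejavacp flag via a driver JVM system property
spark-shell --conf "spark.driver.extraJavaOptions=-Dscala.usejavacp=true"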