spark-shell: The system cannot find the path specified

After installing the Anaconda package, I can no longer start the Spark shell under Windows 7. Every time I type spark-shell, the console answers with The system cannot find the path specified. and, of course, the Spark shell does not start.

This is my echo %PATH%:

C:\Program Files\Microsoft MPI\Bin\;C:\Program Files (x86)\Common Files\Intel\Shared Files\cpp\bin\Intel64;C:\Program Files (x86)\Intel\iCLS Client\;C:\Program Files\Intel\iCLS Client\;C:\windows\system32;C:\windows;C:\windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files\Lenovo\Fingerprint Manager Pro\;C:\Program Files (x86)\WinSCP\;C:\Program Files (x86)\Lenovo\Access Connections\;C:\Program Files\MiKTeX 2.9\miktex\bin\x64\;C:\Program Files\PuTTY\;C:\Program Files (x86)\Intel\UCRT\;C:\Program Files\Intel\UCRT\;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;C:\Program Files\dotnet\;C:\Program Files\Anaconda3;C:\Program Files\Anaconda3\Scripts;C:\Program Files\Anaconda3\Library\bin;C:\Program Files (x86)\GtkSharp\2.12\bin;C:\Program Files\Git\cmd;C:\Program Files\TortoiseGit\bin;C:\Program Files\TortoiseSVN\bin;C:\Program Files (x86)\sbt\bin;C:\Program Files (x86)\scala\bin;C:\Program Files (x86)\Java\jre1.8.0_144\bin;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files (x86)\Graphviz2.38\bin\;C:\Program Files (x86)\sbt\bin;C:\Program Files (x86)\scala\bin;d:\Spark\bin;d:\Hadoop\bin

And this is echo %SPARK_HOME%:

d:\Spark

And this is echo %JAVA_HOME%:

C:\Program Files (x86)\Java\jre1.8.0_144

This is my java -version:

java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) Client VM (build 25.144-b01, mixed mode, sharing)

I have already tried reinstalling Java, without any success. There is a similar question here, but I don't see any wrongly set environment variables in my setup. So I really don't know how to fix this... any ideas?

After some testing I found that when I cd into $SPARK_HOME$\bin, I can actually execute spark-shell. It quits with an error message:

\Java\jre1.8.0_144\bin\java was unexpected at this time.

This error occurs while the last line of Spark\bin\spark-submit2.cmd, "%~dp0spark-class2.cmd" %CLASS% %*, is being executed.
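This is the classic cmd.exe problem with parentheses in expanded variables. Below is a minimal sketch that reproduces the message; the paths are illustrative, not the actual contents of spark-class2.cmd:

    @echo off
    rem JAVA_HOME contains spaces and, crucially, a closing parenthesis.
    set "JAVA_HOME=C:\Program Files (x86)\Java\jre1.8.0_144"

    rem BROKEN (kept commented out, since the parse error would abort the
    rem whole script): inside an (...) block, the unquoted ")" of "(x86)"
    rem closes the block early and cmd prints:
    rem     \Java\jre1.8.0_144\bin\java was unexpected at this time.
    rem if defined JAVA_HOME (
    rem     set RUNNER=%JAVA_HOME%\bin\java
    rem )

    rem Workaround: the 8.3 short name contains neither spaces nor
    rem parentheses. %%~sI asks cmd for that short form.
    for %%I in ("%JAVA_HOME%") do set "JAVA_HOME_SHORT=%%~sI"
    echo %JAVA_HOME_SHORT%
    rem Prints something like C:\PROGRA~2\Java\JRE18~1.0_1 (exact short
    rem names vary, and 8.3 name generation can be disabled per volume).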

Update 1:

Changing %JAVA_HOME% from "C:\Program Files..." to "C:\PROGRA~1..." did indeed fix the problem in some places (the sketch above shows how to obtain such a short name): spark-shell now seems to start. However, there are also a lot of Access denied errors:

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1053)
  at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
  at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129)
  at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
  at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
  at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
  at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:938)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:97)
  ... 47 elided
Caused by: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Access is denied;
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
  at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193)
  at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
  at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
  ... 61 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Access is denied
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
  at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
  at java.lang.reflect.Constructor.newInstance(Unknown Source)
  at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
  ... 70 more
Caused by: java.lang.RuntimeException: java.io.IOException: Access is denied
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:515)
  ... 84 more
Caused by: java.io.IOException: Access is denied
  at java.io.WinNTFileSystem.createFileExclusively(Native Method)
  at java.io.File.createTempFile(Unknown Source)
  at org.apache.hadoop.hive.ql.session.SessionState.createTempFile(SessionState.java:818)
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
  ... 84 more
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^

Update 2:

Running spark-shell as administrator works! However, that is probably quite unsafe and I don't consider it a real solution.
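For what it's worth, the trace above dies in java.io.File.createTempFile, which Hive uses below its local scratch directory. Instead of running the whole shell elevated, it may be enough to make that directory writable for your own user. A hedged sketch: C:\tmp\hive is an assumption (the usual default for hive.exec.scratchdir), and it uses the winutils.exe implied by the d:\Hadoop\bin entry in the PATH above:

    rem Create the scratch dir if it does not exist yet (assumed location).
    if not exist C:\tmp\hive mkdir C:\tmp\hive

    rem Hadoop-style permissions via winutils:
    d:\Hadoop\bin\winutils.exe chmod -R 777 \tmp\hive

    rem Or plain Windows ACLs for the current user:
    icacls C:\tmp\hive /grant "%USERNAME%":(OI)(CI)F /T

Whether this suffices depends on which path is actually being denied; Process Monitor can show the exact file that fails.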


The space between 'Program' and 'Files' in 'JAVA_HOME' is probably the culprit here. – philantrovert


What would be the best solution? Reinstalling Java? By the way, there has been a space in my Java directory for quite a while, and it used to work back then... – thestackexchangeguy


I'm not sure whether this causes the error, but you could try reinstalling Java to a path without spaces, something like 'C:\Java\'. – philantrovert

Answer


Make sure that you have set your JAVA_HOME and SBT_HOME correctly; I also added both of them to the Path variable, just to be safe. For this I can recommend Rapid Environment Editor, a simple and nice tool for editing system variables. This approach made it work for me, since I had run into the same problem. An example:

JAVA_HOME set to C:\Program Files\Java\jdk1.8.0_151

SBT_HOME set to C:\Program Files (x86)\sbt\
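If you prefer the command line over a GUI tool, the same settings can be sketched with the built-in setx; the paths simply mirror the answer above and may differ on your machine:

    rem Persists for the current user; takes effect in newly opened
    rem consoles only, not in the console running setx.
    setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_151"
    setx SBT_HOME "C:\Program Files (x86)\sbt\"

Note that setx truncates values longer than 1024 characters, so for a Path as long as the one above a dedicated editor is the safer choice. Also, this JAVA_HOME still contains a space; if similar errors persist, the short-name trick from Update 1 applies here as well.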
