2015-10-20

I am simply trying to launch the spark-shell on my local Windows 8 machine, and here is the error message I get: Spark 1.5.1 spark-shell throws a RuntimeException

java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: 
rw-rw-rw- 
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522) 
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171) 
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162) 
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160) 
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) 
    at java.lang.reflect.Constructor.newInstance(Unknown Source) 
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028) 
    at $iwC$$iwC.<init>(<console>:9) 
    at $iwC.<init>(<console>:18) 
    at <init>(<console>:20) 
    at .<init>(<console>:24) 
    at .<clinit>(<console>) 
    at .<init>(<console>:7) 
    at .<clinit>(<console>) 
    at $print(<console>) 

Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw- 
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612) 
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554) 
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508) 
    ... 56 more 

Somehow the REPL does come up, but I cannot use sqlContext..

Has anyone run into this problem before? Any answer would help, thanks.
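(For readers hitting the same error: on Windows, Hive's `SessionState.start` checks POSIX-style permissions on the scratch directory, which resolves to `\tmp\hive` on the drive you launch from. A common workaround is to grant it full permissions using the Hadoop `winutils.exe` binary; this sketch assumes you have a `winutils.exe` matching your Hadoop build on your PATH, and that `\tmp\hive` is on the current drive.)

```shell
REM Inspect the POSIX-style permissions Hive sees for the scratch dir
winutils.exe ls \tmp\hive

REM Grant rwxrwxrwx so SessionState.start no longer rejects rw-rw-rw-
winutils.exe chmod 777 \tmp\hive
```

After this, relaunching spark-shell from the same drive should let the HiveContext initialize.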


It says '/tmp/hive on HDFS should be writable'. Do you even have such a directory? That looks like a UNIX-style path. – tuxdna


Indeed, I don't know about this directory; I guess it may have been created by hadoop when spark-shell was launched – Will


I don't think spark-shell invokes any hadoop-related functions unless you ask Spark to do so. You may want to describe the steps you took, and mention which Spark version you are using. – tuxdna

Answer