2016-02-15

I installed apache-spark on a single node. When I run spark-shell, I get the exception below. Despite the exception, I can still create RDDs and run Scala code snippets.

Here is the exception:

16/02/15 14:21:29 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException 
16/02/15 14:21:31 WARN : Your hostname, Rahul-PC resolves to a loopback/non-reachable address: fe80:0:0:0:c0c1:cd2e:990d:17ac%e 
java.lang.RuntimeException: java.lang.NullPointerException 
     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522) 
     at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171) 
     at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162) 
     at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160) 
     at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:408) 
     at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028) 
     at $iwC$$iwC.<init>(<console>:9) 
     at $iwC.<init>(<console>:18) 
     at <init>(<console>:20) 
     at .<init>(<console>:24) 
     at .<clinit>(<console>) 
     at .<init>(<console>:7) 
     at .<clinit>(<console>) 
     at $print(<console>) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 

JAVA_HOME is set to point to the correct JDK installation folder:

JAVA_HOME = C:\Program Files\Java\jdk1.8.0 

Is there anything else I need to do? Please advise.
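The two warnings in the trace each have a commonly suggested workaround on Windows. A minimal sketch, assuming the loopback-address warning and the Hive `NullPointerException` are the culprits; the `SPARK_LOCAL_IP` setting and the `winutils.exe` permission step are standard suggestions for these symptoms, not confirmed fixes for this exact setup (the `C:\hadoop` path is an assumption for illustration):

```shell
:: conf\spark-env.cmd -- loaded by spark-shell on Windows.
:: Bind the driver to an explicit address so the hostname no longer
:: resolves to the unreachable fe80:: loopback reported in the warning.
set SPARK_LOCAL_IP=127.0.0.1

:: The NullPointerException in SessionState.start is often caused on
:: Windows by a missing winutils.exe (HADOOP_HOME unset) or by wrong
:: permissions on \tmp\hive. Assuming winutils.exe is placed under
:: C:\hadoop\bin (hypothetical location):
set HADOOP_HOME=C:\hadoop
C:\hadoop\bin\winutils.exe chmod 777 \tmp\hive
```

If spark-shell still fails after this, the space in `C:\Program Files\Java\jdk1.8.0` is another frequent cause on Windows; pointing JAVA_HOME at a space-free JDK location (or at the 8.3 short form `C:\PROGRA~1\...`) is a common suggestion.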


Looks similar to this question: http://stackoverflow.com/q/32721647/1395437 –


When do you get this? At spark-shell startup, or when running some command? –

Answer
