2015-05-20

SPARK: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

I am running Hadoop 2.7.0, Hive 1.1.0 and Spark 1.3.1, with my metastore database in MySQL. I can create and view data from the Hive shell:

hive (dwhdb)> select * from dwhdb.test_sample;
OK
test_sample.emp_id	test_sample.emp_name	test_sample.emp_dept	test_sample.emp_sal
Eid1	EName1	EDept1	100.0
Eid2	EName2	EDept1	102.0
Eid3	EName3	EDept1	101.0
Eid4	EName4	EDept2	110.0
Eid5	EName5	EDept2	121.0
Eid6	EName6	EDept3	99.0
Time taken: 0.135 seconds, Fetched: 6 row(s)

However, when I try to select the same data from Spark, I get an error:

$ spark-shell
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.3.1
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_21)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
SQL context available as sqlContext.

scala> val sqlHContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlHContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@<hash>

scala> sqlHContext.sql("SELECT emp_id, emp_name from dwhdb.test_sample").collect().foreach(println)
 
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
    at org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:239)
    at org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:235)
    at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:251)
    at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:250)
    at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:95)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
    at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
    at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
    at $iwC$$iwC$$iwC.<init>(<console>:37)
    at $iwC$$iwC.<init>(<console>:39)
    at $iwC.<init>(<console>:41)
    at <init>(<console>:43)
    at .<init>(<console>:47)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
    at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
    at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
    ... 51 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
    ... 56 more
Caused by: java.lang.NumberFormatException: For input string: "600s"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Integer.parseInt(Integer.java:492)
    at java.lang.Integer.parseInt(Integer.java:527)
    at org.apache.hadoop.conf.Configuration.getInt(Configuration.java:1134)
    at org.apache.hadoop.hive.conf.HiveConf.getIntVar(HiveConf.java:1211)
    at org.apache.hadoop.hive.conf.HiveConf.getIntVar(HiveConf.java:1220)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:293)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:214)
    ... 61 more

Could you please let me know what the possible causes of this might be?

Answer


The log shows the string '600s' being parsed as an integer. Can you check where that value is set? It looks like it is in your hive-site.xml. Change it to '600'.
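As a sketch of the fix: the stack trace shows `HiveConf.getIntVar` calling `Integer.parseInt` from `HiveMetaStoreClient.open`, which points at a metastore client timeout setting. The exact property is not visible in the post, but a likely culprit is `hive.metastore.client.socket.timeout`, whose Hive 1.x default carries a time-unit suffix (`600s`) that the older Hive classes bundled with Spark 1.3.1 cannot parse. The hive-site.xml entry would then look like:

```xml
<!-- hive-site.xml (the one on Spark's classpath, e.g. in conf/).
     The Hive libraries shipped with Spark 1.3.1 call Integer.parseInt
     on this value, so it must be a bare number of seconds, no "s" suffix.
     Property name is an assumption inferred from the stack trace. -->
<property>
  <name>hive.metastore.client.socket.timeout</name>
  <value>600</value> <!-- was "600s" -->
</property>
```

To find the offending value regardless of which property holds it, something like `grep -rn '600s' $HIVE_HOME/conf/ $SPARK_HOME/conf/` should locate it.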
