Although I can successfully connect to HBase from spark-shell, Spark 1.4.0 fails to connect to HBase 1.1.0.1 with this error:
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.util.Addressing.getIpAddress()Ljava/net/InetAddress;
Does anyone know where the problem is?
The detailed error:
15/07/01 18:57:57 ERROR yarn.ApplicationMaster: User class threw exception: java.io.IOException: java.lang.reflect.InvocationTargetException
java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
at com.koudai.resys.tmp.HbaseLearning$.main(HbaseLearning.scala:22)
at com.koudai.resys.tmp.HbaseLearning.main(HbaseLearning.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:483)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
... 9 more
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.util.Addressing.getIpAddress()Ljava/net/InetAddress;
at org.apache.hadoop.hbase.client.ClientIdGenerator.getIpAddressBytes(ClientIdGenerator.java:83)
at org.apache.hadoop.hbase.client.ClientIdGenerator.generateClientId(ClientIdGenerator.java:43)
at org.apache.hadoop.hbase.client.PerClientRandomNonceGenerator.<init>(PerClientRandomNonceGenerator.java:37)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:682)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
... 14 more
The SBT configuration:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.4.0" % "provided"
libraryDependencies += "org.apache.hbase" % "hbase" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-hadoop2-compat" % "1.1.0.1"
The code being run:
val sc = new SparkConf().setAppName("[email protected]")
val conf = HBaseConfiguration.create()
conf.set("hbase.zookeeper.property.clientPort", "2181")
conf.set("hbase.zookeeper.quorum", "idc02-rs-sfa-10")
// the error is raised from here
val conn = ConnectionFactory.createConnection(conf)
Using reflection to list the methods of org.apache.hadoop.hbase.util.Addressing, I found that it is the HBase 0.94 version of the class. Where could it be coming from?
parsePort
createHostAndPortStr
createInetSocketAddressFromHostAndPortStr
getIpAddress
getIp4Address
getIp6Address
parseHostname
isLocalAddress
wait
wait
wait
equals
toString
hashCode
getClass
notify
notifyAll
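The method list above was produced with plain Java reflection. A minimal sketch of how such a listing can be reproduced (the object name `ListMethods` is made up for illustration, and `java.lang.Object` stands in for `org.apache.hadoop.hbase.util.Addressing`, which would need the HBase jar on the classpath):

```scala
object ListMethods {
  // Return the distinct public method names of a class, sorted.
  // getMethods includes inherited java.lang.Object methods, which is
  // why names like wait/equals/hashCode appear in the listing above.
  def methodNames(cls: Class[_]): Seq[String] =
    cls.getMethods.map(_.getName).distinct.sorted.toSeq

  def main(args: Array[String]): Unit =
    methodNames(classOf[Object]).foreach(println)
}
```

To inspect the actual HBase class, the same call would be made on `Class.forName("org.apache.hadoop.hbase.util.Addressing")` at runtime, against the real application classpath.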
You only need hbase-client in your application. Why are you including hbase-server and hbase-common? Are you using CDH or HDP, and which distribution version? Also, what is the application's classpath at runtime? Are any stale HBase libraries from an existing installation referenced there? –
Thanks. 1. hbase-common is also needed to pass the tests. 2. I am using the vanilla Apache release 2.5.2. 3. The classpath already includes the Hadoop libs. 4. You are right: digging through the classpath revealed that someone had accidentally put an old HBase lib (0.94) into the Hadoop libraries, which polluted the classpath. – Vanjor
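For anyone hitting a similar `NoSuchMethodError`, this kind of classpath pollution can be tracked down by asking the JVM where a class was actually loaded from. A hedged sketch (the object name `WhichJar` is made up for illustration; `scala.Option` is used only as a runnable stand-in for the HBase class):

```scala
object WhichJar {
  // Report the jar or directory a class was loaded from.
  // Classes loaded by the bootstrap classloader have no CodeSource.
  def whichJar(className: String): String = {
    val cls = Class.forName(className)
    val src = cls.getProtectionDomain.getCodeSource
    if (src != null) src.getLocation.toString else "(bootstrap classloader)"
  }

  def main(args: Array[String]): Unit =
    println(whichJar("scala.Option"))
}
```

Running this inside the Spark job with `"org.apache.hadoop.hbase.util.Addressing"` would have pointed directly at the stale 0.94 jar in the Hadoop lib directory.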