I am working with Hadoop/Hive. I have installed Hadoop and Hive, and both run fine from the command prompt. I have also set up a MySQL metastore for Hive: the database name HIVE-DB is defined in hive-site.xml, and a database of the same name, HIVE-DB, exists in MySQL. However, tables created from the Hive command prompt are not visible from the MySQL command prompt. Also, when I try to open a Hive JDBC connection I get the error below. First, here is my program for establishing the JDBC connection:
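For reference, my metastore settings in hive-site.xml follow this general shape (the MySQL host, port, user, and password shown here are placeholders, not my actual values):

```xml
<!-- JDBC connection string for the metastore database in MySQL -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/HIVE-DB?createDatabaseIfNotExist=true</value>
</property>
<!-- MySQL JDBC driver class -->
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<!-- Metastore database credentials (placeholders) -->
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepass</value>
</property>
```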
package aa;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class Main {
    // Hive 0.10 JDBC driver for the original HiveServer (Thrift) service
    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    public static void main(String[] args) {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }
        try {
            // HiveServer was started on port 10001 (see below)
            Connection con = DriverManager.getConnection(
                    "jdbc:hive://localhost:10001/default", "", "");
            Statement stmt = con.createStatement();
            String tableName = "recordssss";
            // Note the spaces around the table name: without them the
            // statement would read "create tablerecordssss(...)".
            // execute() is used because CREATE TABLE returns no result set.
            stmt.execute("create table " + tableName + " (id int, name string)");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Then the following error is displayed. I have started Hive as a server, i.e.
**$HIVE_HOME/bin/hive --service hiveserver -p 10001**
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hive.service.ThriftHive$Client.sendBase(Ljava/lang/String;Lorg/apache/thrift/TBase;)V
at org.apache.hadoop.hive.service.ThriftHive$Client.send_execute(ThriftHive.java:110)
at org.apache.hadoop.hive.service.ThriftHive$Client.execute(ThriftHive.java:102)
at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:187)
at org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:127)
at org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:126)
at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:121)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
at java.sql.DriverManager.getConnection(DriverManager.java:620)
at java.sql.DriverManager.getConnection(DriverManager.java:200)
at aa.Main.main(Main.java:25)
I have described the problem above; any help would be much appreciated.
I am using Hive 0.10 and Hadoop 1.1.2, and all the Hive jar files in Eclipse are version 0.10 –
commons-logging-1.1.3.jar, hadoop-core-1.1.2.jar, hive-exec-0.10.0.jar, hive-jdbc-0.10.0.jar, hive-metastore-0.10.0.jar, hive-service-0.10.0.jar, libfb303.jar, slf4j-api-1.6.1.jar, slf4j-log4j12-1.6.1.jar, log4j-1.2.16.jar. These are all the jars I have added to the project's build path... –
@Charnjeet I think the hive-0.10.0 distribution ships libfb303-0.9.0.jar. Please use libfb303-0.9.0.jar instead of libfb303.jar, and also include libthrift-0.9.0.jar from HIVE_HOME/lib. I have a feeling the problem lies with these two jars (libthrift-0.9.0.jar and libfb303.jar). Play with those two jars and hopefully that will fix it. Good luck!!! – nJn
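A sketch of assembling the client classpath with the suggested jar swap (the HIVE_HOME path is an assumption; adjust it to your installation, and add hadoop-core-1.1.2.jar and the logging jars the same way):

```shell
# Build a classpath using the versioned Thrift jars from HIVE_HOME/lib
# instead of the unversioned libfb303.jar.
HIVE_LIB="${HIVE_HOME:-/usr/local/hive}/lib"

JARS="hive-jdbc-0.10.0.jar hive-exec-0.10.0.jar hive-metastore-0.10.0.jar"
JARS="$JARS hive-service-0.10.0.jar libthrift-0.9.0.jar libfb303-0.9.0.jar"

CP="."
for j in $JARS; do
  CP="$CP:$HIVE_LIB/$j"
done

echo "$CP"
# Then run the program with:  java -cp "$CP" aa.Main
```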