  1. I have an Ambari HDP 2.5.0 cluster; a Sqoop import of data into Hive throws the error org.apache.sqoop.hive.HiveConfig (details below).

  2. HUE 3.10 is installed, with hue.ini fully configured.

My problem: when Sqoop syncs data from MySQL into Hive, it throws this exception:

[main] ERROR org.apache.sqoop.hive.HiveConfig – Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly. 

[main] ERROR org.apache.sqoop.hive.HiveConfig – Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly. 

[main] ERROR org.apache.sqoop.tool.ImportTool – Encountered IOException running import job: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf 

    at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50) 
    at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:397) 
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:384) 
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:342) 
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:246) 
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:524) 
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615) 
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147) 
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) 
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183) 
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225) 
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234) 
    at org.apache.sqoop.Sqoop.main(Sqoop.java:243) 
    at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:202) 
    at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:182) 
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:51) 
    at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:48) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:242) 
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54) 
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162) 

However, the same Sqoop script works when executed directly on the command line!
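
For reference, a command-line import that works typically looks like the sketch below; the original script is not shown in the question, so the connection string, credentials, and table names here are placeholders:

    sqoop import \
      --connect jdbc:mysql://mysql-host:3306/mydb \
      --username myuser \
      --password-file /user/me/mysql.pwd \
      --table mytable \
      --hive-import \
      --hive-table mytable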

I added the environment variable HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/current/hive-client/lib to /etc/profile. It still does not work. I have spent quite some time trying to solve this problem, but could not fix it myself.
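
A likely reason the /etc/profile change has no effect: /etc/profile is only sourced by login shells, while Oozie runs the Sqoop action inside a YARN container that never sources it. To check what the Hadoop tools on a node actually resolve, a quick sketch (the hive-exec.jar path assumes the standard HDP layout and may differ on your nodes):

    # Show the classpath Hadoop commands use on this node
    hadoop classpath | tr ':' '\n' | grep -i hive

    # HiveConf is bundled inside the shaded hive-exec.jar; confirm it is really there
    unzip -l /usr/hdp/current/hive-client/lib/hive-exec.jar | grep 'hive/conf/HiveConf'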

The script at /usr/hdp/2.5.0.0-1245/hive/bin/hive is shown below. Does it make ${HADOOP_CLASSPATH} point to /usr/hdp/2.5.0.0-1245/atlas/hook/hive/*?

    #!/bin/bash

    if [ -d "/usr/hdp/2.5.0.0-1245/atlas/hook/hive" ]; then
        if [ -z "${HADOOP_CLASSPATH}" ]; then
            export HADOOP_CLASSPATH=/usr/hdp/2.5.0.0-1245/atlas/hook/hive/*
        else
            export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:/usr/hdp/2.5.0.0-1245/atlas/hook/hive/*
        fi
    fi

    BIGTOP_DEFAULTS_DIR=${BIGTOP_DEFAULTS_DIR-/etc/default}
    [ -n "${BIGTOP_DEFAULTS_DIR}" -a -r ${BIGTOP_DEFAULTS_DIR}/hbase ] && . ${BIGTOP_DEFAULTS_DIR}/hbase

    export HIVE_HOME=${HIVE_HOME:-/usr/hdp/2.5.0.0-1245/hive}
    export HADOOP_HOME=${HADOOP_HOME:-/usr/hdp/2.5.0.0-1245/hadoop}
    export ATLAS_HOME=${ATLAS_HOME:-/usr/hdp/2.5.0.0-1245/atlas}

    HCATALOG_JAR_PATH=/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/hcatalog/hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/hcatalog/hive-hcatalog-server-extensions-1.2.1000.2.5.0.0-1245.jar:/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/webhcat/java-client/hive-webhcat-java-client-1.2.1000.2.5.0.0-1245.jar

    if [ -z "${HADOOP_CLASSPATH}" ]; then
        export HADOOP_CLASSPATH=${HCATALOG_JAR_PATH}
    else
        export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:${HCATALOG_JAR_PATH}
    fi

    exec "${HIVE_HOME}/bin/hive.distro" "$@"

How can this problem be fixed?


Same problem here. Did you ever find a solution? – Petro

Answers


For me, this problem showed up when running the job from the Ambari Workflow Editor. To fix it, create a symlink on every Sqoop client node pointing at the hive-exec.jar that lives in the Hive client's lib directory, and copy it into Sqoop's own lib directory. Then put hive-exec.jar into the Oozie shared lib folders on HDFS:

su root
cd /usr/hdp/current/sqoop-client/
# Link hive-exec.jar out of the Hive client's lib into Sqoop's directories
ln -s /usr/hdp/current/hive-client/lib/hive-exec.jar hive-exec.jar
cp hive-exec.jar lib/

# As the hdfs user, add the jar to the Oozie shared libs on HDFS
# (use the absolute path, since su -l changes the working directory)
su -l hdfs
hdfs dfs -put /usr/hdp/current/sqoop-client/hive-exec.jar /user/oozie/share/lib/sqoop
hdfs dfs -put /usr/hdp/current/sqoop-client/hive-exec.jar /user/oozie/share/lib/lib_20161117191926/sqoop
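
After changing anything under /user/oozie/share/lib, the Oozie server usually has to rescan the sharelib before new jars are picked up (or the server restarted); a sketch, with the server URL as a placeholder for your cluster:

    # Tell Oozie to reload the shared libraries
    oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate

    # Verify that hive-exec.jar now shows up in the sqoop sharelib
    oozie admin -oozie http://oozie-host:11000/oozie -shareliblist sqoop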