2016-12-27

Running:

[cloudera@quickstart ~]$ sqoop export --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" --username retail_dba --password cloudera --table department_export --export-dir /home/cloudera/sqoop_import/departments -m 12

While running this sqoop export, it reports needing additional blocks in HDFS.

Error:

16/12/24 22:29:48 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/12/24 22:29:49 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482646432089_0001
16/12/24 22:29:49 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot delete /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482646432089_0001. Name node is in safe mode. The reported blocks 1268 needs additional 39 blocks to reach the threshold 0.9990 of total blocks 1308. The number of live datanodes 1 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1446)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:4072)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:4030)

I tried "hdfs dfsadmin -safemode leave", and got an error again:

16/12/24 10:37:59 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/12/24 10:38:00 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482602419946_0007
16/12/24 10:38:00 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot delete /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482602419946_0007. Name node is in safe mode. It was turned on manually. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.
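The second log says safe mode was "turned on manually", so the `leave` command may simply not have taken effect yet (or was run against a different NameNode). A minimal sketch for verifying the state before re-running the export, assuming `hdfs` is on the PATH and the NameNode is reachable:

```shell
# Ask the NameNode for its current safe mode state; prints e.g. "Safe mode is ON"
hdfs dfsadmin -safemode get

# If it is still ON, force it off...
hdfs dfsadmin -safemode leave

# ...or, for the first error (blocks still being reported), it is often safer to
# just block until the NameNode leaves safe mode on its own:
hdfs dfsadmin -safemode wait

# Confirm it now reports "Safe mode is OFF" before retrying the sqoop export
hdfs dfsadmin -safemode get
```

`-safemode wait` is useful for the first error, where the NameNode was counting reported blocks after a restart and would have exited safe mode automatically once the 0.9990 threshold was reached.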

Answer


Make sure you have correctly set the HCAT_HOME environment variable for the Sqoop runtime. The error you are getting occurs because Sqoop cannot find the required "org.apache.hive.hcatalog*" dependencies, which are available under the HCatalog installation.
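A minimal sketch of setting the variable before invoking sqoop; the path below is an assumption (check where HCatalog is actually installed on your machine, e.g. with `ls /usr/lib | grep hcatalog` on the quickstart VM):

```shell
# Hypothetical install location -- adjust to wherever HCatalog lives on your box
export HCAT_HOME=/usr/lib/hive-hcatalog
export PATH="$PATH:$HCAT_HOME/bin"
```

Adding these lines to ~/.bashrc makes the setting survive new shell sessions, so the export job does not fail again after a reboot of the VM.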