2016-08-24 71 views

1) I have created a SQL file that gathers data from two different Hive tables and inserts it into a single Hive table. It fails with HiveException: Failed to create spark client.

2) We invoke that SQL file from a shell script.

3) Sample Spark settings:

SET hive.execution.engine=spark; 
SET spark.master=yarn-cluster; 
SET spark.app.name="ABC_${hiveconf:PRC_DT}_${hiveconf:JOB_ID}"; 
--SET spark.driver.memory=8g; 
--SET spark.executor.memory=8g; 
SET hive.exec.dynamic.partition.mode=nonstrict; 
SET hive.stats.fetch.column.stats=true; 
SET hive.optimize.index.filter=true; 
SET hive.map.aggr=true; 
SET hive.exec.parallel=true; 
SET spark.executor.cores=5; 
SET hive.prewarm.enabled=true; 
SET hive.spark.client.future.timeout=900; 
SET hive.spark.client.server.connect.timeout=100000; 

4) Sample Hive query:

insert OVERWRITE table ABC (a,b,c) 
select * from ${hiveconf:SCHEMA_NAME}.${hiveconf:TABLE_NAME} 
where JOB_ID = '${hiveconf:JOB_ID}' 
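
To make the parameterization concrete, here is a minimal sketch (plain shell, with made-up values standing in for the real `--hiveconf` arguments) of what the `${hiveconf:...}` placeholders expand to once the flags from the shell script in step 5 are applied:

```shell
# Hypothetical values standing in for the real --hiveconf arguments
SCHEMA_NAME=ABC
TABLE_NAME=AB1
JOB_ID=J_001

# The statement Hive actually runs after ${hiveconf:...} substitution
QUERY="insert OVERWRITE table ABC (a,b,c)
select * from ${SCHEMA_NAME}.${TABLE_NAME}
where JOB_ID = '${JOB_ID}'"
echo "$QUERY"
```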

5) Sample shell script:

hive -f $PARENTDIR/sql/test.sql --hiveconf SCHEMA_NAME=ABC --hiveconf TABLE_NAME=AB1 --hiveconf PRC_DT=${PRC_DT} --hiveconf JOB_ID=${JOB_ID} 
hive -f $PARENTDIR/sql/test.sql --hiveconf SCHEMA_NAME=ABC --hiveconf TABLE_NAME=AB2 --hiveconf PRC_DT=${PRC_DT} --hiveconf JOB_ID=${JOB_ID} 

Error: 
2016-08-24 17:30:05,651 WARN [main] mapreduce.TableMapReduceUtil: The hbase-prefix-tree module jar containing PrefixTreeCodec is not present. Continuing without it. 

Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/jars/hive-common-1.1.0-cdh5.7.2.jar!/hive-log4j.properties 
FAILED: SemanticException Failed to get a spark session: org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client. 

Put the commands and the error message in a '''code block'''; it will help make your question easier to understand :) –


@sel-fish I'm new to this, still learning the ropes –

Answer


It is erroring out because an ApplicationMaster is not being allocated before the timeout expires. Increase the following parameter (the default is 90000 ms; you must set it above 100000 ms):

set hive.spark.client.server.connect.timeout=300000ms;
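
As an alternative, the same setting can be passed on the hive command line so it is in effect before the session starts. This is only a sketch reusing the script variables from step 5 of the question, not a tested fix:

```shell
hive --hiveconf hive.spark.client.server.connect.timeout=300000ms \
  -f $PARENTDIR/sql/test.sql \
  --hiveconf SCHEMA_NAME=ABC --hiveconf TABLE_NAME=AB1 \
  --hiveconf PRC_DT=${PRC_DT} --hiveconf JOB_ID=${JOB_ID}
```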