I am running into a problem during the connection. How do I connect Spark with HBase?
scala> val results = sql("SELECT * FROM tablename");
results: org.apache.spark.sql.DataFrame = [hbid: string, matrix_col: string, matrix_value_col: double, country_col: string]
scala> results.show();
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions: Tue Jul 25 10:03:27 SGT 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68549: row 'nrd_app_spt:capacity_new,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=x01shdpeapp3a.sgp.dbs.com,60020,1500273862255, seqNum=0
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68549: row 'nrd_app_spt:capacity_new,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=x01shdpeapp3a.sgp.dbs.com,60020,1500273862255, seqNum=0
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to x01shdpeapp3a.sgp.dbs.com/10.92.139.145:60020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to x01shdpeapp3a.sgp.dbs.com/10.92.139.145:60020 is closing. Call id=9, waitTime=3
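For reference, a minimal sketch of reading an HBase table into Spark with `newAPIHadoopRDD` and `TableInputFormat`; the table name is taken from the stack trace above, but the ZooKeeper hosts are placeholders you would replace with your cluster's values:

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.sql.SparkSession

object HBaseReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("hbase-read").getOrCreate()

    // HBase client configuration: the quorum and port below are
    // placeholders, not values from the original post.
    val conf = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.quorum", "zk-host1,zk-host2")
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.set(TableInputFormat.INPUT_TABLE, "nrd_app_spt:capacity_new")

    // Each record is a (row key, Result) pair scanned from the table.
    val rdd = spark.sparkContext.newAPIHadoopRDD(
      conf,
      classOf[TableInputFormat],
      classOf[ImmutableBytesWritable],
      classOf[Result])

    // Decode the row keys to strings and print a small sample.
    rdd.map { case (_, result) => Bytes.toString(result.getRow) }
      .take(10)
      .foreach(println)
  }
}
```

If the cluster is unreachable, this code fails with exactly the kind of `RetriesExhaustedException` shown above, so a timeout here usually points at connectivity or configuration rather than the Scala code itself.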
I want to read the table in HBase and apply some built-in functions. Edit: the problem was resolved by changing the following property in hbase-site.xml.
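The post does not preserve which property was actually changed. Purely for illustration, these are client-side timeout settings in hbase-site.xml that are commonly raised when a `SocketTimeoutException` with `callTimeout=60000` appears; they may or may not be the one the author edited:

```xml
<!-- Illustrative only: the original post does not name the property. -->
<property>
  <name>hbase.rpc.timeout</name>
  <value>120000</value>
</property>
<property>
  <name>hbase.client.operation.timeout</name>
  <value>120000</value>
</property>
```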
Resolved by the change. Is Scala enough to write the queries? –
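On that last comment: yes, Spark SQL queries can be written entirely from Scala by registering a DataFrame as a temporary view and querying it with `spark.sql`. A self-contained sketch using made-up rows with the column names from the output above:

```scala
import org.apache.spark.sql.SparkSession

object QuerySketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("query-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Stand-in for the DataFrame loaded from HBase; the rows are invented.
    val results = Seq(
      ("row1", "m1", 1.5, "SG"),
      ("row2", "m2", 2.5, "SG")
    ).toDF("hbid", "matrix_col", "matrix_value_col", "country_col")

    // Register the DataFrame so it can be queried with plain SQL.
    results.createOrReplaceTempView("tablename")
    spark.sql(
      "SELECT country_col, avg(matrix_value_col) AS avg_value " +
      "FROM tablename GROUP BY country_col").show()

    spark.stop()
  }
}
```

So once the connection problem is fixed, the Scala shell alone is sufficient for writing and running the queries.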