HBase bulk load - MapReduce job fails

I have a MapReduce job that bulk loads into HBase. The job converts the data into HFiles and loads them into HBase, but after a certain map percentage the job fails. Below is the exception I get.

Error: java.io.FileNotFoundException: /var/mapr/local/tm4/mapred/nodeManager/spill/job_1433110149357_0005/attempt_1433110149357_0005_m_000000_0/spill83.out.index 
    at org.apache.hadoop.fs.RawLocalFileSystem.open(RawLocalFileSystem.java:198) 
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:800) 
    at org.apache.hadoop.io.SecureIOUtils.openFSDataInputStream(SecureIOUtils.java:156) 
    at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:74) 
    at org.apache.hadoop.mapred.MapRFsOutputBuffer.mergeParts(MapRFsOutputBuffer.java:1382) 
    at org.apache.hadoop.mapred.MapRFsOutputBuffer.flush(MapRFsOutputBuffer.java:1627) 
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:709) 
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:779) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:345) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1566) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163) 
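
For context, the driver is wired up roughly like the minimal sketch below; the mapper logic, the table name "my_table" and the column family "cf" are placeholders rather than my exact code, but the structure is the same: the map output goes through HFileOutputFormat2.configureIncrementalLoad so the generated HFiles line up with the region boundaries.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class BulkLoadDriver {

    // Placeholder mapper: parses a CSV line and emits (row key, Put)
    public static class BulkLoadMapper
            extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            byte[] row = Bytes.toBytes(fields[0]);
            Put put = new Put(row);
            put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(fields[1]));
            context.write(new ImmutableBytesWritable(row), put);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "hbase-bulk-load");
        job.setJarByClass(BulkLoadDriver.class);

        job.setMapperClass(BulkLoadMapper.class);
        job.setMapOutputKeyClass(ImmutableBytesWritable.class);
        job.setMapOutputValueClass(Put.class);

        job.setInputFormatClass(TextInputFormat.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        // Directory where the generated HFiles are written before bulk loading
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Sets the partitioner/reducer so each HFile matches a region boundary
        HTable table = new HTable(conf, "my_table");   // placeholder table name
        HFileOutputFormat2.configureIncrementalLoad(job, table);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}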

I noticed that the job runs fine on small data sets, but once the data grows the job starts to fail.

Please let me know if anyone has run into this issue before.

Thanks

Answer