2017-07-16 44 views

I am writing an MRJob and want to partition the reducer output based on the key. I am using the options below and getting the following error. How do I use KeyFieldBasedPartitioner? Do I need to download something for this? The MRJob is written in Python.

-partitioner: class not found: org.apache.Hadoop.mapred.lib.KeyFieldBasedPartitioner

Step 1 of 1 failed: Command '['hadoop', 'jar', 
    '/usr/lib/hadoop-mapreduce/hadoop-streaming.jar', '-files', 
    'hdfs://hdpb-dfs/tmp/20170716.162009.525122/files/abc.py#parsec_status_error_fedactivity.py,hdfs://hdpb-dfs/tmp/20170716.162009.525122/files/setup-wrapper.sh#setup-wrapper.sh', 
    '-archives', 
    'hdfs://hdpb-dfs/tmp/20170716.162009.525122/files/mrjob.tar.gz#mrjob.tar.gz', 
    '-D', 'mapreduce.job.name=abc', '-D', 'mapreduce.job.reduces=2', 
    '-D', 'mapreduce.job.split.metainfo.maxsize=-1', '-D', 
    'mapreduce.map.failures.maxpercent=1', '-D', 
    'mapreduce.map.java.opts=-Xmx1g', '-D', 
    'mapreduce.map.memory.mb=2048', '-D', 
    'mapreduce.output.fileoutputformat.compress=true', '-D', 
    'mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.GzipCodec', 
    '-D', 'mapreduce.partition.keypartitioner.options=-k1', '-D', 
    'mapreduce.reduce.java.opts=-Xmx2g', '-D', 
    'mapreduce.reduce.memory.mb=3072', '-D', 
    'mapreduce.reduce.shuffle.input.buffer.percent=0.4', '-D', 
    'mapreduce.reduce.shuffle.merge.percent=0.4', '-D', 
    'stream.map.input.ignoreKey=true', '-D', 
    'stream.num.map.output.key.fields=5', '-libjars', 
    '/opt/parsec/lib/correctionlayer2.jar', '-partitioner', 
    'org.apache.Hadoop.mapred.lib.KeyFieldBasedPartitioner', '-input', 
    'hdfs:////10.134.71.100.1500076800077.gz', '-output', 
    'hdfs:///20170715', '-mapper', 'sh -ex setup-wrapper.sh python abc.py 
    --step-num=0 --mapper', '-reducer', 'sh -ex setup-wrapper.sh python abc.py --step-num=0 --reducer']' returned non-zero exit status 256

Running step 1 of 1...

-partitioner: class not found: org.apache.Hadoop.mapred.lib.KeyFieldBasedPartitioner

Try -help for more information

Streaming Command Failed!

Answer


I had to use a lowercase "h" in '--partitioner', 'org.apache.hadoop.mapred.lib.KeyFieldBasedPartitioner', and that did the trick. Java class names are case-sensitive, and the package is org.apache.hadoop, not org.apache.Hadoop.
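To illustrate the case-sensitivity issue, here is a small sketch (a hypothetical helper, not part of mrjob or Hadoop) that normalizes the common "org.apache.Hadoop" typo in a hadoop-streaming argument list before it is handed to the `hadoop jar` command:

```python
def fix_partitioner_case(args):
    """Return a copy of a hadoop-streaming argument list with the
    common 'org.apache.Hadoop' capitalization typo corrected in the
    value that follows '-partitioner'."""
    fixed = list(args)
    for i, arg in enumerate(fixed):
        if arg == '-partitioner' and i + 1 < len(fixed):
            # Java class names are case-sensitive; the package is
            # "org.apache.hadoop", never "org.apache.Hadoop".
            fixed[i + 1] = fixed[i + 1].replace(
                'org.apache.Hadoop.', 'org.apache.hadoop.')
    return fixed

args = ['-partitioner',
        'org.apache.Hadoop.mapred.lib.KeyFieldBasedPartitioner']
print(fix_partitioner_case(args)[1])
# org.apache.hadoop.mapred.lib.KeyFieldBasedPartitioner
```

With the corrected class name, hadoop-streaming can load KeyFieldBasedPartitioner from its own jar, so nothing extra needs to be downloaded.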