2015-06-06

I'm trying to run a Hadoop MapReduce job, but it fails with an error. I hit this while working through a Hadoop exercise and can't figure out why.

    hadoop jar BWC11.jar WordCountDriver "/home/training/training_material/data/shakespeare/comedies" "/home/training/training_material/data/shakespeare/AWL"
    Warning: $HADOOP_HOME is deprecated.

    Exception in thread "main" java.lang.NoClassDefFoundError: WordCountDriver (wrong name: com/felix/hadoop/training/WordCountDriver)
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:410)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
    [training@localhost BasicWordCount]$

Can someone help me figure this out?

Driver code:

package com.felix.hadoop.training; 

import org.apache.hadoop.conf.Configured; 
import org.apache.hadoop.fs.Path; 
import org.apache.hadoop.io.IntWritable; 
import org.apache.hadoop.io.Text; 
import org.apache.hadoop.mapreduce.Job; 
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat; 
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat; 
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat; 
import org.apache.hadoop.util.Tool; 
import org.apache.hadoop.util.ToolRunner; 


public class WordCountDriver extends Configured implements Tool {

    public static void main(String[] args) throws Exception {
        ToolRunner.run(new WordCountDriver(), args);
    }

    @Override
    public int run(String[] args) throws Exception {

        Job job = new Job(getConf(), "Basic Word Count Job");
        job.setJarByClass(WordCountDriver.class);

        job.setMapperClass(WordCountMapper.class);
        job.setReducerClass(WordCountReducer.class);

        job.setInputFormatClass(TextInputFormat.class);

        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        job.setNumReduceTasks(1);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.waitForCompletion(true);

        return 0;
    }
}

Mapper code:

package com.felix.hadoop.training; 

import java.io.IOException; 

import org.apache.hadoop.io.IntWritable; 
import org.apache.hadoop.io.LongWritable; 
import org.apache.hadoop.io.Text; 
import org.apache.hadoop.mapreduce.Mapper; 
/** 
* 
* @author training 
* Class : WordCountMapper 
* 
*/ 

public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    /**
     * Optimization: instead of creating the output Text/IntWritable objects
     * inside map() for every record, they could be created once and reused.
     */

    @Override
    public void map(LongWritable inputKey, Text inputVal, Context context)
            throws IOException, InterruptedException {
        String line = inputVal.toString();
        String[] splits = line.trim().split("\\W+");
        for (String outputKey : splits) {
            context.write(new Text(outputKey), new IntWritable(1));
        }
    }
}

Reducer code:

package com.felix.hadoop.training; 
import java.io.IOException; 

import org.apache.hadoop.io.IntWritable; 
import org.apache.hadoop.io.Text; 
import org.apache.hadoop.mapreduce.Reducer; 


public class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    public void reduce(Text key, Iterable<IntWritable> listOfValues, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable val : listOfValues) {
            sum = sum + val.get();
        }
        context.write(key, new IntWritable(sum));
    }
}

I don't know why I'm getting this error. I've already tried adding the class path, copying the class files to where the .jar file is, and so on, but to no avail.


Possible duplicate of [How to solve java.lang.NoClassDefFoundError?](http://stackoverflow.com/questions/17973970/how-to-solve-java-lang-noclassdeffounderror) – GabrielOshiro

Answer

Add the package name "com.felix.hadoop.training" before "WordCountDriver" on the command line. The class is declared in that package, so its binary name is `com.felix.hadoop.training.WordCountDriver`, and that is the name you must pass to `hadoop jar`:

    hadoop jar BWC11.jar com.felix.hadoop.training.WordCountDriver "/home/training/training_material/data/shakespeare/comedies" "/home/training/training_material/data/shakespeare/AWL"
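The reason is that Hadoop's `RunJar` (visible at the bottom of the stack trace) loads the driver with `Class.forName`, and a class declared in a package can only be resolved by its fully qualified binary name. A minimal standalone sketch of that class-loader behavior, using `java.util.ArrayList` as a stand-in for the packaged driver class:

```java
// Demonstrates that Class.forName requires the binary (package-qualified)
// class name, which is why "hadoop jar ... WordCountDriver" fails for a
// class that lives in com.felix.hadoop.training.
public class BinaryNameDemo {
    public static void main(String[] args) throws Exception {
        // Loading by the fully qualified name succeeds.
        Class<?> ok = Class.forName("java.util.ArrayList");
        System.out.println("loaded: " + ok.getName());

        // Loading by the bare class name fails, just like passing
        // "WordCountDriver" without its package to hadoop jar.
        try {
            Class.forName("ArrayList");
        } catch (ClassNotFoundException e) {
            System.out.println("not found: " + e.getMessage());
        }
    }
}
```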