
hadoop jar error when copying data from MongoDB to HDFS

I am trying to copy a collection from MongoDB to HDFS with the MongoDB Hadoop connector, using the code below (the class lives in package hdfs), but I get a hadoop jar error when copying the data from MongoDB to HDFS.

package hdfs;

import java.io.*;
import org.apache.commons.logging.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.lib.output.*;
import org.apache.hadoop.mapreduce.*;
import org.bson.*;
import com.mongodb.hadoop.*;
import com.mongodb.hadoop.util.*;

public class ImportWeblogsFromMongo {
    private static final Log log = LogFactory.getLog(ImportWeblogsFromMongo.class);

    // Map-only job: each MongoDB document arrives as a BSONObject value.
    public static class ReadWeblogsFromMongo extends Mapper<Object, BSONObject, Text, Text> {
        public void map(Object key, BSONObject value, Context context) throws IOException, InterruptedException {
            System.out.println("Key: " + key);
            System.out.println("Value: " + value);
            String md5 = value.get("md5").toString();
            String url = value.get("url").toString();
            String date = value.get("date").toString();
            String time = value.get("time").toString();
            String ip = value.get("ip").toString();
            String output = "\t" + url + "\t" + date + "\t" + time + "\t" + ip;
            context.write(new Text(md5), new Text(output));
        }
    }

    public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        // Read from the fish collection of the clusterdb database.
        MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/clusterdb.fish");
        MongoConfigUtil.setCreateInputSplits(conf, false);
        System.out.println("Configuration: " + conf);
        @SuppressWarnings("deprecation")
        final Job job = new Job(conf, "Mongo Import");
        Path out = new Path("/home/mongo_import");
        FileOutputFormat.setOutputPath(job, out);
        job.setJarByClass(ImportWeblogsFromMongo.class);
        job.setMapperClass(ReadWeblogsFromMongo.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(MongoInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setNumReduceTasks(0); // map-only: no reducers
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

After exporting the jar as importmongo.jar, I tried to run hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo, but I got the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/hadoop/util/MongoConfigUtil 
    at hdfs.ImportWeblogsFromMongo.main(ImportWeblogsFromMongo.java:33) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.util.MongoConfigUtil 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
    ... 7 more 

NB: clusterdb is the database name, fish is its collection, and hdfs.ImportWeblogsFromMongo is package.class.

Any suggestions, please?
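
The NoClassDefFoundError above means the mongo-hadoop connector jar (and the MongoDB Java driver it depends on) is missing from the classpath at run time. A minimal sketch of one common fix, assuming the two jars sit under /usr/local/lib (these paths and file names are placeholders, not the asker's actual locations):

    # Make the connector visible to the client JVM that runs main()
    export HADOOP_CLASSPATH=/usr/local/lib/mongo-hadoop-core.jar:/usr/local/lib/mongo-java-driver.jar

    # -libjars ships the same jars to the map tasks (arguments go after the class name)
    hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo \
        -libjars /usr/local/lib/mongo-hadoop-core.jar,/usr/local/lib/mongo-java-driver.jar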


Have you tried hadoop jar /home/yass/importmongo.jar -libjars HADOOP_MONGODB_JAR hdfs.ImportWeblogsFromMongo? – tk421


Yes, and I got Exception in thread "main" java.lang.ClassNotFoundException: -libjars – user3623460


Sorry, I meant hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo -libjars HADOOP_MONGODB_JAR. See https://stackoverflow.com/questions/13095402/hadoop-libjars-and-classnotfoundexception or https://stackoverflow.com/questions/6890087/problem-with-libjars-in-hadoop for the correct -libjars syntax. – tk421
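
A note on why -libjars can still be silently ignored even with the corrected argument order: Hadoop only honors generic options such as -libjars when the driver parses them through GenericOptionsParser, which the main() above never does. A minimal sketch of the driver rewritten around ToolRunner, reusing the ReadWeblogsFromMongo mapper from the question (the class name ImportWeblogsFromMongoTool is invented for illustration; the URI and output path are the question's own values):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
import com.mongodb.hadoop.MongoInputFormat;
import com.mongodb.hadoop.util.MongoConfigUtil;

public class ImportWeblogsFromMongoTool extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        // getConf() already reflects -libjars and the other generic options
        Configuration conf = getConf();
        MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/clusterdb.fish");
        MongoConfigUtil.setCreateInputSplits(conf, false);
        Job job = Job.getInstance(conf, "Mongo Import");
        job.setJarByClass(ImportWeblogsFromMongoTool.class);
        job.setMapperClass(ImportWeblogsFromMongo.ReadWeblogsFromMongo.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(MongoInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setNumReduceTasks(0);
        FileOutputFormat.setOutputPath(job, new Path("/home/mongo_import"));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips -libjars & co. before handing the rest to run()
        System.exit(ToolRunner.run(new Configuration(), new ImportWeblogsFromMongoTool(), args));
    }
}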

Answer


I did not solve it that way, but I found a workaround using mongodump and copying the files to HDFS. The lines below may help someone else get the job done:

    mongodump --db clusterdb --collection CollectionName
    bsondump file.bson > file.json
    hadoop dfs -copyFromLocal /path/to/file/fish.json mongo
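
For anyone following this recipe end to end: mongodump writes its BSON output under dump/<db>/<collection>.bson by default, and hdfs dfs is the current spelling of the deprecated hadoop dfs command. A sketch of the full pipeline with the question's clusterdb.fish collection, assuming the mongo target directory already exists in the HDFS home directory:

    mongodump --db clusterdb --collection fish        # writes dump/clusterdb/fish.bson
    bsondump dump/clusterdb/fish.bson > fish.json     # BSON -> one JSON document per line
    hdfs dfs -copyFromLocal fish.json mongo           # upload into the mongo dir on HDFS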