
Hadoop C++ HDFS test running exception

I am working with Hadoop 2.2.0 and trying to run this hdfs_test.cpp application:

#include "hdfs.h" 

int main(int argc, char **argv) { 

    hdfsFS fs = hdfsConnect("default", 0); 
    const char* writePath = "/tmp/testfile.txt"; 
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0); 
    if(!writeFile) { 
      fprintf(stderr, "Failed to open %s for writing!\n", writePath); 
      exit(-1); 
    } 
    char* buffer = "Hello, World!"; 
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer)+1); 
    if (hdfsFlush(fs, writeFile)) { 
      fprintf(stderr, "Failed to 'flush' %s\n", writePath); 
      exit(-1); 
    } 
    hdfsCloseFile(fs, writeFile); 
} 
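I build it roughly like this (the library locations below are my guesses for a stock Hadoop 2.2.0 tarball and a 64-bit JDK; adjust them to your setup):

# link against libhdfs and the JVM; at runtime both directories
# also need to be on LD_LIBRARY_PATH
g++ hdfs_test.cpp -o hdfs_test \
    -I$HADOOP_HOME/include \
    -L$HADOOP_HOME/lib/native -lhdfs \
    -L$JAVA_HOME/jre/lib/amd64/server -ljvm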

It compiles, but when I run it with ./hdfs_test I get this:

loadFileSystems error: 
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.) 
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error: 
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.) 
hdfsOpenFile(/tmp/testfile.txt): constructNewObjectOfPath error: 
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.) 
Failed to open /tmp/testfile.txt for writing! 

Maybe it is a classpath problem. My $HADOOP_HOME is /usr/local/hadoop, and this is my actual *CLASSPATH* variable:

echo $CLASSPATH 
/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/*:/usr/local/hadoop/share/hadoop/common/*:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/*:/usr/local/hadoop/share/hadoop/hdfs/*:/usr/local/hadoop/share/hadoop/yarn/lib/*:/usr/local/hadoop/share/hadoop/yarn/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar 

Any help is appreciated. Thanks.

Answers


I have faced problems with wildcards in the classpath when using JNI-based programs. Try the direct jar-in-classpath approach instead, such as the one generated in my sample script at https://github.com/QwertyManiac/cdh4-libhdfs-example/blob/master/exec.sh#L3, and I believe it should work. The whole contained example at https://github.com/QwertyManiac/cdh4-libhdfs-example does currently work. Also see:

https://stackoverflow.com/a/9322747/1660002
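In case that link goes stale, the gist of the script is roughly the following (a sketch, not the exact contents; the paths assume the /usr/local/hadoop layout from your question):

# Build a classpath of explicit jar files instead of wildcard entries,
# since the JVM embedded through JNI does not expand '*' patterns.
export CLASSPATH=/usr/local/hadoop/etc/hadoop:$(find /usr/local/hadoop/share/hadoop -name '*.jar' | tr '\n' ':')

./hdfs_test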


As a result, I had to solve the same problem by adding the jars to the CLASSPATH variable through the following workaround: `export CLASSPATH=$(for p in $(hadoop classpath --glob | sed 's/:/ /g'); do find $p -name '*.jar' 2>/dev/null; done | tr '\n' ':')`


Try this:

hadoop classpath --glob 

Then add the result to the CLASSPATH variable in ~/.bashrc.
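For example, a single line like this in ~/.bashrc should do it (a minimal sketch, assuming a bash shell and that the hadoop command is on your PATH):

# expand the Hadoop classpath wildcards into concrete jar paths so the
# JVM started by libhdfs through JNI can actually find the classes
export CLASSPATH=$(hadoop classpath --glob)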
