2014-05-19

I followed this tutorial to build Apache Hadoop in a Windows 7 environment. Long story short: I can compile Hadoop with mvn compile and build the package with mvn package -DskipTests on Windows 7, but mvn package -Pdist,native-win -DskipTests -Dtar fails with I/O exceptions that I cannot resolve. The same happens without the -Dtar argument.

Can anyone help me resolve these exceptions?

[INFO] Executing tasks 
main: 
     [get] Destination already exists (skipping): C:\hadoop\hadoop-hdfs-project\hadoop-hdfs-httpfs\downloads\tomcat.tar.gz 
    [mkdir] Created dir: C:\hadoop\hadoop-hdfs-project\hadoop-hdfs-httpfs\target\tomcat.exp 
[exec] tar (child): C\:hadoophadoop-hdfs-projecthadoop-hdfs-httpfs/downloads/tomcat.tar.gz: Cannot open: I/O error 
[exec] tar (child): Error is not recoverable: exiting now 
[exec] 
[exec] gzip: stdin: unexpected end of file 
[exec] tar: Child returned status 2 
[exec] tar: Error exit delayed from previous errors 
[INFO] ------------------------------------------------------------------------ 
[INFO] Reactor Summary: 
[INFO] 
[INFO] Apache Hadoop Main ................................ SUCCESS [ 1.018 s] 
[INFO] Apache Hadoop Project POM ......................... SUCCESS [ 1.653 s] 
[INFO] Apache Hadoop Annotations ......................... SUCCESS [ 2.181 s] 
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [ 0.200 s] 
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [ 2.889 s] 
[INFO] Apache Hadoop Auth ................................ SUCCESS [ 1.957 s] 
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [ 1.570 s] 
[INFO] Apache Hadoop Common .............................. SUCCESS [ 50.085 s] 
[INFO] Apache Hadoop Common Project ...................... SUCCESS [ 0.090 s] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [ 35.510 s] 
[INFO] Apache Hadoop HttpFS .............................. FAILURE [ 5.155 s] 
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED 
[INFO] hadoop-yarn ....................................... SKIPPED 
[INFO] hadoop-yarn-api ................................... SKIPPED 
[INFO] hadoop-yarn-common ................................ SKIPPED 
[INFO] hadoop-yarn-server ................................ SKIPPED 
[INFO] hadoop-yarn-server-common ......................... SKIPPED 
[INFO] hadoop-yarn-server-nodemanager .................... SKIPPED 
[INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED 
[INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED 
[INFO] hadoop-yarn-server-tests .......................... SKIPPED 
[INFO] hadoop-yarn-client ................................ SKIPPED 
[INFO] hadoop-mapreduce-client ........................... SKIPPED 
[INFO] hadoop-mapreduce-client-core ...................... SKIPPED 
[INFO] hadoop-yarn-applications .......................... SKIPPED 
[INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED 
[INFO] hadoop-yarn-site .................................. SKIPPED 
[INFO] hadoop-yarn-project ............................... SKIPPED 
[INFO] hadoop-mapreduce-client-common .................... SKIPPED 
[INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED 
[INFO] hadoop-mapreduce-client-app ....................... SKIPPED 
[INFO] hadoop-mapreduce-client-hs ........................ SKIPPED 
[INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED 
[INFO] hadoop-mapreduce-client-hs-plugins ................ SKIPPED 
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED 
[INFO] hadoop-mapreduce .................................. SKIPPED 
[INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED 
[INFO] Apache Hadoop Distributed Copy .................... SKIPPED 
[INFO] Apache Hadoop Archives ............................ SKIPPED 
[INFO] Apache Hadoop Rumen ............................... SKIPPED 
[INFO] Apache Hadoop Gridmix ............................. SKIPPED 
[INFO] Apache Hadoop Data Join ........................... SKIPPED 
[INFO] Apache Hadoop Extras .............................. SKIPPED  
[INFO] Apache Hadoop Pipes ............................... SKIPPED 
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED 
[INFO] Apache Hadoop Tools ............................... SKIPPED 
[INFO] Apache Hadoop Distribution ........................ SKIPPED 
[INFO] Apache Hadoop Client .............................. SKIPPED 
[INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED 
[INFO] ------------------------------------------------------------------------ 
[INFO] BUILD FAILURE 
[INFO] ------------------------------------------------------------------------ 
[INFO] Total time: 01:43 min 
[INFO] Finished at: 2014-05-19T11:24:25+00:00 
[INFO] Final Memory: 49M/179M 
[INFO] ------------------------------------------------------------------------ 
[WARNING] The requested profile "native-win" could not be activated because it does not exist. 
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (dist) on project hadoop-hdfs-httpfs: An Ant BuildException has occured: exec returned: 2 -> [Help 1] 

[ERROR] 

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. 

[ERROR] Re-run Maven using the -X switch to enable full debug logging. 

[ERROR] 

[ERROR] For more information about the errors and possible solutions, please read the 
following articles: 

[ERROR] [Help 1]  http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException 

[ERROR] 

[ERROR] After correcting the problems, you can resume the build with the command 
[ERROR] mvn <goals> -rf :hadoop-hdfs-httpfs 
c:\hadoop> 

Answer

If you are using a later version of Hadoop (i.e. Hadoop 2.6, 2.7 or 2.8), then there is no need to build hadoop-src to get a Windows-native Hadoop. Here is a GitHub link that has winutils for the latest versions of Hadoop.

I also ran into similar problems while building hadoop-src with Maven, and these steps worked for me.

Download & install Java in c:/java/

(Make sure the path is like this; if Java is installed in Program Files, then hadoop-env.cmd will not recognize the Java path.)

Download the Hadoop binary distribution.

(I used the Hadoop-2.8.1 binary distribution.)

Set the environment variables:

JAVA_HOME = "c:/java" 
HADOOP_HOME = "<your hadoop home>" 
Path = "%JAVA_HOME%/bin" 
Path = "%HADOOP_HOME%/bin" 
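As a concrete sketch, the variables above can be set persistently from a cmd prompt. The install paths below are assumptions for illustration, not values from the tutorial; adjust them to your machine:

```shell
:: Hypothetical example paths - adjust to where you installed Java and Hadoop.
setx JAVA_HOME "C:\java"
setx HADOOP_HOME "C:\hadoop-2.8.1"
:: Append both bin directories to the PATH so the hadoop/hdfs commands resolve.
setx PATH "%PATH%;%JAVA_HOME%\bin;%HADOOP_HOME%\bin"
```

Note that setx only affects newly opened console windows, so open a fresh cmd prompt before continuing.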

Hadoop can work on Windows if hadoop-src is built with Maven on your Windows machine. Building hadoop-src (the distribution) will create a Hadoop binary distribution that works as a Windows-native version.

But if you don't want to do that, then download the pre-built winutils of the Hadoop distribution. Here is a GitHub link that has winutils for some versions of Hadoop.

(If the version you are using is not in the list, follow the traditional method of building Hadoop on Windows - link.)

If you found your version, then copy and paste all the contents of the folder into the path: /bin/

Set all the .xml configuration files - Link - and set the JAVA_HOME path in the hadoop-env.cmd file.
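For reference, here is a minimal single-node sketch of the two most commonly edited files, core-site.xml and hdfs-site.xml. The port and replication value below are common defaults for a local setup, not values taken from the tutorial:

```xml
<!-- core-site.xml: point the default filesystem at a local NameNode -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml: single-node setup, so a replication factor of 1 -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```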

From CMD run:

<HADOOP_HOME>/bin/> hdfs namenode -format 
<HADOOP_HOME>/sbin> start-all.cmd 
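Once start-all.cmd has run, one way to sanity-check the setup (assuming the JDK's jps tool is on your PATH) is:

```shell
:: List running JVMs; you should see NameNode, DataNode, ResourceManager, NodeManager
jps
:: Trivial HDFS operations as a smoke test against the running cluster
hdfs dfs -mkdir /tmp
hdfs dfs -ls /
```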

Hope this helps.
