Building Spark 1.2 with Maven fails with errors in package com.google.common
2015-04-04 14 views
2

CentOS 6.2
Hadoop 2.6.0
Scala 2.10.5

Java version "1.7.0_75"
OpenJDK Runtime Environment (rhel-2.5.4.0.el6_6-x86_64 u75-b13)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)

mvn version:
Apache Maven 3.3.1 (cab6659f9874fa96462afef40fcf6bc033d58c1c; 2015-03-13T21:10:27+01:00)
Maven home: /opt/maven
Java version: 1.7.0_75, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-220.el6.x86_64", arch: "amd64", family: "unix"

Environment variables:

export SCALA_HOME=/opt/scala 
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64 
export JRE_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre 
export HADOOP_HOME=/home/tom/hadoop 
export SPARK_HOME=/home/tom/spark 
export HADOOP_INSTALL=$HADOOP_HOME 
export HADOOP_MAPRED_HOME=$HADOOP_HOME 
export HADOOP_COMMON_HOME=$HADOOP_HOME 
export HADOOP_HDFS_HOME=$HADOOP_HOME 
export YARN_HOME=$HADOOP_HOME 
export HADOOP_YARN_HOME=$HADOOP_HOME 
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib 
export MAVEN_HOME=/opt/maven 
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/bin:$SPARK_HOME/bin:$MAVEN_HOME/bin:$SCALA_HOME/bin 

export SPARK_EXAMPLES_JAR=$SPARK_HOME/spark-0.7.2/examples/target/scala-2.9.3/spark-examples_2.9.3-0.7.2.jar 
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop 
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop 
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/" 
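A quick sanity check that the hard-coded paths in the exports above actually exist can rule out one class of build problems early. This is only a sketch reusing two of the paths from the question; substitute your own:

```shell
# Verify that directories referenced by the exports exist (paths are the
# asker's; adjust for your machine):
JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64
MAVEN_HOME=/opt/maven
for v in JAVA_HOME MAVEN_HOME; do
  eval "d=\$$v"              # look up the value of the variable named in $v
  if [ -d "$d" ]; then
    echo "$v ok: $d"
  else
    echo "$v MISSING: $d"
  fi
done
```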

Build command

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-0.12.0 -Phive-thriftserver -DskipTests clean package
Error message

[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:22: object Throwables is not a member of package com.google.common.base 
[ERROR] import com.google.common.base.Throwables 
[ERROR]  ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:59: not found: value Throwables 
[ERROR]   Throwables.getRootCause(e) match { 
[ERROR]   ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:26: object util is not a member of package com.google.common 
[ERROR] import com.google.common.util.concurrent.ThreadFactoryBuilder 
[ERROR]      ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:69: not found: type ThreadFactoryBuilder 
[ERROR]  Executors.newCachedThreadPool(new ThreadFactoryBuilder().setDaemon(true). 
[ERROR]          ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:76: not found: type ThreadFactoryBuilder 
[ERROR]  new ThreadFactoryBuilder().setDaemon(true).setNameFormat("Flume Receiver Thread - %d").build()) 
[ERROR]  ^
[ERROR] 5 errors found 


[INFO] ------------------------------------------------------------------------ 
[INFO] Reactor Summary: 
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [ 10.121 s] 
[INFO] Spark Project Networking ........................... SUCCESS [ 14.957 s] 
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 10.858 s] 
[INFO] Spark Project Core ................................. SUCCESS [07:33 min] 
[INFO] Spark Project Bagel ................................ SUCCESS [ 52.312 s] 
[INFO] Spark Project GraphX ............................... SUCCESS [02:19 min] 
[INFO] Spark Project Streaming ............................ SUCCESS [03:28 min] 
[INFO] Spark Project Catalyst ............................. SUCCESS [03:18 min] 
[INFO] Spark Project SQL .................................. SUCCESS [03:48 min] 
[INFO] Spark Project ML Library ........................... SUCCESS [03:40 min] 
[INFO] Spark Project Tools ................................ SUCCESS [ 29.380 s] 
[INFO] Spark Project Hive ................................. SUCCESS [02:53 min] 
[INFO] Spark Project REPL ................................. SUCCESS [01:32 min] 
[INFO] Spark Project YARN Parent POM ...................... SUCCESS [ 5.124 s] 
[INFO] Spark Project YARN Stable API ...................... SUCCESS [01:34 min] 
[INFO] Spark Project Hive Thrift Server ................... SUCCESS [ 56.404 s] 
[INFO] Spark Project Assembly ............................. SUCCESS [01:11 min] 
[INFO] Spark Project External Twitter ..................... SUCCESS [ 36.661 s] 
[INFO] Spark Project External Flume Sink .................. SUCCESS [ 50.006 s] 
[INFO] Spark Project External Flume ....................... FAILURE [ 14.287 s] 
[INFO] Spark Project External MQTT ........................ SKIPPED 
[INFO] Spark Project External ZeroMQ ...................... SKIPPED 
[INFO] Spark Project External Kafka ....................... SKIPPED 
[INFO] Spark Project Examples ............................. SKIPPED 
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED 
[INFO] ------------------------------------------------------------------------ 
[INFO] BUILD FAILURE 
[INFO] ------------------------------------------------------------------------ 
[INFO] Total time: 36:02 min 
[INFO] Finished at: 2015-04-04T03:58:19+02:00 
[INFO] Final Memory: 60M/330M 
[INFO] ------------------------------------------------------------------------ 
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-streaming-flume_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1] 
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. 
[ERROR] Re-run Maven using the -X switch to enable full debug logging. 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles: 




I suspect this is some dependency problem, but I cannot figure it out. Can anyone help me?

Answers

0

If you are fine with cheating, you can skip the modules that fail to compile, i.e. the

spark-streaming-flume_2.10 and spark-streaming-kafka_2.10 modules.

The following command was used to compile the Spark package, with Hive support for Spark SQL, against CDH 5.3.3 and Spark 1.2.0:

mvn -Pyarn -Dhadoop.version=2.5.0-cdh5.3.3 -DskipTests -Phive -Phive-thriftserver -pl '!org.apache.spark:spark-streaming-flume_2.10,!org.apache.spark:spark-streaming-kafka_2.10' package

0

I had a similar problem today. The Spark Project External Flume ....................... FAILURE log entry is annoying, but I think it was git clean -xdf that helped. If that is not enough, try git clean -Xdf. Then run mvn ... again. Good luck!
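For anyone unsure what those two flags actually delete, here is a throwaway demonstration in a scratch repository (the file and directory names are made up for illustration):

```shell
# Throwaway repo to show the difference between git clean -X and -x:
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
echo 'target/' > .gitignore
git add .gitignore
git -c user.email=demo@example.com -c user.name=demo commit -qm init
mkdir -p target && echo stale > target/stale.class   # ignored build output
echo scratch > untracked.txt                         # plain untracked file
git clean -Xdf >/dev/null          # -X: remove only ignored files/dirs
test ! -e target      && echo "ignored target/ removed"
test -e untracked.txt && echo "untracked file survives -X"
git clean -xdf >/dev/null          # -x: remove ignored AND untracked
test ! -e untracked.txt && echo "untracked file removed by -x"
```

So -xdf is the more aggressive of the two: it wipes everything Git is not tracking, including ignored build output, which is why it resets a broken build tree.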

mvn -e -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests clean package 

The version of Apache Maven seems to play a role here:

1

I faced the same problem while building Apache Spark 1.2.1. In the failing case, the Maven version was:

./mvn -version

Apache Maven **3.3.3** (7994120775791599e205a5524ec3e0dfe41d4a06; 2015-04-22T07:57:37-04:00) 
Maven home: /opt/Spark/amit/apache-maven/apache-maven-3.3.3 
Java version: 1.8.0, vendor: IBM Corporation 
Java home: /opt/Spark/amit/ibmjava8sdk/sdk/jre 
Default locale: en_US, platform encoding: UTF-8 
OS name: "linux", version: "3.14.8-200.fc20.x86_64", arch: "amd64", family: "unix" 

When I tried with an older Maven, the build succeeded. Using Apache Maven 3.2.x seems to resolve the problem. I used:

mvn -version

Apache Maven **3.2.5** (12a6b3acb947671f09b81f49094c53f426d8cea1; 2014-12-14T12:29:23-05:00) 
Maven home: /opt/Spark/amit/apache-maven/apache-maven-3.2.5 
Java version: 1.8.0, vendor: IBM Corporation 
Java home: /opt/Spark/amit/ibmjava8sdk/sdk/jre 
Default locale: en_US, platform encoding: UTF-8 
OS name: "linux", version: "3.14.8-200.fc20.x86_64", arch: "amd64", family: "unix" 
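Based on this report, a small guard in a build script can warn before a 3.3.x Maven is used for this build. The version-parsing below is a hedged sketch, not an official check; the hard-coded "3.3.1" stands in for the output of mvn -version:

```shell
# Warn when the active Maven is a 3.3.x release, which this answer
# reports as failing for Spark 1.2.x builds.
ver="3.3.1"   # in practice: ver=$(mvn -version | awk 'NR==1{print $3}')
case "$ver" in
  3.3.*) echo "Maven $ver: consider downgrading to 3.2.x for this build" ;;
  *)     echo "Maven $ver looks fine" ;;
esac
```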

Hope this helps.

Thanks, Amit