
spark-submit "Service 'Driver' could not bind on port" error

I am using the following command to run the Spark Java word count example:

time spark-submit --deploy-mode cluster --master spark://192.168.0.7:6066 --class org.apache.spark.examples.JavaWordCount /home/pi/Desktop/example/new/target/javaword.jar /books_50.txt 

When I run it, the output is as follows:

Running Spark using the REST application submission protocol. 
16/07/18 03:55:41 INFO rest.RestSubmissionClient: Submitting a request to launch an application in spark://192.168.0.7:6066. 
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Submission successfully created as driver-20160718035543-0000. Polling submission state... 
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Submitting a request for the status of submission driver-20160718035543-0000 in spark://192.168.0.7:6066. 
16/07/18 03:55:44 INFO rest.RestSubmissionClient: State of driver driver-20160718035543-0000 is now RUNNING. 
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Driver is running on worker worker-20160718041005-192.168.0.12-42405 at 192.168.0.12:42405. 
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Server responded with CreateSubmissionResponse: 
{ 
    "action" : "CreateSubmissionResponse", 
    "message" : "Driver successfully submitted as driver-20160718035543-0000", 
    "serverSparkVersion" : "1.6.2", 
    "submissionId" : "driver-20160718035543-0000", 
    "success" : true 
} 

I checked the log of that specific worker (192.168.0.12), and it says:

Launch Command: "/usr/lib/jvm/jdk-8-oracle-arm32-vfp-hflt/jre/bin/java" "-cp" "/opt/spark/conf/:/opt/spark/lib/spark-assembly-1.6.2-hadoop2.6.0.jar:/opt/spark/lib/datanucleus-api-jdo-3.2.6.jar:/opt/spark/lib/datanucleus-core-3.2.10.jar:/opt/spark/lib/datanucleus-rdbms-3.2.9.jar" "-Xms1024M" "-Xmx1024M" "-Dspark.driver.supervise=false" "-Dspark.app.name=org.apache.spark.examples.JavaWordCount" "-Dspark.submit.deployMode=cluster" "-Dspark.jars=file:/home/pi/Desktop/example/new/target/javaword.jar" "-Dspark.master=spark://192.168.0.7:7077" "-Dspark.executor.memory=10M" "org.apache.spark.deploy.worker.DriverWrapper" "spark://[email protected]:42405" "/opt/spark/work/driver-20160718035543-0000/javaword.jar" "org.apache.spark.examples.JavaWordCount" "/books_50.txt" 
======================================== 

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory). 
log4j:WARN Please initialize the log4j system properly. 
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
16/07/18 04:10:58 INFO SecurityManager: Changing view acls to: pi 
16/07/18 04:10:58 INFO SecurityManager: Changing modify acls to: pi 
16/07/18 04:10:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(pi); users with modify permissions: Set(pi) 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1. 
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'Driver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'Driver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries. 
    at sun.nio.ch.Net.bind0(Native Method) 
    at sun.nio.ch.Net.bind(Net.java:433) 
    at sun.nio.ch.Net.bind(Net.java:425) 
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) 
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) 
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125) 
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485) 
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089) 
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430) 
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415) 
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903) 
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198) 
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348) 
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357) 
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357) 
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) 
    at java.lang.Thread.run(Thread.java:745) 

My spark-env.sh file (master) contains:

export SPARK_MASTER_WEBUI_PORT="8080" 
export SPARK_MASTER_IP="192.168.0.7" 
export SPARK_EXECUTOR_MEMORY="10M" 

My spark-env.sh file (worker) contains:

export SPARK_WORKER_WEBUI_PORT="8080" 
export SPARK_MASTER_IP="192.168.0.7" 
export SPARK_EXECUTOR_MEMORY="10M" 

Please help...!


Were you ever able to resolve this? I have exactly the same problem with Spark v2.0.0. – yee379


Hi, I couldn't find any lead on this particular problem, so I moved on to the Python streaming word count example instead. Please let me know if you find a workaround. – itsamineral

Answers


You need to enter your hostname in the /etc/hosts file. Something like:

127.0.0.1 localhost "hostname" 
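For illustration, a sketch of what that entry might look like, assuming the machine's hostname is raspberrypi (a hypothetical name; substitute the output of the hostname command, without the quotes):

# /etc/hosts (sketch, not the asker's actual file)
127.0.0.1    localhost raspberrypi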

Thanks for the suggestion, Manav. I tried adding/removing/editing that line in the hosts file, but it doesn't change anything. – itsamineral


I'm fairly sure it has to do with your network setup, not your Spark setup. I got the same error and was able to resolve it by adding an entry to the hosts file. To be sure: you need to replace "hostname" in the last command with your actual hostname (run the hostname command in a shell), and without the quotes. –


This was the root cause. – okwap


I had the same problem when trying to run the shell, and was able to get it working by setting the SPARK_LOCAL_IP environment variable. You can assign it on the command line when running the shell:

SPARK_LOCAL_IP=127.0.0.1 ./bin/spark-shell

For a more permanent solution, create a file called spark-env.sh in the conf directory under your Spark root and add the following line:

SPARK_LOCAL_IP=127.0.0.1

Give the script execute permission with chmod +x ./conf/spark-env.sh, and it will set this environment variable by default.
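Note that in the asker's cluster-mode run the driver is launched on the worker itself (see the DriverWrapper launch command above), so the variable would need to go into the worker's spark-env.sh and point at an address that machine actually owns. A sketch, using the worker address 192.168.0.12 from the logs:

# conf/spark-env.sh on the worker (a sketch; substitute your worker's own address)
export SPARK_LOCAL_IP="192.168.0.12"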


Doing an export of SPARK_LOCAL_IP before launching the script works too; it's a quick way to test whether this helps. –


I use Maven/SBT to manage dependencies, and the Spark core is included in a jar file.

You can override SPARK_LOCAL_IP at runtime by setting "spark.driver.bindAddress" (here in Scala):

import org.apache.spark.{SparkConf, SparkContext}

// Bind the driver to the loopback address explicitly,
// overriding whatever SPARK_LOCAL_IP would resolve to.
val config = new SparkConf()
config.setMaster("local[*]")
config.setAppName("Test App")
config.set("spark.driver.bindAddress", "127.0.0.1")
val sc = new SparkContext(config)
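The same property can also be passed at submit time without touching the code. A sketch mirroring the asker's command (this assumes a Spark version new enough to support spark.driver.bindAddress):

spark-submit --conf spark.driver.bindAddress=127.0.0.1 --class org.apache.spark.examples.JavaWordCount /home/pi/Desktop/example/new/target/javaword.jar /books_50.txt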