2016-09-03

I am trying to follow the example for submitting a Python application with spark-submit from the Apache Spark documentation site: https://spark.apache.org/docs/2.0.0-preview/submitting-applications.html

I started a Spark standalone cluster and want to run the example Python application. From my spark-2.0.0-bin-hadoop2.7 directory I run the following command:

./bin/spark-submit \ 
--master spark://207.184.161.138:7077 \ 
examples/src/main/python/pi.py \ 
1000 

However, I get this error:

jupyter: '/Users/MyName/spark-2.0.0-bin-hadoop2.7/examples/src/main/python/pi.py' is not a Jupyter command 

This is what my .bash_profile looks like:

#setting path for Spark 
export SPARK_PATH=~/spark-2.0.0-bin-hadoop2.7 
export PYSPARK_DRIVER_PYTHON="jupyter" 
export PYSPARK_DRIVER_PYTHON_OPTS="notebook" 
alias snotebook='$SPARK_PATH/bin/pyspark --master local[2]' 

What am I doing wrong?

Unset `PYSPARK_DRIVER_PYTHON` and `PYSPARK_DRIVER_PYTHON_OPTS` before submitting. – zero323
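Following that suggestion, a minimal sketch (the master URL and example path are the ones from the question; adjust them to your setup):

```shell
# Clear the Jupyter driver settings so spark-submit runs the script
# with a plain Python driver instead of handing it to jupyter.
unset PYSPARK_DRIVER_PYTHON
unset PYSPARK_DRIVER_PYTHON_OPTS

# Guarded so the sketch is safe to run anywhere spark-submit may be missing.
if command -v spark-submit >/dev/null 2>&1; then
  spark-submit \
    --master spark://207.184.161.138:7077 \
    examples/src/main/python/pi.py \
    1000
fi
```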

Answer


Move `PYSPARK_DRIVER_PYTHON=jupyter` and `PYSPARK_DRIVER_PYTHON_OPTS=notebook` out of your .bash_profile and into the alias, so they don't interfere with spark-submit:

alias snotebook='PYSPARK_DRIVER_PYTHON=jupyter PYSPARK_DRIVER_PYTHON_OPTS=notebook $SPARK_PATH/bin/pyspark --master local[2]' 

Alternatively, add `PYSPARK_DRIVER_PYTHON=ipython` in front of the spark-submit command. Example:

PYSPARK_DRIVER_PYTHON=ipython ./bin/spark-submit \ 
/home/SimpleApp.py 
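This works because a `VAR=value` prefix scopes the assignment to that single command; it never leaks into the surrounding shell, so the Jupyter setup for pyspark and a plain spark-submit stay independent. A small sketch demonstrating the scoping (using the same variable name as above):

```shell
unset PYSPARK_DRIVER_PYTHON   # start from a clean shell

# The prefix assignment is visible only to the prefixed command:
PYSPARK_DRIVER_PYTHON=ipython sh -c 'echo "inside: $PYSPARK_DRIVER_PYTHON"'   # prints "inside: ipython"

# The surrounding shell never sees it:
echo "after: ${PYSPARK_DRIVER_PYTHON:-unset}"   # prints "after: unset"
```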
That was helpful, thanks. – lakshmi