2015-10-20

Getting Spark 1.5 to run locally on a Mac

I downloaded Spark prebuilt for Hadoop 2.4, and I get the following error when I try to start a SparkContext in Python:

ClassNotFoundException: org.apache.spark.launcher.Main 

Here is the code, which I believe should be correct:

import sys, os

# Point SPARK_HOME at the unpacked prebuilt distribution
os.environ['SPARK_HOME'] = '/spark-1.5.1-bin-hadoop2.4/'
# Make the bundled PySpark package importable
sys.path.insert(0, '/spark-1.5.1-bin-hadoop2.4/python/')
os.environ['PYTHONPATH'] = '/usr/local/Cellar/python/2.7.9/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/'

import pyspark
from pyspark import SparkContext

# Start a local context with 2 worker threads
sc = SparkContext('local[2]')
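A `ClassNotFoundException: org.apache.spark.launcher.Main` at startup usually means `SPARK_HOME` does not point at a complete Spark install, so the launcher jar cannot be found. A minimal sketch of a safer setup, assuming the tarball was unpacked in the home directory (the path is hypothetical; adjust it to the real install location). It builds the paths with `os.path.join` and locates the bundled py4j zip by glob rather than hard-coding its version:

```python
import glob
import os
import sys

# Hypothetical install location; change this to wherever
# spark-1.5.1-bin-hadoop2.4 was actually unpacked.
spark_home = os.path.expanduser('~/spark-1.5.1-bin-hadoop2.4')

# The launcher jar lives under SPARK_HOME, so this must be the real
# install root, not a partial copy.
os.environ['SPARK_HOME'] = spark_home

# Prepend the bundled PySpark package (and the py4j zip it ships with)
# to sys.path instead of overwriting PYTHONPATH wholesale.
sys.path.insert(0, os.path.join(spark_home, 'python'))
for py4j_zip in glob.glob(os.path.join(spark_home, 'python', 'lib', 'py4j-*.zip')):
    sys.path.insert(0, py4j_zip)
```

With the paths set this way, `from pyspark import SparkContext` should resolve against the distribution's own copy of PySpark, and `SparkContext('local[2]')` can find the launcher class.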

Answer