2017-03-22

Can't run pyspark after installing Spark on a Mac. I recently installed Spark on my Mac using the following command:

brew install apache-spark 

Now when I try to run pyspark, it shows the following error:

pyspark

Python 3.6.0 |Anaconda custom (x86_64)| (default, Dec 23 2016, 13:19:00)
[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.57)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/shell.py", line 30, in <module>
    import pyspark
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/__init__.py", line 44, in <module>
    from pyspark.context import SparkContext
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/context.py", line 36, in <module>
    from pyspark.java_gateway import launch_gateway
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/java_gateway.py", line 31, in <module>
    from py4j.java_gateway import java_import, JavaGateway, GatewayClient
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 950, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 646, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 616, in _load_backward_compatible
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 18, in <module>
  File "/Users/hellmaster/anaconda/lib/python3.6/pydoc.py", line 62, in <module>
    import pkgutil
  File "/Users/hellmaster/anaconda/lib/python3.6/pkgutil.py", line 22, in <module>
    ModuleInfo = namedtuple('ModuleInfo', 'module_finder name ispkg')
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/serializers.py", line 393, in namedtuple
    cls = _old_namedtuple(*args, **kwargs)
TypeError: namedtuple() missing 3 required keyword-only arguments: 'verbose', 'rename', and 'module'

How can I fix this?

Answer


This happens because Spark 2.1.0 is not compatible with Python 3.6. See this question. Common fixes are to run pyspark under Python 3.5 (for example in a separate conda environment, with the PYSPARK_PYTHON environment variable pointing at that interpreter), or to upgrade to a later Spark release (2.1.1 or newer) that added Python 3.6 support.
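To see why the traceback ends where it does: Spark's serializers.py replaces collections.namedtuple with a patched copy (the `_old_namedtuple` call in the traceback), and in Python 3.6 the optional namedtuple parameters (`verbose`, `rename`, `module`) became keyword-only, which the patched copy shipped with Spark 2.1.0 does not handle. A minimal sketch of the signature change itself, independent of Spark:

```python
from collections import namedtuple

# The plain two-argument call that pkgutil.py makes works on any version:
ModuleInfo = namedtuple('ModuleInfo', 'module_finder name ispkg')
print(ModuleInfo(None, 'demo', False).name)  # demo

# Since Python 3.6 the optional parameters are keyword-only, so code that
# forwards them positionally (as older monkey-patches did) gets a TypeError:
try:
    namedtuple('ModuleInfo', 'module_finder name ispkg', False, False)
except TypeError as exc:
    print('TypeError:', exc)
```

Spark's patched wrapper hits this same mismatch the moment anything (here, `pkgutil` via `pydoc`) calls `namedtuple`, which is why a perfectly ordinary standard-library import blows up.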