2014-11-02

I have installed Scala, sbt, and Hadoop 1.0.3 on Ubuntu 12.04 as the client OS. Following the guide at http://docs.sigmoidanalytics.com/index.php/How_to_Install_Spark_on_Ubuntu-12.04, I tried to build Spark and hit an error about reserving heap space (a memory problem while building Spark).

Here is the command I tried to run:

[email protected]:/usr/local/spark-1.1.0$ SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly 

It fails with the following error:

Using /usr/lib/jvm/java-6-openjdk-i386/ as default JAVA_HOME. 
Note, this will be overridden by -java-home if it is set. 
Error occurred during initialization of VM 
Could not reserve enough space for object heap 
Could not create the Java virtual machine. 

Answer


I solved this by passing a memory property to the sbt command, as given below (on a system with 4 GB of RAM):

SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly -mem 1024 
Thanks. What is the exact command you passed in the terminal? – 2014-12-31 14:47:32

You can pass the size of the available RAM after the -mem parameter. (That is the exact command.) – 2014-12-31 15:06:19
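The fix above works because sbt's launcher script translates `-mem N` into explicit JVM heap flags (roughly `-Xms`/`-Xmx` of N megabytes), so the JVM never tries to reserve more heap than the machine can grant. A minimal sketch for picking a value from the machine's RAM, assuming a Linux system with `free` available; the half-of-RAM heuristic and the 1024 MB cap are illustrative choices, not part of the original answer, which simply used `-mem 1024` on a 4 GB machine:

```shell
#!/bin/sh
# Pick an sbt heap size: half of total RAM, capped at 1024 MB (heuristic).
total_mb=$(free -m | awk '/^Mem:/ {print $2}')
heap_mb=$(( total_mb / 2 ))
if [ "$heap_mb" -gt 1024 ]; then heap_mb=1024; fi

# Print the build command with the chosen heap size.
echo "SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly -mem $heap_mb"
```

On the 4 GB machine from the answer this yields `-mem 1024`, matching the command that resolved the error.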