2016-06-18

Out of memory error when building Spark

I am building Spark with sbt. When I run the following command:

sbt/sbt assembly 

Building Spark takes some time. Several warnings appear, and at the end I get the following error:

[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space 
[error] Use 'last' for the full log. 

When I check the sbt version with the command `sbt sbtVersion`, I get the following output:

[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`). 
[warn] There may be incompatibilities among your library dependencies. 
[warn] Here are some of the libraries that were evicted: 
[warn] * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2 
[warn] * com.typesafe.sbt:sbt-site:0.7.0 -> 0.7.1 
....... 
[info] streaming-zeromq/*:sbtVersion 
[info] 0.13.7 
[info] repl/*:sbtVersion 
[info] 0.13.7 
[info] spark/*:sbtVersion 
[info] 0.13.7 

When I run the command `./bin/spark-shell`, I get the following output:

ls: cannot access '/home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10': No such file or directory 
Failed to find Spark assembly in /home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10. 
You need to build Spark before running this program. 

What could be the solution?

Answer


You have to configure sbt's JVM heap size:

  • On Linux, type export SBT_OPTS="-Xmx2G"
  • On Windows, type set JAVA_OPTS=-Xmx2G
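Put together, a minimal Linux build session might look like this. The 2G value is a common starting point, not a Spark requirement; raise it if the build still runs out of memory.

```shell
# Give the sbt launcher's JVM a larger maximum heap.
# SBT_OPTS is read by the sbt launch script at startup.
export SBT_OPTS="-Xmx2G"

# Verify the variable is set as intended.
echo "$SBT_OPTS"

# Then rebuild Spark from the source root:
# sbt/sbt assembly
```

On Windows, `set JAVA_OPTS=-Xmx2G` in the same console session plays the equivalent role before invoking the build.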

More information:

http://www.scala-sbt.org/0.13.1/docs/Getting-Started/Setup.html

How to set heap size for sbt?


Worked, thank you very much! –


When I do this (on Windows) I get: ignoring option MaxPermSize=256m; support was removed in 8.0. Not sure what to do now? – cs0815


The 'Xmx' parameter sets the maximum heap size. 'PermSize' controls a different memory region. Try reading more about it here: http://stackoverflow.com/questions/22634644/java-hotspottm-64-bit-server-vm-warning-ignoring-option-maxpermsize and http://www.journaldev.com/4098/java-heap-space-vs-stack-memory. You can ignore this message; it is just a warning. – mgosk
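For context on that warning: starting with Java 8, the permanent generation was removed and class metadata moved to a native-memory area called Metaspace, so `-XX:MaxPermSize` is silently ignored. A sketch of a Java 8 equivalent (the 512M cap is an illustrative choice, not something this thread prescribes):

```shell
# Java 8+: PermGen no longer exists, so -XX:MaxPermSize is ignored.
# -Xmx still caps the heap; class metadata can be capped separately
# with -XX:MaxMetaspaceSize if memory must be bounded.
export SBT_OPTS="-Xmx2G -XX:MaxMetaspaceSize=512M"

echo "$SBT_OPTS"
```

By default Metaspace is unbounded, which is why the warning is safe to ignore on Java 8 even without setting a replacement flag.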