Unresolved dependency for Spark library

I followed this tutorial: http://blog.miz.space/tutorial/2016/08/30/how-to-integrate-spark-intellij-idea-and-scala-install-setup-ubuntu-windows-mac/

When I try to compile my project with IntelliJ, sbt complains about an unresolved dependency:
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core/2.1.1/spark-core-2.1.1.pom
[warn] unresolved dependency path: org.apache.spark:spark-core:2.1.1
My Scala version is 2.12.2 and sparkVersion is 2.1.1.

Here is what my build.sbt looks like:
name := "test"
version := "1.0"
scalaVersion := "2.12.2"
val sparkVersion = "2.1.1"
libraryDependencies ++= Seq("org.apache.spark" % "spark-core" % sparkVersion)
Thanks
You must use Scala 2.10.x or 2.11.x, since Scala 2.12 is not yet supported by Spark. – Nonontb
The Spark community is working on Scala 2.12 support. Please follow https://issues.apache.org/jira/browse/SPARK-14220 – hadooper
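Putting the comments' advice together, a sketch of a build.sbt that should resolve is below. It assumes a Scala 2.11 release (2.11.8 here is illustrative) and uses `%%`, which appends the Scala binary-version suffix so sbt looks up the artifact `spark-core_2.11` instead of the suffix-less `spark-core` that failed in the warning above:

```scala
name := "test"

version := "1.0"

// Spark 2.1.1 artifacts are published for Scala 2.10/2.11, not 2.12;
// 2.11.8 is an illustrative choice of a compatible version.
scalaVersion := "2.11.8"

val sparkVersion = "2.1.1"

libraryDependencies ++= Seq(
  // %% adds the Scala binary suffix, so this resolves spark-core_2.11
  "org.apache.spark" %% "spark-core" % sparkVersion
)
```

Note the difference from the original: a single `%` between the organization and artifact name makes sbt request the artifact name verbatim, which is why the warning shows it trying `org.apache.spark:spark-core:2.1.1` with no `_2.1x` suffix.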