sbt unresolved dependency for Spark Streaming Kafka integration

I want to use Spark Streaming together with the Kafka integration. I am using Spark version 2.0.0.
However, I get an unresolved dependency error ("unresolved dependency: org.apache.spark#spark-sql-kafka-0-10_2.11;2.0.0: not found").
How can I access this package? Or am I doing something wrong / missing something?
My build.sbt file:
name := "Spark Streaming"
version := "0.1"
scalaVersion := "2.11.11"
val sparkVersion = "2.0.0"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion,
"org.apache.spark" %% "spark-streaming" % sparkVersion,
"org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
)
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0-preview"
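For comparison, here is a sketch of a build.sbt that should resolve. It assumes the cause of the error is that the `spark-sql-kafka-0-10` artifact was first published for Spark 2.0.2 (Maven Central has no 2.0.0 release of it), so the version bump to 2.0.2 is an assumption, not a confirmed fix:

```scala
// Sketch, assuming spark-sql-kafka-0-10 is only published from Spark 2.0.2 onward.
name := "Spark Streaming"

version := "0.1"

scalaVersion := "2.11.11"

// Assumption: 2.0.2 is the earliest release that ships spark-sql-kafka-0-10.
val sparkVersion = "2.0.2"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
)
```

The separate `"spark-streaming_2.11" % "2.0.0-preview"` line in the original file mixes a preview build with release artifacts of the same module, which can cause version conflicts on its own, so it is omitted in this sketch.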
Thanks for your help.