I'm trying to build a self-contained Spark application with sbt, following this tutorial:
http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications
However, compilation fails with the following errors:
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:22: object ml is not a member of package org.apache.spark
[error] import org.apache.spark.ml.evaluation.RegressionEvaluator
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:23: object ml is not a member of package org.apache.spark
[error] import org.apache.spark.ml.recommendation.ALS
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:25: object sql is not a member of package org.apache.spark
[error] import org.apache.spark.sql.SparkSession
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:46: not found: value SparkSession
[error] val spark = SparkSession
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:61: not found: type ALS
[error] val als = new ALS()
[error] ^
[error] 5 errors found
[error] (compile:compileIncremental) Compilation failed
Why is this happening? By the way, the Spark version is 2.0.0.
What does your build.sbt file look like? It seems you haven't specified the libraries correctly. – GameOfThrows
@GameOfThrows The build.sbt file: `name := "ALSExample" version := "1.0" scalaVersion := "2.11.7" libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"` –
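The errors point at the build rather than the code: only `spark-core` is declared as a dependency, but the imports reference `org.apache.spark.sql` and `org.apache.spark.ml`, which ship in the separate `spark-sql` and `spark-mllib` artifacts. A minimal sketch of a build.sbt that adds those modules (artifact names are the standard Spark Maven coordinates; the version is assumed to match the 2.0.0 mentioned above):

```scala
name := "ALSExample"

version := "1.0"

scalaVersion := "2.11.7"

// spark-core alone does not provide org.apache.spark.sql or org.apache.spark.ml;
// SparkSession lives in spark-sql, and ALS / RegressionEvaluator live in spark-mllib.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "2.0.0",
  "org.apache.spark" %% "spark-sql"   % "2.0.0",
  "org.apache.spark" %% "spark-mllib" % "2.0.0"
)
```

After updating the file, re-running `sbt package` should resolve the missing packages, assuming sbt can reach a repository that hosts the Spark 2.0.0 artifacts.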