2015-12-02

I am trying to compile a very simple Spark project with sbt 0.13.8, but the build fails with an unresolved-dependency problem.

Test.scala

import org.apache.spark.SparkContext 
import org.apache.spark.SparkContext._ 
import org.apache.spark.SparkConf 

object SimpleApp { 
    def main(args: Array[String]) { 
    val logFile = "YOUR_SPARK_HOME/README.md" // Should be some file on your system 
    val conf = new SparkConf().setAppName("Simple Application") 
    val sc = new SparkContext(conf) 
    val logData = sc.textFile(logFile, 2).cache() 
    val numAs = logData.filter(line => line.contains("a")).count() 
    val numBs = logData.filter(line => line.contains("b")).count() 
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs)) 
    } 
} 

The build.sbt file in the project root directory is as follows:

name := "Test" 

version := "1.0" 

scalaVersion := "2.11.7" 

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" 

resolvers ++= Seq(
    "Apache Repository" at "https://repository.apache.org/content/repositories/releases/", 
    "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/", 
    Resolver.sonatypeRepo("public") 
) 

The error returned by sbt compile is:

[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] ::   UNRESOLVED DEPENDENCIES   :: 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] :: oro#oro;2.0.8: configuration not found in oro#oro;2.0.8: 'master(compile)'. Missing configuration: 'compile'. It was required from org.apache.spark#spark-core_2.11;1.5.2 compile 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] 
[warn] Note: Unresolved dependencies path: 
[warn]  oro:oro:2.0.8 
[warn]  +- org.apache.spark:spark-core_2.11:1.5.2 (/home/osboxes/Documents/bookings/Test/build.sbt#L7-8) 
[warn]  +- default:test_2.11:1.0 
sbt.ResolveException: unresolved dependency: oro#oro;2.0.8: configuration not found in oro#oro;2.0.8: 'master(compile)'. Missing configuration: 'compile'. It was required from org.apache.spark#spark-core_2.11;1.5.2 compile 
    at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:291) 
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:188) 
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:165) 
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:155) 
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:155) 
    at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:132) 
    at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:57) 
    at sbt.IvySbt$$anon$4.call(Ivy.scala:65) 
    at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93) 
    at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78) 
    at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97) 
    at xsbt.boot.Using$.withResource(Using.scala:10) 
    at xsbt.boot.Using$.apply(Using.scala:9) 
    at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58) 
    at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48) 
    at xsbt.boot.Locks$.apply0(Locks.scala:31) 
    at xsbt.boot.Locks$.apply(Locks.scala:28) 
    at sbt.IvySbt.withDefaultLogger(Ivy.scala:65) 
    at sbt.IvySbt.withIvy(Ivy.scala:127) 
    at sbt.IvySbt.withIvy(Ivy.scala:124) 
    at sbt.IvySbt$Module.withModule(Ivy.scala:155) 
    at sbt.IvyActions$.updateEither(IvyActions.scala:165) 
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1369) 
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1365) 
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$87.apply(Defaults.scala:1399) 
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$87.apply(Defaults.scala:1397) 
    at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:37) 
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1402) 
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1396) 
    at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:60) 
    at sbt.Classpaths$.cachedUpdate(Defaults.scala:1419) 
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1348) 
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1310) 
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47) 
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40) 
    at sbt.std.Transform$$anon$4.work(System.scala:63) 
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226) 
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226) 
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17) 
    at sbt.Execute.work(Execute.scala:235) 
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226) 
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226) 
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159) 
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
    at java.lang.Thread.run(Thread.java:745) 
[error] (*:update) sbt.ResolveException: unresolved dependency: oro#oro;2.0.8: configuration not found in oro#oro;2.0.8: 'master(compile)'. Missing configuration: 'compile'. It was required from org.apache.spark#spark-core_2.11;1.5.2 compile 

How can I solve this dependency problem?

EDIT

I followed both of @mark91's suggestions:

  • changed the Scala version to 2.10.5
  • changed the Spark dependency to libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.2.0-cdh5.3.2" % "provided"

However, I still get an unresolved dependency on org.scala-lang#scala-library;2.10.4:

[error] sbt.ResolveException: unresolved dependency: org.scala-lang#scala-library;2.10.4: configuration not found in org.scala-lang#scala-library;2.10.4: 'master(compile)'. Missing configuration: 'compile'. It was required from org.apache.spark#spark-core_2.10;1.2.0-cdh5.3.2 compile 

Do you have any idea why I am getting this problem?

Answers

11

Your problem is that this Spark build is written for Scala 2.10, so you should use Scala 2.10 instead of 2.11.

Example:

scalaVersion := "2.10.5" 

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.2.0-cdh5.3.2" % "provided" 


resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/" 
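An equivalent form uses sbt's `%%` operator, which appends the Scala binary-version suffix to the artifact name automatically, so the dependency stays in sync with `scalaVersion`. A sketch of the same dependency as above:

```scala
scalaVersion := "2.10.5"

// %% expands the artifact name to spark-core_2.10 based on scalaVersion,
// so the suffix cannot silently drift from the compiler version.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0-cdh5.3.2" % "provided"
```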
When I use Scala 2.10 (2.10.4 to be exact), I get some errors. Moreover, there are two more unresolved dependencies: 'org.scala-lang#scala-library;2.10.4' and 'org.scala-lang#scala-reflect;2.10.4' – Pop

Try using 2.10.5. I have updated the answer with a working example that I am using. – mgaido

There is still a problem with your new solution: '[error] sbt.ResolveException: unresolved dependency: org.scala-lang#scala-library;2.10.4: configuration not found in org.scala-lang#scala-library;2.10.4: 'master(compile)'. Missing configuration: 'compile'. It was required from org.apache.spark#spark-core_2.10;1.2.0-cdh5.3.2 compile' – Pop

5

I had a similar dependency problem and solved it by reloading the plugins and updating the dependencies. I think your dependency problem is due to the Ivy cache. Usually, if no dependency-management configuration has changed since the last successful resolution and the retrieved files are still present, sbt does not ask Ivy to perform resolution again.

Try running:

sbt reload plugins 
sbt update 
sbt reload 
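If reloading alone does not help, another common fix for "configuration not found" errors is to delete the stale entries from the Ivy cache so that sbt re-downloads them on the next update. This is a sketch assuming the default cache location (`~/.ivy2`) and the two artifacts named in the errors above:

```shell
# Remove the cached (possibly corrupt) Ivy metadata for the failing
# artifacts; rm -rf succeeds even if the directories are already gone.
rm -rf "$HOME/.ivy2/cache/oro"
rm -rf "$HOME/.ivy2/cache/org.scala-lang"
# Then re-run the resolution from scratch:
# sbt clean update
```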

If it still doesn't work, follow the instructions at http://www.scala-sbt.org/0.13/docs/Dependency-Management-Flow.html
