2017-08-25

I am using SBT with Scala and Spark to try to stream Twitter data. Everything compiles, but at runtime I get a Spark streaming exception in thread "main".

This is my build.sbt:

import Assembly._ 
import AssemblyPlugin._ 

name := "TwitterSparkStreaming" 
version := "0.1" 
scalaVersion := "2.12.3" 

libraryDependencies ++= Seq(
    "org.apache.spark" % "spark-core_2.11" % "1.5.2", 
    "org.apache.spark" % "spark-sql_2.11" % "1.5.2", 
    "org.apache.spark" % "spark-streaming_2.11" % "1.5.2", 
    "org.apache.spark" % "spark-streaming-twitter_2.11" % "1.6.3", 
    "joda-time" %% "joda-time" % "2.9.1", 
    "org.twitter4j" % "twitter4j-core" % "3.0.3", 
    "org.twitter4j" % "twitter4j-stream" % "3.0.3", 
    "edu.stanford.nlp" % "stanford-corenlp" % "3.5.2", 
    "edu.stanford.nlp" % "stanford-corenlp" % "3.5.2" classifier "models" 
) 

resolvers += "Akka Repository" at "http://repo.akka.io./releases/" 

assemblyMergeStrategy in assembly := { 
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard 
    case x => MergeStrategy.first 
} 

This is the class that uses org.apache.spark.Logging:

import org.apache.log4j.{Logger, Level} 
import org.apache.spark.Logging 

object LogUtils extends Logging {
    def setStreamingLogLevels(): Unit = {
    val log4jInitialized = Logger.getRootLogger.getAllAppenders.hasMoreElements
    if (!log4jInitialized) {
      logInfo("Setting log level to [WARN] for streaming example." +
        " To override add a custom log4j.properties to the classpath.")
      Logger.getRootLogger.setLevel(Level.WARN)
    }
  }
}

This is the error that keeps appearing:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.Logging.$init$(Lorg/apache/spark/Logging;)V 
    at LogUtils$.<init>(LogUtils.scala:4) 
    at LogUtils$.<clinit>(LogUtils.scala) 
    at TwitterStreaming$.main(TwitterStreaming.scala:30) 
    at TwitterStreaming.main(TwitterStreaming.scala) 

How can I fix it?

Note: I tried changing the org.apache.spark dependencies from version 2.2.0 to 1.5.2, but the problem remained the same.
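(An observation on the build file above, not part of the original question: `scalaVersion := "2.12.3"` conflicts with the `_2.11` artifact suffixes, Spark 1.5.2 was not published for Scala 2.12, and `joda-time` is a plain Java library so it should use `%` rather than `%%`. A sketch of a binary-consistent configuration, with the version numbers as assumptions, might look like this:)

```scala
// Sketch only: keep the Scala binary version and the Spark artifact
// suffixes consistent. Spark 1.5.2 was built for Scala 2.10/2.11.
name := "TwitterSparkStreaming"
version := "0.1"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // "%%" appends the "_2.11" suffix automatically, so it can never
  // disagree with scalaVersion the way hard-coded "_2.11" strings can
  "org.apache.spark" %% "spark-core" % "1.5.2",
  "org.apache.spark" %% "spark-sql" % "1.5.2",
  "org.apache.spark" %% "spark-streaming" % "1.5.2",
  "org.apache.spark" %% "spark-streaming-twitter" % "1.5.2",
  "joda-time" % "joda-time" % "2.9.1" // Java library: single "%"
)
```

Mixing Spark module versions (1.5.2 core with a 1.6.3 streaming-twitter, as above) or Scala binary versions is a common source of `NoSuchMethodError` at runtime, since the classes link against APIs that differ between releases.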


Try this: https://community.hortonworks.com/questions/58286/noclassdeffounderror-orgapachesparklogging-using-s.html –


Thanks @YosiDahari, but that did not work. I just reinstalled Spark with version 1.5.2 and then it worked. Thank you very much –

Answer

I am not sure why this code block gives that error, but there is a better way to set the log level in Spark.

See https://spark.apache.org/docs/latest/api/java/org/apache/spark/SparkContext.html#setLogLevel-java.lang.String-

Spark exposes this method on SparkContext, so you can call:

sparkContext.setLogLevel("WARN")
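As a minimal sketch of that call in context (the app name and local master below are illustrative placeholders, and a Spark runtime is assumed on the classpath):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object LogLevelExample {
  def main(args: Array[String]): Unit = {
    // Placeholder app name and local master for illustration only
    val conf = new SparkConf()
      .setAppName("TwitterSparkStreaming")
      .setMaster("local[2]")
    val sc = new SparkContext(conf)

    // Suppress INFO/DEBUG chatter without shipping a log4j.properties
    sc.setLogLevel("WARN")

    // ... build the StreamingContext and Twitter stream here ...

    sc.stop()
  }
}
```

This avoids extending `org.apache.spark.Logging` at all, which matters because that trait was removed from Spark's public API in the 2.x line, so code compiled against it fails at runtime exactly as in the stack trace above.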


Thank you Ganesh, but it did not work. –


I just reinstalled Spark version 1.5.2 and then it worked. Thank you very much –


@AmaniAlFarasani No problem. If it helped, please upvote the answer. – Ganesh