
YARN threshold error

I am using the new HDP 2.6 with Ambari, on which I have installed YARN, MapReduce, Spark2, Hadoop, and so on. I am trying to start the Spark shell with --master yarn, but I keep getting this kind of error:

$ bin/spark-shell --master yarn --deploy-mode client 


Warning: Ignoring non-spark config property: spark-executor.memory=4g 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 
17/06/12 13:38:38 ERROR SparkContext: Error initializing SparkContext. 
java.lang.IllegalArgumentException: Required executor memory (8192+819 MB) is above the max threshold (8192 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'. 
     at org.apache.spark.deploy.yarn.Client.verifyClusterResources(Client.scala:334) 
     at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:168) 
     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56) 
     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156) 
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:509) 
     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2320) 
     at org.apache.spark.sql.SparkSession$Builder$anonfun$6.apply(SparkSession.scala:868) 
     at org.apache.spark.sql.SparkSession$Builder$anonfun$6.apply(SparkSession.scala:860) 
     at scala.Option.getOrElse(Option.scala:121) 
     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860) 
     at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96) 
     at $line3.$read$iw$iw.<init>(<console>:15) 
     at $line3.$read$iw.<init>(<console>:42) 
     at $line3.$read.<init>(<console>:44) 
     at $line3.$read$.<init>(<console>:48) 
     at $line3.$read$.<clinit>(<console>) 
     at $line3.$eval$.$print$lzycompute(<console>:7) 
     at $line3.$eval$.$print(<console>:6) 
     at $line3.$eval.$print(<console>) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:497) 
     at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786) 
     at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047) 
     at scala.tools.nsc.interpreter.IMain$WrappedRequest$anonfun$loadAndRunReq$1.apply(IMain.scala:638) 
     at scala.tools.nsc.interpreter.IMain$WrappedRequest$anonfun$loadAndRunReq$1.apply(IMain.scala:637) 
     at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31) 
     at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19) 
     at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637) 
     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569) 
     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565) 
     at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807) 
     at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681) 
     at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395) 
     at org.apache.spark.repl.SparkILoop$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38) 
     at org.apache.spark.repl.SparkILoop$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
     at org.apache.spark.repl.SparkILoop$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
     at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214) 
     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37) 
     at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105) 
     at scala.tools.nsc.interpreter.ILoop$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920) 
     at scala.tools.nsc.interpreter.ILoop$anonfun$process$1.apply(ILoop.scala:909) 
     at scala.tools.nsc.interpreter.ILoop$anonfun$process$1.apply(ILoop.scala:909) 
     at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97) 
     at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909) 
     at org.apache.spark.repl.Main$.doMain(Main.scala:69) 
     at org.apache.spark.repl.Main$.main(Main.scala:52) 
     at org.apache.spark.repl.Main.main(Main.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:497) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:745) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 

I also tried this command:

bin/spark-shell --conf spark-executor.memory=4g --conf spark.executor.cores=2 --master yarn --deploy-mode client 

But I still get exactly the same error. These are my YARN resource settings:

[screenshot of the YARN resource configuration]

And here is the application test in Ambari, which succeeded:

[screenshot of the successful Ambari test]

Can anyone tell me what I am doing wrong here? It is driving me crazy. I have been trying to solve this for a week now and I cannot take it any longer. Please, someone. :(

Answer


In your command line:

bin/spark-shell --conf spark-executor.memory=4g --conf spark.executor.cores=2 --master yarn --deploy-mode client 

the property is misspelled: spark-executor.memory should be spark.executor.memory.
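With the property name corrected, the same invocation becomes:

bin/spark-shell --conf spark.executor.memory=4g --conf spark.executor.cores=2 --master yarn --deploy-mode client 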

Spark even tells you this in your log:

Warning: Ignoring non-spark config property: spark-executor.memory=4g 

If 4g is still too high, reduce it to 2g.
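For context, a rough check assuming Spark 2.x defaults: on YARN, Spark requests spark.executor.memory plus a per-executor overhead that defaults to max(384 MB, 10% of the executor memory), which is where the "8192+819 MB" in your error comes from. The total request must fit under yarn.scheduler.maximum-allocation-mb:

8g executor: 8192 + max(384, 0.10 * 8192) = 8192 + 819 = 9011 MB  >  8192 MB cap  -> rejected 
4g executor: 4096 + max(384, 0.10 * 4096) ~ 4096 + 410 = 4506 MB  <= 8192 MB cap  -> fits 

Alternatively, you can raise yarn.scheduler.maximum-allocation-mb (and, if necessary, yarn.nodemanager.resource.memory-mb) in Ambari under YARN > Configs so that larger executors fit.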