2017-10-18

Spark: hide log messages

I am working on a Scala application with Spark dependencies. Here is what I have:

log4j.properties

# Here we have defined root logger 
log4j.rootLogger=WARN,ERROR,R 

#Direct log messages to file 
log4j.appender.R=org.apache.log4j.RollingFileAppender 
log4j.appender.R.File=./logging.log 
log4j.appender.R.layout=org.apache.log4j.PatternLayout 
log4j.appender.R.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L- %m%n 

# Change this to set Spark log level 
log4j.logger.org.apache.spark=ERROR 

# Silence akka remoting 
#log4j.logger.Remoting=WARN 

# Ignore messages below warning level from Jetty, because it's a bit verbose 
#log4j.logger.org.eclipse.jetty=WARN 

Main class

import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession

org.apache.log4j.PropertyConfigurator.configure("./log4j.properties")

val sparkSession = SparkSession.builder.master("local").appName("spark session example").getOrCreate()
val spark = sparkSession.sqlContext

Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)

val log = Logger.getLogger(this.getClass.getName)
log.error("This is error")
log.warn("This is warn")
println("log ok")

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 
     <modelVersion>4.0.0</modelVersion> 

     <groupId>com.ayd</groupId> 
     <artifactId>data2</artifactId> 
     <version>0.0.1-SNAPSHOT</version> 
     <packaging>jar</packaging> 

     <name>data2</name> 
     <url>http://maven.apache.org</url> 

     <properties> 
     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding> 
     </properties> 

     <dependencies> 
     <dependency> 
       <groupId>org.apache.spark</groupId> 
       <artifactId>spark-catalyst_2.11</artifactId> 
       <version>2.0.0</version> 
      </dependency> 

      <dependency> 
       <groupId>org.apache.spark</groupId> 
       <artifactId>spark-core_2.11</artifactId> 
       <version>2.0.0</version> 
      </dependency> 

      <dependency> 
       <groupId>org.apache.spark</groupId> 
       <artifactId>spark-sql_2.11</artifactId> 
       <version>2.0.0</version> 

      </dependency> 

      <dependency> 
       <groupId>org.postgresql</groupId> 
       <artifactId>postgresql</artifactId> 
       <version>42.1.3</version> 
      </dependency> 

      <!-- https://mvnrepository.com/artifact/log4j/log4j --> 
      <dependency> 
       <groupId>log4j</groupId> 
       <artifactId>log4j</artifactId> 
       <version>1.2.17</version> 
      </dependency> 
     <dependency> 
      <groupId>junit</groupId> 
      <artifactId>junit</artifactId> 
      <version>3.8.1</version> 
      <scope>test</scope> 
     </dependency> 
     </dependencies> 
    </project> 

I only want the messages I log myself to show up, not the Spark logs. But when I run this, I still find Spark output in the log file:

2017-10-18 11:58:38 WARN NativeCodeLoader:62- Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-10-18 11:58:41 ERROR Run$:23- This is error
2017-10-18 11:58:41 WARN Run$:24- This is warn

How can I get rid of these extra log messages at the top of the log file?

Thanks in advance.

Answer

You set the logging level for the Spark classes to WARN, which is why you see this warning.

If you want to filter out all Spark warnings (not that it's recommended), you can set the level for Spark to ERROR in your log4j properties file:

# Change this to set Spark log level 
log4j.logger.org.apache.spark=ERROR 

The other warning comes from Hadoop code (package org.apache.hadoop), so to disable it (along with all other Hadoop warnings) you can set that package's level to ERROR as well:

log4j.logger.org.apache.hadoop=ERROR 
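Putting both pieces together, a minimal log4j.properties sketch that keeps your own messages in the rolling file while raising Spark and Hadoop to ERROR might look like the following (appender name R and file path taken from the question). Note also that in log4j 1.x the rootLogger line takes a single level followed by appender names, so WARN,ERROR,R would make log4j look for an appender named ERROR:

# Root logger: one level, then the list of appenders
log4j.rootLogger=WARN, R

# Rolling file appender, as in the question
log4j.appender.R=org.apache.log4j.RollingFileAppender
log4j.appender.R.File=./logging.log
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L- %m%n

# Raise noisy packages to ERROR; your own loggers still inherit WARN from the root
log4j.logger.org.apache.spark=ERROR
log4j.logger.org.apache.hadoop=ERROR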
I still get this: 2017-10-18 14:16:35 WARN NativeCodeLoader:62- Unable to load native-hadoop library for your platform... using builtin-java classes where applicable – maher

Updated the post to address this. –

I updated it and just checked. Thanks a lot. – maher