
How to format a date in a Logstash filter

I am using Logstash to output JSON messages to an API, reading the logs from a log file. My configuration works fine and it sends all the messages to the API. Here is a sample log file:

Log file:

2014 Jun 01 18:57:34:158 GMT +5 BW.Customer_01_001_009-Process_Archive Info [BW-Core] BWENGINE-300009 BW Plugins: version 5.10.0, build V48, 2012-6-3 
2014 Jun 01 18:57:34:162 GMT +5 BW.Customer_01_001_009-Process_Archive Info [BW-Core] BWENGINE-300010 XML Support: TIBCOXML Version 5.51.500.003 
2014 Jun 01 18:57:34:162 GMT +5 BW.Customer_01_001_009-Process_Archive Info [BW-Core] BWENGINE-300011 Java version: Java HotSpot(TM) Server VM 20.5-b03 
2014 Jun 01 18:57:34:162 GMT +5 BW.Customer_01_001_009-Process_Archive Info [BW-Core] BWENGINE-300012 OS version: i386 Linux 3.11.0-12-generic 
2014 Jun 01 18:57:41:018 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core] Duplicate message map entry for BW-HTTP-100118 
2014 Jun 01 18:57:41:027 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core] Duplicate message map entry for BW-HTTP-100206 
2014 Jun 01 18:57:41:408 GMT +5 BW.Customer_01_001_009-Process_Archive Info [BW-Core] BWENGINE-300013 Tibrv string encoding: ISO8859-1 
2014 Jun 01 18:57:42:408 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core] Duplicate message map entry for BW-HTTP-100118 
2014 Jun 01 18:57:42:408 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core] Duplicate message map entry for BW-HTTP-100206 
2014 Jun 01 18:57:42:555 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core] Duplicate message map entry for BW-HTTP-100118 
2014 Jun 01 18:57:42:555 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core] Duplicate message map entry for BW-HTTP-100206 
2014 Jun 01 18:57:42:557 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core] Duplicate message map entry for BW-HTTP-100118 
2014 Jun 01 18:57:42:557 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core] Duplicate message map entry for BW-HTTP-100206 
2014 Jun 01 18:57:42:595 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core] Duplicate message map entry for BW-HTTP-100118 
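For reference, a minimal sketch of what the input and output sides of such a pipeline can look like (the log path and API URL below are placeholders, not real values):

input { 
  file { 
    # placeholder path to the application log shown above 
    path => "/var/log/bw/process_archive.log" 
    start_position => "beginning" 
    # tag the events so the filter condition below matches 
    type => "bw5applog" 
  } 
} 

output { 
  http { 
    # placeholder API endpoint; events are posted as JSON 
    url => "http://localhost:8080/api/logs" 
    http_method => "post" 
    format => "json" 
  } 
} 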

I am using grok patterns to parse this log file. Below is my sample configuration file:

Configuration file:

filter { 
     if [type] == "bw5applog" { 
     grok { 
      match => [ "message", "(?<log_timestamp>%{YEAR}\s%{MONTH}\s%{MONTHDAY}\s%{TIME}:\d{3})\s(?<log_Timezone>%{DATA}\s%{DATA})\s(?<log_MessageTitle>%{DATA})(?<MessageType>%{LOGLEVEL})%{SPACE}\[%{DATA:ProcessName}\]%{SPACE}%{GREEDYDATA:Message}" ] 
      add_tag => [ "grokked" ]   
     } 
     mutate { 
      gsub => [ 
      "TimeStamp", "\s", "T", 
      "TimeStamp", ",", "." 
      ] 
     } 
     if !("_grokparsefailure" in [tags]) { 
      grok{ 
        match => [ "message", "%{GREEDYDATA:StackTrace}" ] 
        add_tag => [ "grokked" ]  
       } 
      date { 
        match => [ "timestamp", "yyyy MMM dd HH:mm:ss:SSS" ] 
        target => "TimeStamp" 
        timezone => "UTC" 
       } 
     } 
    } 
} 
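A quick way to check what the grok and date filters actually produce is to run the same filter block against a pasted sample line, using a stdin input and a rubydebug output; this is only a debugging sketch, separate from the real pipeline:

input { 
  stdin { 
    # paste one of the sample log lines on the console 
    type => "bw5applog" 
  } 
} 

# ... the filter block shown above goes here unchanged ... 

output { 
  stdout { 
    # prints every field of the parsed event, so you can see what log_timestamp ends up as 
    codec => rubydebug 
  } 
} 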

I am able to parse the complete log entry as per my requirement, but I want to format the date.

Problem statement:

Currently, the date I get from the parsed log entry is in the following format:

log_timestamp: 2014 May 28 12:07:35:927 

But my API expects the date in the following format:

Expected output:

log_timestamp: 2014-05-28T12:07:35:927 

How can I achieve this using the above filter configuration? I tried a few things with it, but I could not get it to work.

Answer


You are applying the date filter to the wrong field. Instead of timestamp, you should apply it to the log_timestamp field, which contains the date you want to parse:

date { 
     match => [ "log_timestamp", "yyyy MMM dd HH:mm:ss:SSS" ] 
     target => "log_timestamp" 
     timezone => "UTC" 
} 

Also, the mutate filter has no effect, since it is applied to a field (TimeStamp) that does not exist.
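Putting it together, a corrected version of the filter from the question could look like the sketch below; the only changes are that the date filter now points at log_timestamp and the mutate block on the non-existent TimeStamp field is dropped:

filter { 
  if [type] == "bw5applog" { 
    grok { 
      match => [ "message", "(?<log_timestamp>%{YEAR}\s%{MONTH}\s%{MONTHDAY}\s%{TIME}:\d{3})\s(?<log_Timezone>%{DATA}\s%{DATA})\s(?<log_MessageTitle>%{DATA})(?<MessageType>%{LOGLEVEL})%{SPACE}\[%{DATA:ProcessName}\]%{SPACE}%{GREEDYDATA:Message}" ] 
      add_tag => [ "grokked" ] 
    } 
    if !("_grokparsefailure" in [tags]) { 
      grok { 
        # same catch-all capture as in the question 
        match => [ "message", "%{GREEDYDATA:StackTrace}" ] 
        add_tag => [ "grokked" ] 
      } 
      date { 
        # parse the string captured by grok and store the result back into log_timestamp 
        match => [ "log_timestamp", "yyyy MMM dd HH:mm:ss:SSS" ] 
        target => "log_timestamp" 
        timezone => "UTC" 
      } 
    } 
  } 
} 

Note that the converted field is rendered in ISO 8601 form (for example 2014-05-28T12:07:35.927Z), i.e. with a dot rather than a colon before the milliseconds. If the API strictly requires the colon shown in the expected output, the field would additionally have to be post-processed as a string (for example with a mutate gsub) after the date filter.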