I have been trying to solve the problem below without any success (Logstash 2.1, Elasticsearch 2.1, Kibana 4.3.1): Kibana reports "Unable to fetch mapping. Do you have indices matching the pattern?"
Here is my logstash.conf file:
input {
  file {
    path => ["/var/log/network.log"]
    start_position => "beginning"
    type => "syslog"
    tags => [ "netsyslog" ]
  }
} #end input block
########################################
filter {
  if [type] == "syslog" {
    # Split the syslog part and Cisco tag out of the message
    grok {
      match => ["message", "%{CISCO_TAGGED_SYSLOG} %{GREEDYDATA:cisco_message}"]
    }
    # Parse the syslog severity and facility
    #syslog_pri { }
    # Parse the date from the "timestamp" field to the "@timestamp" field
    # 2015-05-01T00:00:00+02:00 is ISO8601
    grok {
      match => ["message", "%{TIMESTAMP_ISO8601:timestamp}"]
    }
    date {
      #2015-05-01T00:00:00+03:00
      match => ["timestamp",
        "yyyy-MM-dd'T'HH:mm:ssZ"
        # "yyyy MM dd HH:mm:ss",
      ]
      #timezone => "Asia/Kuwait"
    }
    # Clean up redundant fields if parsing was successful
    if "_grokparsefailure" not in [tags] {
      mutate {
        rename => ["cisco_message", "message"]
        remove_field => ["timestamp"]
      }
    }
    # Extract fields from each of the detailed message types
    grok {
      match => [
        "message", "%{CISCOFW106001}",
        "message", "%{CISCOFW106006_106007_106010}",
        "message", "%{CISCOFW106014}",
        "message", "%{CISCOFW106015}",
        "message", "%{CISCOFW106021}",
        "message", "%{CISCOFW106023}",
        "message", "%{CISCOFW106100}",
        "message", "%{CISCOFW110002}",
        "message", "%{CISCOFW302010}",
        "message", "%{CISCOFW302013_302014_302015_302016}",
        "message", "%{CISCOFW302020_302021}",
        "message", "%{CISCOFW305011}",
        "message", "%{CISCOFW313001_313004_313008}",
        "message", "%{CISCOFW313005}",
        "message", "%{CISCOFW402117}",
        "message", "%{CISCOFW402119}",
        "message", "%{CISCOFW419001}",
        "message", "%{CISCOFW419002}",
        "message", "%{CISCOFW500004}",
        "message", "%{CISCOFW602303_602304}",
        "message", "%{CISCOFW710001_710002_710003_710005_710006}",
        "message", "%{CISCOFW713172}",
        "message", "%{CISCOFW733100}"
      ]
    }
  }
  if [dst_ip] and [dst_ip] !~ "(^127\.0\.0\.1)|(^10\.)|(^172\.1[6-9]\.)|(^172\.2[0-9]\.)|(^172\.3[0-1]\.)|(^192\.168\.)|(^169\.254\.)" {
    geoip {
      source => "dst_ip"
      database => "/opt/logstash/vendor/GeoLiteCity.dat" ### Change me to location of GeoLiteCity.dat file
      target => "dst_geoip"
    }
  }
  if [src_ip] and [src_ip] !~ "(^127\.0\.0\.1)|(^10\.)|(^172\.1[6-9]\.)|(^172\.2[0-9]\.)|(^172\.3[0-1]\.)|(^192\.168\.)|(^169\.254\.)" {
    geoip {
      source => "src_ip"
      database => "/opt/logstash/vendor/GeoLiteCity.dat" ### Change me to location of GeoLiteCity.dat file
      target => "src_geoip"
    }
  }
  mutate {
    convert => [ "[src_geoip][coordinates]", "float" ]
  }
}
########################################
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost"
    template => "/opt/logstash/elasticsearch-template.json"
    template_overwrite => true
  }
} #end output block
When I watch Logstash's stdout I can see events being parsed, but when I run curl 'localhost:9200/_cat/indices?v' the only index that exists is .kibana, and loading the Kibana interface gives "Unable to fetch mapping. Do you have indices matching the pattern?"
Any help would be appreciated.
Thanks in advance.
If no indices are being created, check the Logstash and Elasticsearch logs. For example, a mapping mismatch will cause documents to be dropped. –
Yes, you are right. I am getting the error below on the entries: "_type"=>"syslog", "_id"=>nil, "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Failed to parse mapping [_default_]: Mapping definition for [SourceGeo] has unsupported parameters: [path : full]", "caused_by"=>{"type"=>"mapper_parsing_exception", "reason"=>"Mapping definition for [SourceGeo] has unsupported parameters: [path : full]"}}, :level=>:warn} –
Check your mapping/template for the definition of SourceGeo. –
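For context: Elasticsearch 2.x removed the path parameter from field mappings, so a template written for Elasticsearch 1.x that still declares "path": "full" on a field is rejected with exactly the mapper_parsing_exception shown above, and every document sent to the index is dropped. Assuming the SourceGeo field named in the error is meant to be a geo_point, a minimal corrected fragment of elasticsearch-template.json might look like the following (the surrounding structure of your actual template is an assumption; the fix is simply deleting the "path": "full" line wherever it appears):

```json
{
  "mappings": {
    "_default_": {
      "properties": {
        "SourceGeo": {
          "type": "geo_point"
        }
      }
    }
  }
}
```

Since your output block already sets template_overwrite => true, the corrected template should be pushed to Elasticsearch the next time Logstash starts; otherwise you would need to delete the old template by hand before restarting.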