I am trying to filter Kafka events from multiple topics, but once all the events from one topic have been consumed, Logstash is unable to fetch events from the other Kafka topic. The topics I use have 3 partitions and 2 replicas. Below is my Logstash configuration file (Logstash with multiple kafka inputs):
input {
  kafka {
    auto_offset_reset => "smallest"
    consumer_id => "logstashConsumer1"
    topic_id => "unprocessed_log1"
    zk_connect => "192.42.79.67:2181,192.41.85.48:2181,192.10.13.14:2181"
    type => "kafka_type_1"
  }
  kafka {
    auto_offset_reset => "smallest"
    consumer_id => "logstashConsumer1"
    topic_id => "unprocessed_log2"
    zk_connect => "192.42.79.67:2181,192.41.85.48:2181,192.10.13.14:2181"
    type => "kafka_type_2"
  }
}
filter {
  if [type] == "kafka_type_1" {
    csv {
      separator => " "
      source => "data"
    }
  }
  if [type] == "kafka_type_2" {
    csv {
      separator => " "
      source => "data"
    }
  }
}
output {
  stdout { codec => rubydebug { metadata => true } }
}
Try using a different consumer_id in your second 'kafka' input – Val
@Val Thank you very much, using a different consumer_id (e.g. 'logstashConsumer2') worked! – Abhijeet
So you managed to make it work? – Val
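As the comments indicate, the problem is that both kafka inputs share the same consumer_id, so they register with Zookeeper as the same consumer and one topic's partitions end up unclaimed after a rebalance. A minimal sketch of the corrected input section, assuming the same brokers and topics as above, with only the second consumer_id changed:

```
input {
  kafka {
    auto_offset_reset => "smallest"
    consumer_id => "logstashConsumer1"
    topic_id => "unprocessed_log1"
    zk_connect => "192.42.79.67:2181,192.41.85.48:2181,192.10.13.14:2181"
    type => "kafka_type_1"
  }
  kafka {
    auto_offset_reset => "smallest"
    # A distinct consumer_id so this input does not collide with the first one
    consumer_id => "logstashConsumer2"
    topic_id => "unprocessed_log2"
    zk_connect => "192.42.79.67:2181,192.41.85.48:2181,192.10.13.14:2181"
    type => "kafka_type_2"
  }
}
```

With distinct consumer ids, each input maintains its own offsets in Zookeeper and both topics are consumed independently.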