2016-10-18

I am working through Manning's Unified Log Processing book, and its first exercise is a simple Kafka consumer in Java. When run, the program simply stops at the call to consumer.poll(). The Kafka Java client does not consume - it just hangs at consumer.poll.

I am running this from the Vagrant environment supplied by the book's author, which is at git clone https://github.com/alexanderdean/Unified-Log-Processing.git

zookeeper-3.4.6
kafka_2.10-0.8.2.1

I created a topic using the following command line:

./kafka-topics.sh --create --topic raw --zookeeper localhost:2181 --replication-factor 1 --partitions 1 
Created topic "raw". 

kafka-console-producer and kafka-console-consumer work as expected.

./kafka-console-producer.sh --topic raw --broker-list localhost:9092 
[2016-10-17 14:09:05,899] WARN Property topic is not valid (kafka.utils.VerifiableProperties) 
one 
two 
three 
four 
five 


./kafka-console-consumer.sh --topic raw --from-beginning --zookeeper localhost:2181 
one 
two 
three 
four 
five 
^CConsumed 5 messages 

The Java code I am testing is very basic; it just creates a consumer. I am using:

StreamApp.java

package nile; 

public class StreamApp {               

    public static void main(String[] args){          
     String servers = args[0];            
     String groupId = args[1]; 
     String inTopic = args[2]; 
     String goodTopic = args[3];            

     Consumer consumer = new Consumer(servers, groupId, inTopic);    
     consumer.run();            
    }                   
} 

Consumer.java

package nile; 

import java.util.*; 

import org.apache.kafka.clients.consumer.*; 

public class Consumer {                                                

    private final KafkaConsumer<String, String> consumer;    // a                                   
    private final String topic;                                              

    public Consumer(String servers, String groupId, String topic) {                                     
     this.consumer = new KafkaConsumer<String, String>(createConfig(servers, groupId));                               
     this.topic = topic;                                               
     System.out.println("Topic to listen for:" + this.topic + ":");                                    
    }                                                    

    public void run() {                                                
     System.out.println("Starting to listen for items ");                                      
     this.consumer.subscribe(Arrays.asList(this.topic));    // b                                  
     try {                                                  
      while (true) {                                               
       System.out.println("Subscribed to: " + consumer.subscription());                                 
       System.out.println("Inside the loop");                                        
       ConsumerRecords<String, String> records = consumer.poll(100); // c                                 
       System.out.println("After consuming");                                        
       for (ConsumerRecord<String, String> record : records) {                                    
        System.out.println("Got an item from kafka: " + record.value());                                
       }                                                 
      }                                                  
     } finally {                                                 
      consumer.close();                                              
     }                                                   
    }                                                    

    private static Properties createConfig(String servers, String groupId) {                                  

     Properties props = new Properties();                                          
     props.put("bootstrap.servers", servers);                                         
     props.put("group.id", groupId);         // e                                  
     props.put("enable.auto.commit", "true");                                         
     props.put("auto.commit.interval.ms", "1000");                                        
     props.put("auto.offset.reset", "earliest");                                         
     props.put("session.timeout.ms", "30000");                                         
     props.put("key.deserializer",                                            
        "org.apache.kafka.common.serialization.StringDeserializer"); // a                                
     props.put("value.deserializer",                                            
        "org.apache.kafka.common.serialization.StringDeserializer"); // a     
     return props;                                                
    }                                                    
} 

The libraries are (from my build.gradle):

dependencies {          // b 
    compile 'org.apache.kafka:kafka-clients:0.9.0.0' 
    compile 'com.maxmind.geoip:geoip-api:1.2.14' 
    compile 'com.fasterxml.jackson.core:jackson-databind:2.6.3' 
    compile 'org.slf4j:slf4j-api:1.7.5' 
} 

I run the code as:

java -jar ./build/libs/nile-0.1.0.jar localhost:9092 ulp-ch03-3.3 raw enriched 

The output is:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder". 
SLF4J: Defaulting to no-operation (NOP) logger implementation 
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details. 
Topic to listen for:raw: 
Starting to listen for items 
Subscribed to: [raw] 
Inside the loop 

And this is exactly where it stops. consumer.poll() never returns anything, and it never times out. I have no idea what is wrong here. I have been pulling my hair out over this for two days now; any help getting this working would be greatly appreciated. :)

Answer

It appears you are using the 0.9.x consumer API to consume messages from a 0.8.x broker, which is not supported because 0.9.0.0 introduced protocol changes that are incompatible with earlier broker versions. Either use the old consumer (i.e. the Scala consumer) or upgrade your Kafka server to 0.9.x.
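If you want to stay on the 0.8.2.1 broker, the old high-level (Scala) consumer can be driven from Java. A minimal sketch, assuming you swap the kafka-clients 0.9.0.0 dependency for org.apache.kafka:kafka_2.10:0.8.2.1, and with the class name OldConsumer being purely illustrative:

package nile;

import java.util.*;

import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

public class OldConsumer {

    public static void main(String[] args) {
        String zookeeper = args[0];  // e.g. localhost:2181 - the old consumer talks to ZooKeeper
        String groupId   = args[1];
        String topic     = args[2];

        // Old-consumer configuration uses zookeeper.connect rather than bootstrap.servers,
        // and "smallest" rather than "earliest" for auto.offset.reset
        Properties props = new Properties();
        props.put("zookeeper.connect", zookeeper);
        props.put("group.id", groupId);
        props.put("auto.offset.reset", "smallest");
        props.put("auto.commit.interval.ms", "1000");

        ConsumerConnector connector =
            kafka.consumer.Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

        // Request a single stream for the topic
        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        topicCountMap.put(topic, 1);
        Map<String, List<KafkaStream<byte[], byte[]>>> streams =
            connector.createMessageStreams(topicCountMap);

        // Iterate over incoming messages; hasNext() blocks until a message arrives
        ConsumerIterator<byte[], byte[]> it = streams.get(topic).get(0).iterator();
        while (it.hasNext()) {
            System.out.println("Got an item from kafka: " + new String(it.next().message()));
        }
    }
}

That said, upgrading the broker to 0.9.x (the second option) lets you keep the new-consumer code from the question unchanged.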


Thanks! I upgraded Kafka and the program now works fine. – Raj