
I am reading a JSON file and trying to produce it to Kafka, but the send fails with kafka.common.FailedToSendMessageException. Here is my code:

public class FlatFileDataProducer { 

    private String topic = "JsonTopic"; 
    private Producer<String, String> producer = null; 
    KeyedMessage<String, String> message = null; 
    public JsonReader reader; 

    public void run(String jsonPath) throws ClassNotFoundException, FileNotFoundException, IOException, ParseException{ 
     reader = new JsonReader(); 
     System.out.println("---------------------"); 
     System.out.println("JSON FILE PATH IS : "+jsonPath); 
     System.out.println("---------------------"); 
     Properties prop = new Properties(); 
     prop.put("metadata.broker.list", "192.168.63.145:9092"); 
     prop.put("serializer.class", "kafka.serializer.StringEncoder"); 
     // prop.put("partitioner.class", "example.producer.SimplePartitioner"); 
     prop.put("request.required.acks", "1"); 


     ProducerConfig config = new ProducerConfig(prop); 
     producer = new Producer<String, String>(config); 
     List<Employee> emp = reader.readJsonFile(jsonPath);  
     for (Employee employee : emp) 
     { 
      System.out.println("---------------------"); 
      System.out.println(employee.toString()); 
      System.out.println("---------------------"); 
      message = new KeyedMessage<String, String>(topic, employee.toString()); 

      producer.send(message); 
      producer.close(); 

     } 
     System.out.println("Messages to Kafka successfully"); 
    } 

And the code that reads the JSON file is:

public List<Employee> readJsonFile(String path) throws FileNotFoundException, IOException, ParseException{ 
     Employee employee = new Employee(); 
     parser=new JSONParser(); 
     Object obj = parser.parse(new FileReader(path)); 
     JSONObject jsonObject = (JSONObject) obj; 
     employee.setId(Integer.parseInt(jsonObject.get("id").toString()));  
     employee.setName((String)jsonObject.get("name")); 
     employee.setSalary(Integer.parseInt(jsonObject.get("salary").toString())); 
     list.add(employee); 
     return list; 
    } 
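
The snippet assigns to a parser field and a list field, and it relies on an Employee bean, none of which are shown in the question. A minimal sketch that would be consistent with the method above and with the "1,Smith,25" output later on (assumed, not the asker's actual classes) could look like this:

import java.util.ArrayList; 
import java.util.List; 

import org.json.simple.parser.JSONParser; 

public class JsonReader { 
    // Fields the snippet assigns to but never declares (assumed): 
    private JSONParser parser; 
    private final List<Employee> list = new ArrayList<Employee>(); 

    // readJsonFile(...) exactly as shown above 
} 

// Assumed Employee bean; its toString() matches the "1,Smith,25" console output below. 
class Employee { 
    private int id; 
    private String name; 
    private int salary; 

    public void setId(int id) { this.id = id; } 
    public void setName(String name) { this.name = name; } 
    public void setSalary(int salary) { this.salary = salary; } 

    @Override 
    public String toString() { 
        return id + "," + name + "," + salary; 
    } 
} 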

But when I execute the program, I get problem 1:

> [root@sandbox ~]# java -jar sparkkafka.jar /root/customer.json 
> JSON FILE PATH IS : /root/customer.json 
> log4j:WARN No appenders could be found for logger (kafka.utils.VerifiableProperties). log4j:WARN Please 
> initialize the log4j system properly. 
> 1,Smith,25 
> Exception in thread "main" kafka.common.FailedToSendMessageException: Failed to send messages 
> after 3 tries. 
>   at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:91) 
>   at kafka.producer.Producer.send(Producer.scala:77) 
>   at kafka.javaapi.producer.Producer.send(Producer.scala:33) 
>   at com.up.jsonType.FlatFileDataProducer.run(FlatFileDataProducer.java:41) 
>   at com.up.jsonType.FlatFileDataProducer.main(FlatFileDataProducer.java:49) 

The send reports an error, yet when I check the consumer shell I see the output below: for ONE row in the JSON file there are 4 entries in the shell. Problem 2:

[root@sandbox]# ./kafka-console-consumer.sh --zookeeper localhost:2181 --topic JsonTopic --from-beginning

1,Smith,25 
1,Smith,25 
1,Smith,25 
1,Smith,25 

I receive the same row of data 4 times.

Answers

You need to remove both of the following properties:

//prop.put("request.required.acks", "1"); 
    //prop.put("producer.type","async"); 

These properties actually take care of the required acknowledgements.
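
For reference, a minimal sketch of the producer setup with both properties left out; the broker address, serializer, and topic are copied from the question, while the class name and the hard-coded message are made up for illustration:

import java.util.Properties; 

import kafka.javaapi.producer.Producer; 
import kafka.producer.KeyedMessage; 
import kafka.producer.ProducerConfig; 

// Sketch only: class name and the single hard-coded message are illustrative. 
public class TrimmedProducerSketch { 
    public static void main(String[] args) { 
        Properties prop = new Properties(); 
        prop.put("metadata.broker.list", "192.168.63.145:9092");      // broker from the question 
        prop.put("serializer.class", "kafka.serializer.StringEncoder"); 
        // request.required.acks and producer.type are deliberately left at their defaults 

        Producer<String, String> producer = new Producer<String, String>(new ProducerConfig(prop)); 
        producer.send(new KeyedMessage<String, String>("JsonTopic", "1,Smith,25")); 
        producer.close();   // close once, after all messages have been sent 
    } 
} 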

This solved the problem – Alka


Can you try adding the following property:

prop.put("producer.type","async"); 
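
With the old 0.8 producer API used in the question, producer.type=async makes the producer buffer messages and flush them from a background thread instead of blocking on each send. Dropped into the question's run() method, the config would look roughly like this (a sketch of that fragment, not verified against the asker's setup):

Properties prop = new Properties(); 
prop.put("metadata.broker.list", "192.168.63.145:9092"); 
prop.put("serializer.class", "kafka.serializer.StringEncoder"); 
prop.put("request.required.acks", "1"); 
prop.put("producer.type", "async");    // suggested addition: send from a background thread 

ProducerConfig config = new ProducerConfig(prop); 
Producer<String, String> producer = new Producer<String, String>(config); 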
This solved problem 1 – Alka