2017-03-01

Spark Java - why can't Tuple2 be the parameter of the function passed to map?

I'm new to Spark. I wrote the following Java code:

JavaPairRDD<Detection, Detection> allCombinations = currentLevelNodes.cartesian(nextLevelNodes);

allCombinations.map(new Function<Tuple2<Detection, Detection>, Segment>(){
    public Segment call(Tuple2<Detection, Detection> combination){
        Segment segment = new Segment();
        Detection a = combination._1();
        Detection b = combination._2();
        segment.distance = Math.sqrt(Math.pow((a.x)-(b.x), 2) + Math.pow((a.y)-(b.y), 2));

        return segment;
    }
});


The IDE (Eclipse Neon) shows the following messages:

Multiple markers at this line 
- The method map(Function<Tuple2<Detection,Detection>,R>) in the type 
AbstractJavaRDDLike<Tuple2<Detection,Detection>,JavaPairRDD<Detection,Detection>> is not applicable for the arguments (new 
Function<Tuple2<Detection,Detection>,Segment>(){}) 
- The type new Function<Tuple2<Detection,Detection>,Segment>(){} must implement the inherited abstract method 
Function<Tuple2<Detection,Detection>,Segment>.apply(Tuple2<Detection,Detection>) 


How can I fix this?


The error says you are implementing call(), but you should be implementing apply(). Using a lambda will make your life easier – Jack


Also, are you sure you can use Tuple2, rather than JavaPairRDD, as the parameter of the map function? – Jack


I tried renaming the method from "call" to "apply", but it still shows the first message. – lucienlo

Answer


I used a lambda expression and it solved the problem.

Here is my sample code:

JavaPairRDD<Detection,Detection> allCombinations = currentLevelNodes.cartesian(nextLevelNodes); 

JavaRDD<Segment> segmentRDD = allCombinations.map(tuple -> new Segment(tuple._1, tuple._2)); 


and the constructor I added:

public Segment(Detection a, Detection b){
    this.distance = Math.sqrt(Math.pow(a.x - b.x, 2) + Math.pow(a.y - b.y, 2));
}


and it works.
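For anyone who wants to verify the distance math itself, it can be checked outside Spark entirely. Below is a minimal standalone sketch; the `Detection` and `Segment` classes here are simplified stand-ins I assumed from the question, not the original classes:

```java
// Simplified stand-ins for the classes used in the question (assumed shapes).
class Detection {
    double x, y;
    Detection(double x, double y) { this.x = x; this.y = y; }
}

class Segment {
    double distance;
    // The constructor from the answer: Euclidean distance between two detections.
    Segment(Detection a, Detection b) {
        this.distance = Math.sqrt(Math.pow(a.x - b.x, 2) + Math.pow(a.y - b.y, 2));
    }
}

public class Main {
    public static void main(String[] args) {
        // A 3-4-5 triangle makes the expected result obvious.
        Segment s = new Segment(new Detection(0, 0), new Detection(3, 4));
        System.out.println(s.distance); // prints 5.0
    }
}
```

Inside Spark, the lambda `tuple -> new Segment(tuple._1, tuple._2)` just calls this constructor for each pair produced by `cartesian`.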


Hope this answer helps someone like me – lucienlo
