Spark Java - Why can't Tuple2 be the parameter of the function in map?

I'm new to Spark. I wrote the following Java code:
JavaPairRDD<Detection, Detection> allCombinations = currentLevelNodes.cartesian(nextLevelNodes);
allCombinations.map(new Function<Tuple2<Detection, Detection>, Segment>() {
    public Segment call(Tuple2<Detection, Detection> combination) {
        Segment segment = new Segment();
        Detection a = combination._1();
        Detection b = combination._2();
        segment.distance = Math.sqrt(Math.pow(a.x - b.x, 2) + Math.pow(a.y - b.y, 2));
        return segment;
    }
});
My IDE (Eclipse Neon) shows the following messages:
Multiple markers at this line
- The method map(Function<Tuple2<Detection,Detection>,R>) in the type
AbstractJavaRDDLike<Tuple2<Detection,Detection>,JavaPairRDD<Detection,Detection>> is not applicable for the arguments (new
Function<Tuple2<Detection,Detection>,Segment>(){})
- The type new Function<Tuple2<Detection,Detection>,Segment>(){} must implement the inherited abstract method
Function<Tuple2<Detection,Detection>,Segment>.apply(Tuple2<Detection,Detection>)
How can I fix this?
The error says you are implementing call(), but it expects apply(). Using a lambda will make your life easier – Jack
Also, are you sure you can use Tuple2, rather than JavaPairRDD, as the parameter of the map function? – Jack
I tried renaming the method from "call" to "apply", but it still shows the first message. – lucienlo
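Setting Spark aside, the per-pair distance logic from the question can be checked on its own. Below is a minimal sketch in plain Java; the stripped-down Detection and Segment classes and the toSegment helper are stand-ins invented for illustration, not the asker's actual classes:

```java
// Minimal stand-ins for the asker's Detection and Segment classes
// (assumptions for illustration only), to verify the Euclidean
// distance computed inside call() in the question.
class Detection {
    double x, y;
    Detection(double x, double y) { this.x = x; this.y = y; }
}

class Segment {
    double distance;
}

public class DistanceSketch {
    // Same computation as the body of call() in the question,
    // just without the Tuple2 wrapper.
    static Segment toSegment(Detection a, Detection b) {
        Segment segment = new Segment();
        segment.distance = Math.sqrt(Math.pow(a.x - b.x, 2) + Math.pow(a.y - b.y, 2));
        return segment;
    }

    public static void main(String[] args) {
        // 3-4-5 right triangle: distance from (0,0) to (3,4) is 5.0
        Segment s = toSegment(new Detection(0, 0), new Detection(3, 4));
        System.out.println(s.distance); // 5.0
    }
}
```

If this prints the expected distance, the remaining problem is purely about which Function interface the anonymous class is implementing, not about the math.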