Spark HashPartitioner partitions unexpectedly

I am using HashPartitioner but getting unexpected results. I use 3 distinct strings as keys and pass 3 as the partition count, so I expect 3 partitions.
import org.apache.spark.HashPartitioner

val cars = Array("Honda", "Toyota", "Kia")
// (car, price) pairs, initially spread across 8 partitions
val carnamePrice = sc.parallelize(for {
  x <- cars
  y <- Array(100, 200, 300)
} yield (x, y), 8)

// Re-partition by key into 3 partitions
val rddEachCar = carnamePrice.partitionBy(new HashPartitioner(3))

// Tag each element with the index of the partition it landed in
val mapped = rddEachCar.mapPartitionsWithIndex { (index, iterator) =>
  println("Called in Partition -> " + index)
  iterator.toList.map(x => x + " -> " + index).iterator
}
mapped.take(10)
mapped.take(10)
The result is below: it only produces 2 partitions. I checked the hash codes of the strings (69909220, 75427, -1783892706). What is wrong here? Perhaps I have misunderstood the partitioning algorithm.
Array[String] = Array((Toyota,100) -> 0, (Toyota,200) -> 0, (Toyota,300) -> 0, (Honda,100) -> 1, (Honda,200) -> 1, (Honda,300) -> 1, (Kia,100) -> 1, (Kia,200) -> 1, (Kia,300) -> 1)
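For reference, HashPartitioner assigns a key to a partition essentially as a non-negative modulus of `key.hashCode` by `numPartitions`. Under that assumption, the hash codes quoted above already explain the output: two of the three keys happen to collide modulo 3. A minimal Java sketch (Java's `String.hashCode` uses the same algorithm as Scala strings; `Math.floorMod` stands in for Spark's internal `nonNegativeMod`, which is a hypothetical simplification here):

```java
// Sketch: reproduce the partition assignment, assuming HashPartitioner
// computes a non-negative modulus of key.hashCode by numPartitions.
// Math.floorMod returns a non-negative result even for negative hash codes.
public class PartitionCheck {
    static int partition(String key, int numPartitions) {
        return Math.floorMod(key.hashCode(), numPartitions);
    }

    public static void main(String[] args) {
        for (String car : new String[] {"Honda", "Toyota", "Kia"}) {
            System.out.println(car + " (" + car.hashCode() + ") -> partition "
                + partition(car, 3));
        }
        // Honda  (69909220)    -> partition 1
        // Toyota (-1783892706) -> partition 0
        // Kia    (75427)       -> partition 1
    }
}
```

So Honda and Kia both map to partition 1 and partition 2 simply receives no data, which matches the observed output. HashPartitioner guarantees at most `numPartitions` partitions, not that each distinct key gets its own partition.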