test.csv 
name,key1,key2 
A,1,2 
B,1,3 
C,4,3 

I want to transform this data into the following (as a Dataset or RDD) in Spark, with Scala.

whatIwant.csv 
name,key,newkeyname 
A,1,KEYA 
A,2,KEYB 
B,1,KEYA 
B,3,KEYB 
C,4,KEYA 
C,3,KEYB 

I load the .csv data with the read method:

val df = spark.read 
      .option("header", true) 
      .option("charset", "euc-kr") 
      .csv(csvFilePath) 

I could load the data as separate datasets such as (name, key1) and (name, key2) and then union them, but I'd like to do this within a single Spark session. Any ideas on this?
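For reference, the union approach I had in mind looks roughly like this (untested sketch; column names taken from test.csv):

// Untested sketch of the union approach: select each key column separately, 
// tag it with a literal key name, then union the two results. 
import org.apache.spark.sql.functions.lit 
import spark.implicits._ 

val part1 = df.select($"name", $"key1".as("key"), lit("KEYA").as("newkeyname")) 
val part2 = df.select($"name", $"key2".as("key"), lit("KEYB").as("newkeyname")) 
val unioned = part1.union(part2).orderBy("name") 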


These don't work:

val df2 = df.select(df("TAG_NO"), df.map { x => (x.getAs[String]("MK_VNDRNM"), x.getAs[String]("WK_ORD_DT")) }) 

val df2 = df.select(df("TAG_NO"), Seq(df("TAG_NO"), df("WK_ORD_DT"))) 

Have you tried the 'explode' function from DataFrame? – Shankar


Nope. I'll try explode. Thanks :) –


Since key1 and key2 are not in a single column, I don't think explode is the right answer. –

Answer


This can be done with explode and a udf:

scala> var df = Seq(("A", 1, 2), ("B", 1, 3), ("C", 4, 3)).toDF("name", "key1", "key2") 
df: org.apache.spark.sql.DataFrame = [name: string, key1: int ... 1 more field] 

scala> df.show 
+----+----+----+ 
|name|key1|key2| 
+----+----+----+ 
| A| 1| 2| 
| B| 1| 3| 
| C| 4| 3| 
+----+----+----+ 

scala> val explodeUDF = udf((v1: String, v2: String) => Vector((v1, "Key1"), (v2, "Key2"))) 
explodeUDF: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function2>,ArrayType(StructType(StructField(_1,StringType,true), StructField(_2,StringType,true)),true),Some(List(StringType, StringType))) 

scala> df = df.withColumn("TMP", explode(explodeUDF($"key1", $"key2"))).drop("key1", "key2") 
df: org.apache.spark.sql.DataFrame = [name: string, TMP: struct<_1: string, _2: string>] 

scala> df = df.withColumn("key", $"TMP".apply("_1")).withColumn("new key name", $"TMP".apply("_2")) 
df: org.apache.spark.sql.DataFrame = [name: string, TMP: struct<_1: string, _2: string> ... 2 more fields] 

scala> df = df.drop("TMP") 
df: org.apache.spark.sql.DataFrame = [name: string, key: string ... 1 more field] 

scala> df.show 
+----+---+------------+ 
|name|key|new key name| 
+----+---+------------+ 
| A| 1|  Key1| 
| A| 2|  Key2| 
| B| 1|  Key1| 
| B| 3|  Key2| 
| C| 4|  Key1| 
| C| 3|  Key2| 
+----+---+------------+ 
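
If you would rather avoid a udf, roughly the same result can be obtained with the built-in array, struct and explode functions (a minimal sketch, assuming Spark 2.x and starting again from the original df with key1 and key2; $ comes from spark.implicits._, already in scope in the shell):

// Sketch without a udf: build an array of (key, label) structs and explode it. 
import org.apache.spark.sql.functions.{array, struct, lit, explode} 

val result = df 
  .withColumn("TMP", explode(array( 
    struct($"key1".cast("string").as("key"), lit("Key1").as("newkeyname")), 
    struct($"key2".cast("string").as("key"), lit("Key2").as("newkeyname"))))) 
  .select($"name", $"TMP.key", $"TMP.newkeyname") 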

Profit! This is a little different from my original question, but it gets the job done. Thanks a lot :) –