2017-03-01

Structured streaming writing parquet files to Hadoop

I am able to write the results of structured streaming to parquet files. The problem is that those files end up in the local file system, and now I would like to write them to the Hadoop file system. Is there a way to do that?

StreamingQuery query = result //.orderBy("window")
    .repartition(1)
    .writeStream()
    .outputMode(OutputMode.Append())
    .format("parquet")
    .option("checkpointLocation", "hdfs://localhost:19000/data/checkpoints")
    .start("hdfs://localhost:19000/data/total");

I use this code, but it fails with:

Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://localhost:19000/data/checkpoints/metadata, expected: file:/// 
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:649) 
at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:82) 
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:606) 
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824) 
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601) 
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:421) 
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426) 
at org.apache.spark.sql.execution.streaming.StreamMetadata$.read(StreamMetadata.scala:51) 
at org.apache.spark.sql.execution.streaming.StreamExecution.<init>(StreamExecution.scala:100) 
at org.apache.spark.sql.streaming.StreamingQueryManager.createQuery(StreamingQueryManager.scala:232) 
at org.apache.spark.sql.streaming.StreamingQueryManager.startQuery(StreamingQueryManager.scala:269) 
at org.apache.spark.sql.streaming.DataStreamWriter.start(DataStreamWriter.scala:262) 
at org.apache.spark.sql.streaming.DataStreamWriter.start(DataStreamWriter.scala:206) 

Thanks

Answer

1

This is a known issue: https://issues.apache.org/jira/browse/SPARK-19407

It should be fixed in the next release. As a workaround, you can set the default file system to HDFS with `--conf spark.hadoop.fs.defaultFS=hdfs://localhost:19000`.
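The flag above is passed at submission time; a minimal sketch, assuming a hypothetical application class `com.example.StreamingApp` and jar name `streaming-app.jar` (both are placeholders, not from the original post):

```shell
# Set the default FS for the whole application so that scheme-less paths
# (such as the checkpoint metadata path) resolve against HDFS instead of
# the local file system.
spark-submit \
  --conf spark.hadoop.fs.defaultFS=hdfs://localhost:19000 \
  --class com.example.StreamingApp \
  streaming-app.jar
```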


It works, and I used: `SparkSession.builder().appName("spark data processing").master("local[2]").config("spark.hadoop.fs.defaultFS", "hdfs://localhost:19000").getOrCreate();` – taniGroup


Yes, that's the same thing. – zsxwing