Connect with RStudio/sparklyr to a local Spark provided by IntelliJ

Good morning,

this may sound like a stupid question, but I would like to access temporary tables in Spark via RStudio. I don't have a Spark cluster; I run everything locally on my PC. When I start Spark through IntelliJ, the instance comes up fine:
17/11/11 10:11:33 INFO Utils: Successfully started service 'sparkDriver' on port 59505.
17/11/11 10:11:33 INFO SparkEnv: Registering MapOutputTracker
17/11/11 10:11:33 INFO SparkEnv: Registering BlockManagerMaster
17/11/11 10:11:33 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/11/11 10:11:33 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/11/11 10:11:33 INFO DiskBlockManager: Created local directory at C:\Users\stephan\AppData\Local\Temp\blockmgr-7ca4e8fb-9456-4063-bc6d-39324d7dad4c
17/11/11 10:11:33 INFO MemoryStore: MemoryStore started with capacity 898.5 MB
17/11/11 10:11:33 INFO SparkEnv: Registering OutputCommitCoordinator
17/11/11 10:11:33 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/11/11 10:11:34 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://172.25.240.1:4040
17/11/11 10:11:34 INFO Executor: Starting executor ID driver on host localhost
17/11/11 10:11:34 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 59516.
17/11/11 10:11:34 INFO NettyBlockTransferService: Server created on 172.25.240.1:59516
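What I am ultimately after in RStudio looks roughly like the sketch below; the table name my_temp_table is made up, and sc is exactly the connection I cannot establish yet:

library(sparklyr)
library(DBI)

# sc would be a sparklyr connection to the instance above; obtaining it is
# the open question below. The temp table name is hypothetical:
result <- dbGetQuery(sc, "SELECT * FROM my_temp_table LIMIT 10")
head(result)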
But I don't know which port I should choose in RStudio/sparklyr:
sc <- spark_connect(master = "spark://localhost:7077", spark_home = "C://Users//stephan//Downloads//spark//spark-2.2.0-bin-hadoop2.7", version = "2.2.0")
Error in file(con, "r") : cannot open the connection
In addition: Warning message:
In file(con, "r") :
cannot open file 'C:\Users\stephan\AppData\Local\Temp\Rtmp61Ejow\file2fa024ce51af_spark.log': Permission denied
I tried different ports, like 59516, 4040, ... but they all led to the same result. The permission-denied message can probably be ignored, I guess, since the file actually gets written fine:
17/11/11 01:07:30 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master localhost:7077
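For contrast, a plain local-mode connection does work, but it spawns a second Spark instance inside RStudio instead of attaching to the one started by IntelliJ (a sketch using my paths):

library(sparklyr)

# This works, but it starts a *new* local Spark managed by RStudio;
# the IntelliJ instance and its temporary tables stay out of reach:
sc_local <- spark_connect(master = "local",
                          spark_home = "C://Users//stephan//Downloads//spark//spark-2.2.0-bin-hadoop2.7",
                          version = "2.2.0")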
Can someone please help me: how can I establish a connection between a locally running Spark and RStudio, without RStudio launching another Spark instance?
Thanks, Stephan
I got it. That's too bad :( I followed this thread to get it running locally: https://stackoverflow.com/questions/36593446/failed-to-start-master-for-spark-in-windows and changed my config to "spark://172.25.240.1:7077". When I run my application now, the connection is established, but the StreamingContext no longer works; it is run via streamingContext.start() and streamingContext.awaitTermination(). When I run it in IntelliJ, it does capture the stream. Btw, the stream is actually just a local socket stream that I generate myself. – Stephan
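Pieced together, the change amounts to roughly this sketch; the spark-class commands are what the linked thread describes, and the IP is the driver host from the logs above, so it may differ on another machine:

# In a terminal first (per the linked thread), start a standalone master
# and a worker from %SPARK_HOME%\bin:
#   spark-class org.apache.spark.deploy.master.Master
#   spark-class org.apache.spark.deploy.worker.Worker spark://172.25.240.1:7077

library(sparklyr)

# Then point sparklyr at the standalone master instead of localhost:7077:
sc <- spark_connect(master = "spark://172.25.240.1:7077",
                    spark_home = "C://Users//stephan//Downloads//spark//spark-2.2.0-bin-hadoop2.7",
                    version = "2.2.0")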