2017-06-22

Answers


A metrics endpoint is available on the driver node, on the same port as the Spark UI: http://<host>:<sparkUI-port>/metrics/json/

The streaming-related metrics have .StreamingMetrics in their names (taken here from a local test job).

Sample:

local-1498040220092.driver.printWriter.snb.StreamingMetrics.streaming.lastCompletedBatch_processingDelay: { 
value: 30 
}, 
local-1498040220092.driver.printWriter.snb.StreamingMetrics.streaming.lastCompletedBatch_processingEndTime: { 
value: 1498124090031 
}, 
local-1498040220092.driver.printWriter.snb.StreamingMetrics.streaming.lastCompletedBatch_processingStartTime: { 
value: 1498124090001 
}, 
local-1498040220092.driver.printWriter.snb.StreamingMetrics.streaming.lastCompletedBatch_schedulingDelay: { 
value: 1 
}, 
local-1498040220092.driver.printWriter.snb.StreamingMetrics.streaming.lastCompletedBatch_submissionTime: { 
value: 1498124090000 
}, 
local-1498040220092.driver.printWriter.snb.StreamingMetrics.streaming.lastCompletedBatch_totalDelay: { 
value: 31 
}, 
local-1498040220092.driver.printWriter.snb.StreamingMetrics.streaming.lastReceivedBatch_processingEndTime: { 
value: 1498124090031 
}, 
local-1498040220092.driver.printWriter.snb.StreamingMetrics.streaming.lastReceivedBatch_processingStartTime: { 
value: 1498124090001 
} 
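
As a rough illustration, a small Python sketch along these lines can pull that endpoint and keep only the streaming gauges. The host, the default UI port 4040, and the use of the requests library are assumptions; the JSON layout (a top-level "gauges" object whose entries carry a "value" field) follows the sample above.

import requests

# Metrics endpoint exposed by the driver, on the same port as the
# Spark UI (4040 by default; adjust host/port for your deployment).
METRICS_URL = "http://localhost:4040/metrics/json/"

metrics = requests.get(METRICS_URL).json()

# Gauges are keyed by their full metric name; keep only the streaming ones.
streaming_gauges = {
    name: gauge["value"]
    for name, gauge in metrics.get("gauges", {}).items()
    if ".StreamingMetrics." in name
}

for name, value in sorted(streaming_gauges.items()):
    print(name, value)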

To get the processing time, we need to diff StreamingMetrics.streaming.lastCompletedBatch_processingEndTime - StreamingMetrics.streaming.lastCompletedBatch_processingStartTime.
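
A minimal sketch of that subtraction, reusing the streaming_gauges dictionary from the snippet above (matching on the metric suffix is an assumption, since the <app-id>.driver.<app-name> prefix changes from run to run):

def gauge_by_suffix(gauges, suffix):
    # Return the value of the first gauge whose name ends with `suffix`.
    for name, value in gauges.items():
        if name.endswith(suffix):
            return value
    raise KeyError(suffix)

end = gauge_by_suffix(streaming_gauges, "lastCompletedBatch_processingEndTime")
start = gauge_by_suffix(streaming_gauges, "lastCompletedBatch_processingStartTime")

# Processing time of the last completed batch, in milliseconds
# (1498124090031 - 1498124090001 = 30 ms in the sample above).
print("processing time:", end - start, "ms")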

Since Spark 2.2.0 was released in July, one month after your post, I guess your link points to Spark 2.1.0. Apparently the REST API has since been extended to cover Spark Streaming; see the Spark 2.2.0 documentation.

So if you still have the option to upgrade your Spark version, I would suggest doing so. You can then retrieve data for all batches:

/applications/[app-id]/streaming/batches
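
For example, a sketch that queries this endpoint for a running application could look like the following; the localhost:4040 base URL and field names such as processingTime are assumptions taken from the Spark 2.2.0 monitoring docs, and [app-id] stays a placeholder:

import requests

# For a running application the REST API is served by the driver's UI
# (port 4040 by default). List /api/v1/applications to find the app id.
BASE_URL = "http://localhost:4040/api/v1"
APP_ID = "[app-id]"  # placeholder, as in the path above

batches = requests.get(BASE_URL + "/applications/" + APP_ID + "/streaming/batches").json()

for batch in batches:
    # processingTime and schedulingDelay are reported in milliseconds.
    print(batch.get("batchId"), batch.get("status"), batch.get("processingTime"))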