I am using Spark 2.0.0. Is there a way to pass parameters from the Spark driver to the executors? I tried the following.
import java.util.Iterator;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.MapPartitionsFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkDriver {
    public static void main(String argv[]) {
        SparkConf conf = new SparkConf().setAppName("test").setMaster("yarn");
        SparkSession sparkSession = SparkSession.builder().config(conf).getOrCreate();
        Dataset<Row> input = sparkSession.read().load("inputfilepath");
        Dataset<Row> modifiedinput =
                input.mapPartitions(new customMapPartition(5), Encoders.bean(Row.class));
    }
}

class customMapPartition implements MapPartitionsFunction<Row, Row> {
    private static final long serialVersionUID = -6513655566985939627L;
    private static Integer variableThatHastobePassed = null;

    public customMapPartition(Integer passedInteger) {
        customMapPartition.variableThatHastobePassed = passedInteger;
    }

    @Override
    public Iterator<Row> call(Iterator<Row> input) throws Exception {
        System.out.println("number that is passed " + variableThatHastobePassed);
        return input;
    }
}
As shown above, I wrote a custom MapPartitionsFunction to pass the parameter, and I access the static variable inside the function's call method. This works when I run locally with setMaster("local"), but not when I run on the cluster with setMaster("yarn"): the System.out.println statement prints null. Is there a way to pass parameters from the driver to the executors?
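The null on YARN is most likely because Java serialization never carries static fields: Spark serializes the function object on the driver and deserializes it in a separate executor JVM, where the class's static state was never initialized (in local mode, driver and "executor" share one JVM, so the static assignment is visible). A minimal, Spark-free sketch of that behavior, using plain Java serialization; the class and field names here are illustrative, not Spark API:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Illustrative class: one static field, one instance field.
class Func implements Serializable {
    private static final long serialVersionUID = 1L;
    static Integer staticValue;      // static: NOT written by serialization
    final Integer instanceValue;     // instance: travels with the object
    Func(Integer v) { staticValue = v; instanceValue = v; }
}

public class StaticVsInstanceDemo {
    public static void main(String[] args) throws Exception {
        Func onDriver = new Func(5);

        // Serialize, as Spark does when shipping a function to executors.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        new ObjectOutputStream(bytes).writeObject(onDriver);

        // Simulate the executor's fresh JVM, where static state was never set.
        Func.staticValue = null;

        Func onExecutor = (Func) new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray())).readObject();

        System.out.println("static on executor:   " + Func.staticValue);         // null
        System.out.println("instance on executor: " + onExecutor.instanceValue); // 5
    }
}
```

Following this reasoning, storing the value in a non-static instance field (e.g. `private final Integer variableThatHastobePassed;` assigned in the constructor) should make it part of the serialized customMapPartition object, so each executor would see the value set on the driver.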