Posted to dev@spark.apache.org by rfarrjr <rf...@gmail.com> on 2015/08/13 17:16:54 UTC

possible bug: user SparkConf properties not copied to worker process

Ran into an issue setting a property on the SparkConf that wasn't made
available on the worker. After some digging[1] I noticed that only
properties that start with "spark." are sent by the scheduler. I'm not
sure if this was intended behavior or not.

Using Spark Streaming 1.4.1 running on Java 8.

~Robert

[1]
https://github.com/apache/spark/blob/v1.4.1/core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedSchedulerBackend.scala#L243
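The filtering behavior at [1] can be mirrored in a small self-contained sketch (the conf keys below are made up for illustration; a plain Map stands in for SparkConf):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PrefixFilterSketch {
    // Mimics the scheduler's behavior in CoarseGrainedSchedulerBackend
    // (v1.4.1): only entries whose key starts with "spark." are kept and
    // shipped to executor processes; everything else is silently dropped.
    public static Map<String, String> sentToExecutors(Map<String, String> conf) {
        Map<String, String> out = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : conf.entrySet()) {
            if (e.getKey().startsWith("spark.")) {
                out.put(e.getKey(), e.getValue());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new LinkedHashMap<>();
        conf.put("spark.app.name", "demo");
        conf.put("spark.executor.memory", "2g");
        conf.put("myapp.registry.url", "http://registry:8081"); // dropped
        System.out.println(sentToExecutors(conf).keySet());
    }
}
```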




--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/possible-bug-user-SparkConf-properties-not-copied-to-worker-process-tp13665.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: possible bug: user SparkConf properties not copied to worker process

Posted by rfarrjr <rf...@gmail.com>.
That works.





Re: possible bug: user SparkConf properties not copied to worker process

Posted by Reynold Xin <rx...@databricks.com>.
Is this through Java system properties? For Java system properties, you can
pass them using spark.executor.extraJavaOptions.
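A sketch of that suggestion, assuming a hypothetical property name my.schema.registry.url. On the driver you would set (requires Spark on the classpath) `conf.set("spark.executor.extraJavaOptions", "-Dmy.schema.registry.url=http://registry:8081")`; each executor JVM then starts with that -D flag, so executor-side code reads it back with System.getProperty. The snippet below just mimics what the JVM's -D handling does at startup:

```java
public class ExtraJavaOptionsSketch {
    // Splits a "-Dkey=value" flag into {key, value}, as the JVM does when
    // it turns command-line -D flags into system properties.
    public static String[] parseDFlag(String flag) {
        return flag.substring("-D".length()).split("=", 2);
    }

    public static void main(String[] args) {
        String[] kv = parseDFlag("-Dmy.schema.registry.url=http://registry:8081");
        // On a real executor the JVM sets this for you before user code runs.
        System.setProperty(kv[0], kv[1]);
        System.out.println(System.getProperty("my.schema.registry.url"));
    }
}
```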




On Thu, Aug 13, 2015 at 2:11 PM, rfarrjr <rf...@gmail.com> wrote:

> Thanks for the response.
>
> In this particular case we passed a URL that is used when configuring
> serialization support for Kryo. We are using a schema registry and
> leveraging it to efficiently serialize Avro objects without the need to
> register specific records or schemas up front.
>
> Adding "spark." to the property name works; I just didn't want to conflict
> with the core Spark properties namespace.

Re: possible bug: user SparkConf properties not copied to worker process

Posted by rfarrjr <rf...@gmail.com>.
Thanks for the response.

In this particular case we passed a URL that is used when configuring
serialization support for Kryo. We are using a schema registry and
leveraging it to efficiently serialize Avro objects without the need to
register specific records or schemas up front.

Adding "spark." to the property name works; I just didn't want to conflict
with the core Spark properties namespace.
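The prefix workaround can be sketched as follows. The key spark.myapp.schemaRegistryUrl is hypothetical, and a plain Map again stands in for SparkConf:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PrefixWorkaroundSketch {
    // Entries the scheduler would forward to executor processes:
    // only keys starting with "spark." survive the filter.
    public static Map<String, String> forwarded(Map<String, String> driverConf) {
        Map<String, String> out = new LinkedHashMap<>();
        driverConf.forEach((k, v) -> {
            if (k.startsWith("spark.")) out.put(k, v);
        });
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> driverConf = new LinkedHashMap<>();
        // Prefixed with "spark." -> forwarded to executors:
        driverConf.put("spark.myapp.schemaRegistryUrl", "http://registry:8081");
        // Unprefixed -> dropped before reaching executors:
        driverConf.put("myapp.schemaRegistryUrl", "http://registry:8081");
        System.out.println(forwarded(driverConf).keySet());
    }
}
```

Choosing an app-specific segment after the "spark." prefix (as above) keeps the key out of the way of Spark's own configuration keys while still passing the filter.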





Re: possible bug: user SparkConf properties not copied to worker process

Posted by Reynold Xin <rx...@databricks.com>.
That was intentional - what's your use case that requires configs not
starting with "spark."?


On Thu, Aug 13, 2015 at 8:16 AM, rfarrjr <rf...@gmail.com> wrote:

> Ran into an issue setting a property on the SparkConf that wasn't made
> available on the worker. After some digging[1] I noticed that only
> properties that start with "spark." are sent by the scheduler. I'm not
> sure if this was intended behavior or not.
>
> Using Spark Streaming 1.4.1 running on Java 8.
>
> ~Robert
>
> [1]
>
> https://github.com/apache/spark/blob/v1.4.1/core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedSchedulerBackend.scala#L243