Posted to user@spark.apache.org by Koert Kuipers <ko...@tresata.com> on 2018/03/30 18:41:41 UTC

all spark settings end up being system properties

Does anyone know why all Spark settings end up being system properties, and
where this is done?

For example, when I pass "--conf spark.foo=bar" to spark-submit,
System.getProperty("spark.foo") is equal to "bar".
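
A minimal repro sketch (assuming the job is launched with
"spark-submit --conf spark.foo=bar ..."; spark.foo is a made-up key):

object SysPropDemo {
  def main(args: Array[String]): Unit = {
    // spark.foo was only passed via --conf to spark-submit, yet it
    // shows up as a plain JVM system property in the driver
    println(System.getProperty("spark.foo")) // prints "bar"
  }
}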

I grepped the Spark codebase for System.setProperty or System.setProperties
and saw them used in a few places, but never for all Spark settings.

We are running into some weird side effects because of this: we use
Typesafe Config, which applies system properties as overrides, so these
settings unexpectedly pop up in our application config.
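
Here is roughly how the leak shows up for us (a sketch; the spark.foo
key is made up, and the job is launched with --conf spark.foo=bar as above):

import com.typesafe.config.ConfigFactory

object ConfigLeak {
  def main(args: Array[String]): Unit = {
    // ConfigFactory.load() layers system properties over
    // application.conf as overrides, so a setting that spark-submit
    // copied into sys.props pops up in the loaded config
    val config = ConfigFactory.load()
    println(config.getString("spark.foo")) // "bar", even though it is
                                           // not in application.conf
  }
}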

Re: all spark settings end up being system properties

Posted by Koert Kuipers <ko...@tresata.com>.
Thanks, I will check out the SparkSubmit class.

On Fri, Mar 30, 2018 at 2:46 PM, Marcelo Vanzin <va...@cloudera.com> wrote:

> Why: it's part historical, part "how else would you do it".
>
> SparkConf needs to read properties set on the command line, but
> SparkConf is something that user code instantiates, so we can't easily
> make it read data from arbitrary locations. You could use thread
> locals and other tricks, but user code can always break those.
>
> Where: this is done by the SparkSubmit class (search for "sys.props",
> the Scala wrapper around the JVM system properties).
>
>
> On Fri, Mar 30, 2018 at 11:41 AM, Koert Kuipers <ko...@tresata.com> wrote:
> > Does anyone know why all Spark settings end up being system
> > properties, and where this is done?
> >
> > For example, when I pass "--conf spark.foo=bar" to spark-submit,
> > System.getProperty("spark.foo") is equal to "bar".
> >
> > I grepped the Spark codebase for System.setProperty or
> > System.setProperties and saw them used in a few places, but never
> > for all Spark settings.
> >
> > We are running into some weird side effects because of this: we use
> > Typesafe Config, which applies system properties as overrides, so
> > these settings unexpectedly pop up in our application config.
>
>
>
> --
> Marcelo
>

Re: all spark settings end up being system properties

Posted by Marcelo Vanzin <va...@cloudera.com>.
Why: it's part historical, part "how else would you do it".

SparkConf needs to read properties set on the command line, but
SparkConf is something that user code instantiates, so we can't easily
make it read data from arbitrary locations. You could use thread
locals and other tricks, but user code can always break those.
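
System properties are the rendezvous point: user code can create a
SparkConf anywhere and still see what spark-submit set. Roughly (a
sketch, not the actual source):

object ConfDemo {
  def main(args: Array[String]): Unit = {
    // new SparkConf() loads defaults: it copies every JVM system
    // property whose name starts with "spark." into the conf
    val conf = new org.apache.spark.SparkConf()
    println(conf.get("spark.foo")) // "bar" under --conf spark.foo=bar
  }
}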

Where: this is done by the SparkSubmit class (search for "sys.props",
the Scala wrapper around the JVM system properties).
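
In spirit it boils down to this (a simplified sketch, not the real
SparkSubmit code; sparkConfArgs stands in for whatever was parsed from
--conf and the properties file):

object SubmitSketch {
  def main(args: Array[String]): Unit = {
    // hypothetical result of parsing --conf and the properties file
    val sparkConfArgs = Map("spark.foo" -> "bar")
    // copy every parsed Spark setting into the JVM system properties
    // before the user's main class runs
    for ((key, value) <- sparkConfArgs) {
      sys.props(key) = value // same as System.setProperty(key, value)
    }
    println(System.getProperty("spark.foo")) // "bar"
  }
}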


On Fri, Mar 30, 2018 at 11:41 AM, Koert Kuipers <ko...@tresata.com> wrote:
> Does anyone know why all Spark settings end up being system properties,
> and where this is done?
>
> For example, when I pass "--conf spark.foo=bar" to spark-submit,
> System.getProperty("spark.foo") is equal to "bar".
>
> I grepped the Spark codebase for System.setProperty or
> System.setProperties and saw them used in a few places, but never for
> all Spark settings.
>
> We are running into some weird side effects because of this: we use
> Typesafe Config, which applies system properties as overrides, so these
> settings unexpectedly pop up in our application config.



-- 
Marcelo
