Posted to user@spark.apache.org by Chetan Khatri <ch...@gmail.com> on 2019/10/17 13:59:12 UTC
Spark - configuration setting doesn't work
Hi Users,
I am setting the Spark configuration in the following way:
val spark = SparkSession.builder().appName(APP_NAME).getOrCreate()
spark.conf.set("spark.speculation", "false")
spark.conf.set("spark.broadcast.compress", "true")
spark.conf.set("spark.sql.broadcastTimeout", "36000")
spark.conf.set("spark.network.timeout", "2500s")
spark.conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
spark.conf.set("spark.driver.memory", "10g")
spark.conf.set("spark.executor.memory", "10g")
import spark.implicits._
and submitting the Spark job with spark-submit, but none of the above
configurations is reflected in the job; I have checked in the Spark UI.
I know that setting these while creating the SparkSession works well:
val spark = SparkSession.builder().appName(APP_NAME)
.config("spark.network.timeout", "1500s")
.config("spark.broadcast.compress", "true")
.config("spark.sql.broadcastTimeout", "36000")
.getOrCreate()
import spark.implicits._
Can someone please shed some light on this?
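[Editor's note: several of the settings above, such as `spark.driver.memory`, `spark.executor.memory`, and `spark.serializer`, are static configurations that are read when the JVMs are launched, so changing them through `spark.conf.set` after `getOrCreate()` has no effect. A sketch of supplying them on the spark-submit command line instead, where the class name and jar path are placeholders:]

```shell
# Static settings (memory, serializer) must be supplied before the
# driver JVM starts -- on the command line or in spark-defaults.conf.
spark-submit \
  --class com.example.MyApp \
  --conf spark.driver.memory=10g \
  --conf spark.executor.memory=10g \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.network.timeout=2500s \
  --conf spark.sql.broadcastTimeout=36000 \
  --conf spark.broadcast.compress=true \
  --conf spark.speculation=false \
  my-app.jar
```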
Re: Spark - configuration setting doesn't work
Posted by Chetan Khatri <ch...@gmail.com>.
Ok, thanks. I wanted to confirm that.
On Sun, Oct 27, 2019 at 12:55 PM hemant singh <he...@gmail.com> wrote:
> You should add the configurations while creating the session; I don't
> think you can override them once the session is created. A few can be, though.
>
> Thanks,
> Hemant
>
Re: Spark - configuration setting doesn't work
Posted by hemant singh <he...@gmail.com>.
You should add the configurations while creating the session; I don't think
you can override them once the session is created. A few can be, though.
Thanks,
Hemant
On Sun, 27 Oct 2019 at 11:02 AM, Chetan Khatri <ch...@gmail.com>
wrote:
> Could someone please help me.
>
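[Editor's note: equivalently, all of the settings can go on the builder before `getOrCreate()`; only runtime SQL options such as `spark.sql.broadcastTimeout` can still be changed afterwards via `spark.conf.set`. A sketch, with APP_NAME as in the original post:]

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName(APP_NAME)
  // Static configs: must exist before the session (and the JVMs) start.
  // Note: spark.driver.memory still has no effect in client mode here,
  // because the driver JVM is already running; pass it to spark-submit.
  .config("spark.driver.memory", "10g")
  .config("spark.executor.memory", "10g")
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .config("spark.speculation", "false")
  .config("spark.broadcast.compress", "true")
  .config("spark.network.timeout", "2500s")
  // Runtime SQL config: this one could also be changed later
  // with spark.conf.set("spark.sql.broadcastTimeout", ...).
  .config("spark.sql.broadcastTimeout", "36000")
  .getOrCreate()
```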
Re: Spark - configuration setting doesn't work
Posted by Chetan Khatri <ch...@gmail.com>.
Could someone please help me.