Posted to user@spark.apache.org by Chirag Aggarwal <Ch...@guavus.com> on 2014/12/29 13:10:02 UTC
Spark Configurations
Hi,
It seems that spark-defaults.conf is not read by spark-sql. Is it used only by spark-shell?
Thanks,
Chirag
Re: Spark Configurations
Posted by Chirag Aggarwal <Ch...@guavus.com>.
Is there a way to get these set by default in the spark-sql shell?
Thanks,
Chirag
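[Editor's note: since spark-sql is launched through the spark-submit machinery, settings can also be passed per invocation on the command line. A hedged sketch; the setting values below are placeholders, not recommendations:]

```shell
# Pass individual settings with --conf (values here are illustrative):
spark-sql --conf spark.executor.memory=2g \
          --conf spark.sql.shuffle.partitions=50

# Or point at an explicit properties file in the
# same key/value format as spark-defaults.conf:
spark-sql --properties-file /path/to/my-spark.conf
```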
From: Akhil Das <ak...@sigmoidanalytics.com>
Date: Monday, 29 December 2014 5:53 PM
To: Chirag Aggarwal <ch...@guavus.com>
Cc: "user@spark.apache.org" <us...@spark.apache.org>
Subject: Re: Spark Configurations
I believe if you use spark-shell or spark-submit, it will pick up the configuration from spark-defaults.conf. If you are running an independent application, you can set all those configurations while creating the SparkContext.
Thanks
Best Regards
On Mon, Dec 29, 2014 at 5:40 PM, Chirag Aggarwal <Ch...@guavus.com> wrote:
Hi,
It seems that spark-defaults.conf is not read by spark-sql. Is it used only by spark-shell?
Thanks,
Chirag
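[Editor's note: Akhil's suggestion for an independent application can be sketched as below. A minimal example; the application name, master URL, and setting values are placeholders, not values from this thread:]

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    // Settings given programmatically here take precedence over
    // spark-defaults.conf for this application.
    val conf = new SparkConf()
      .setAppName("MyApp")                 // placeholder name
      .setMaster("local[*]")               // placeholder master URL
      .set("spark.executor.memory", "2g")  // illustrative setting
    val sc = new SparkContext(conf)
    try {
      // ... application logic ...
    } finally {
      sc.stop()
    }
  }
}
```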