Posted to dev@spark.apache.org by Sam Elamin <hu...@gmail.com> on 2017/02/10 21:35:38 UTC

[Newbie] spark conf

Hi All,


Really newbie question here, folks. I have properties like my AWS access and
secret keys in Hadoop's core-site.xml, among other properties, but that's the
only reason I have Hadoop installed, which seems like a bit of overkill.

Is there an equivalent of core-site.xml for Spark so I don't have to
reference HADOOP_CONF_DIR in my spark-env.sh?

I know I can export env variables for the AWS credentials, but what about the
other properties my application might want to use?
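
To be concrete, this is the kind of thing I mean by exporting env variables
(placeholder values; as far as I know the s3a connector's default credential
chain picks these up):

    # placeholder credentials -- the standard AWS SDK environment variables
    export AWS_ACCESS_KEY_ID=AKIAEXAMPLE
    export AWS_SECRET_ACCESS_KEY=exampleSecretKey
    ./bin/spark-shell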

Regards
Sam

Re: [Newbie] spark conf

Posted by Sam Elamin <hu...@gmail.com>.
Yeah, I thought of that, but the file made it seem like it's for
environment-specific rather than application-specific configuration.

I'm more interested in the best practices: would you recommend using the
default conf file for this and uploading it to wherever the application will
be running (remote clusters, etc.)?
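
To illustrate what I mean by application specific (a sketch only, with made-up
app/class names; I believe spark.hadoop.* properties get copied into the
underlying Hadoop Configuration), I could instead pass everything per
application at submit time:

    # placeholder values; --conf entries behave like spark-defaults.conf lines
    spark-submit \
      --conf spark.hadoop.fs.s3a.access.key=AKIAEXAMPLE \
      --conf spark.hadoop.fs.s3a.secret.key=exampleSecretKey \
      --class com.example.MyApp \
      my-app.jar

but I'd rather not repeat that for every job if there's a cleaner convention.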


Regards
Sam

On Fri, Feb 10, 2017 at 9:36 PM, Reynold Xin <rx...@databricks.com> wrote:

> You can put them in Spark's own conf/spark-defaults.conf file.

Re: [Newbie] spark conf

Posted by Reynold Xin <rx...@databricks.com>.
You can put them in Spark's own conf/spark-defaults.conf file.
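
For example, something along these lines (sketch with placeholder values;
spark-defaults.conf takes whitespace-separated key/value pairs, and properties
prefixed with spark.hadoop. get copied into the underlying Hadoop
Configuration, so the s3a keys would look roughly like this):

    # conf/spark-defaults.conf -- placeholder credentials
    spark.hadoop.fs.s3a.access.key    AKIAEXAMPLE
    spark.hadoop.fs.s3a.secret.key    exampleSecretKey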


Re: [Newbie] spark conf

Posted by Sam Elamin <hu...@gmail.com>.
Yup, that worked.

Thanks for the clarification!

On Fri, Feb 10, 2017 at 9:42 PM, Marcelo Vanzin <va...@cloudera.com> wrote:

> If you place core-site.xml in $SPARK_HOME/conf, I'm pretty sure Spark
> will pick it up. (Sounds like you're not running YARN, which would
> require HADOOP_CONF_DIR.)
>
> Also this is more of a user@ question.

Re: [Newbie] spark conf

Posted by Marcelo Vanzin <va...@cloudera.com>.
If you place core-site.xml in $SPARK_HOME/conf, I'm pretty sure Spark
will pick it up. (Sounds like you're not running YARN, which would
require HADOOP_CONF_DIR.)
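
For example, a minimal core-site.xml dropped into $SPARK_HOME/conf, along
these lines (placeholder values; just carry over whatever fs.s3a.* properties
you already have on the Hadoop side):

    <?xml version="1.0"?>
    <configuration>
      <!-- placeholder credentials; same properties as the existing Hadoop core-site.xml -->
      <property>
        <name>fs.s3a.access.key</name>
        <value>AKIAEXAMPLE</value>
      </property>
      <property>
        <name>fs.s3a.secret.key</name>
        <value>exampleSecretKey</value>
      </property>
    </configuration>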

Also this is more of a user@ question.

-- 
Marcelo
