Posted to user@livy.apache.org by kant kodali <ka...@gmail.com> on 2018/01/31 09:16:18 UTC

How do I set spark conf parameters?

Hi All,


How do I set Spark conf parameters? The settings in the code below don't seem
to get picked up. If this approach can't work, how can I change my program so
that they are picked up? I don't see a way to do it once the SparkContext has
already been created.

public String call(JobContext ctx) throws Exception {
    ctx.sc().setLogLevel("INFO");
    ctx.sc().getConf()
        .set("spark.cassandra.connection.host", config.getString("cassandra.host"))
        .set("spark.cassandra.auth.username", config.getString("cassandra.user"))
        .set("spark.cassandra.auth.password", config.getString("cassandra.pass"))
        .set("es.index.auto.create", "true");
}


Thanks!

Re: How do I set spark conf parameters?

Posted by Meisam Fathi <me...@gmail.com>.
On a more general note, with or without Livy, Spark configurations should be
set before the Spark context (sc) is created. Setting configs after sc is
created has no effect.

Thanks,
Meisam
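[Editor's note: with the Livy client API, setting configs before the context
exists means handing them to the LivyClientBuilder before build(); Livy passes
"spark."-prefixed keys through to the Spark configuration of the context it
launches. A minimal sketch with placeholder values; the builder call itself is
left as a comment because it needs a reachable Livy server.]

```java
import java.util.Properties;

public class ClientConf {
    // Collect the Spark settings up front, before any context exists.
    // Note the "spark." prefix on every key: a bare connector option such as
    // "es.index.auto.create" must be passed as "spark.es.index.auto.create".
    static Properties buildConf() {
        Properties conf = new Properties();
        conf.setProperty("spark.cassandra.connection.host", "127.0.0.1");
        conf.setProperty("spark.es.index.auto.create", "true");
        return conf;
    }

    public static void main(String[] args) {
        Properties conf = buildConf();
        // With a running Livy server, the settings would then go to the
        // builder before build(), e.g.:
        //   LivyClient client = new LivyClientBuilder()
        //       .setAll(conf)
        //       .setURI(new URI("http://livy-host:8998"))
        //       .build();
        System.out.println(conf.getProperty("spark.cassandra.connection.host"));
    }
}
```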


Re: How do I set spark conf parameters?

Posted by Stefan Miklosovic <mi...@gmail.com>.
livyClient = new LivyClientBuilder()
    .setAll(withSparkProperties())
    .setURI(new URI(LIVY_URI))
    .build();
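[Editor's note: withSparkProperties() is not shown in the post; presumably it
just gathers the "spark."-prefixed settings into a java.util.Properties. A
hypothetical sketch, with property names taken from the original question and
placeholder values:]

```java
import java.util.Properties;

public class SparkProps {
    // Hypothetical helper: collects every Spark setting into one Properties
    // object so the builder takes them in a single setAll(...) call instead
    // of a chain of setConf(...) calls.
    static Properties withSparkProperties() {
        Properties p = new Properties();
        p.setProperty("spark.cassandra.connection.host", "cassandra-host");
        p.setProperty("spark.cassandra.auth.username", "cassandra-user");
        p.setProperty("spark.cassandra.auth.password", "cassandra-pass");
        p.setProperty("spark.es.index.auto.create", "true");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(withSparkProperties().size());
    }
}
```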



-- 
Stefan Miklosovic

Re: How do I set spark conf parameters?

Posted by Stefan Miklosovic <mi...@gmail.com>.
Yes, I am using setAll(Properties).


-- 
Stefan Miklosovic

Re: How do I set spark conf parameters?

Posted by kant kodali <ka...@gmail.com>.
Sorry, I found an answer online. It should be something like this:

new LivyClientBuilder()
    .setConf("spark.es.index.auto.create", "true")
    .setConf("spark.cassandra.connection.host", "127.0.0.1")
    .build();

