Posted to users@zeppelin.apache.org by Anandha L Ranganathan <an...@gmail.com> on 2017/11/22 01:13:42 UTC

Livy interpreter - external libraries and changing queue name at runtime

We are using the Livy interpreter from Zeppelin to connect to Spark.

We want to give users an option to download external libraries. By default we have added some basic libraries in the interpreter setting.

In the Spark interpreter, users can download the external libraries they want with this command:

%spark.dep
z.reset()
z.addRepo("Spark Packages Repo").url("http://dl.bintray.com/spark-packages/maven")
z.load("com.databricks:spark-csv_2.11:1.2.0")


How can we import external libraries using Livy?

Another question: is there a way to change the YARN queue name at runtime? Some users want to use a different queue rather than the default queue assigned in the interpreter. If that feature is not available, what is the best approach to implement this?

Thanks
Anand

Re: Livy interpreter - external libraries and changing queue name at runtime

Posted by Anandha L Ranganathan <an...@gmail.com>.
Thanks Jeff.

We will add dependencies through livy.spark.jars.packages.
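For reference, the interpreter-setting entry might look something like the following (the package coordinate below is just the spark-csv example from earlier in the thread; multiple coordinates can be listed comma-separated):

```properties
# Zeppelin Livy interpreter setting (Interpreter page -> livy -> edit)
livy.spark.jars.packages    com.databricks:spark-csv_2.11:1.2.0
```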


Thanks
Anand


Re: Livy interpreter - external libraries and changing queue name at runtime

Posted by Jeff Zhang <zj...@gmail.com>.
Livy doesn't support adding dependencies in a note like %spark.dep; you
have to do it in the interpreter setting.



Re: Livy interpreter - external libraries and changing queue name at runtime

Posted by Anandha L Ranganathan <an...@gmail.com>.
Thanks Jeff.

Is that something I can use in the notebook or in the interpreter? If it
is in the notebook, can you provide me with the syntax? I tried it in the
notebook and it is throwing an error.


Re: Livy interpreter - external libraries and changing queue name at runtime

Posted by Jeff Zhang <zj...@gmail.com>.
You can do it via the Livy interpreter setting.

Here are two configurations that let you add external jars and external
packages:

livy.spark.jars
livy.spark.jars.packages

And this is the configuration for the queue name:

livy.spark.yarn.queue
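A sketch of how these three settings might be filled in on the interpreter settings page (the jar path, package coordinate, and queue name below are placeholder values for illustration, not values from this thread):

```properties
# Comma-separated list of jars, e.g. on HDFS (placeholder path)
livy.spark.jars             hdfs:///user/shared/libs/my-udfs.jar
# Maven coordinates, resolved when the Livy session starts
livy.spark.jars.packages    com.databricks:spark-csv_2.11:1.2.0
# YARN queue the Livy-launched Spark application is submitted to (placeholder name)
livy.spark.yarn.queue       analytics
```

Note that these are session-level settings: they take effect when the Livy session is created, so changing the queue per user at runtime would require separate interpreter settings (or per-user interpreter instantiation) rather than a change inside a running note.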

