Posted to users@zeppelin.apache.org by Fabian Böhnlein <fa...@gmail.com> on 2017/05/08 12:11:54 UTC

Multiple Spark versions / groups in Zeppelin

Hi all,

we're looking to support multiple Spark versions in the same Zeppelin
instance. Can this work with multiple Spark groups, or is there another way?

We already use multiple Interpreters (via "Create" in the Interpreter UI) to
configure different Spark environments (all using group "spark").

How can I copy the spark group and adjust its SPARK_HOME? I could not find
interpreter/spark/interpreter-setting.json, which might configure this.

Thanks,
Fabian

Re: Multiple Spark versions / groups in Zeppelin

Posted by Fabian Böhnlein <fa...@gmail.com>.
Indeed, that's it, thanks!

On Mon, 8 May 2017 at 17:23 Jeff Zhang <zj...@gmail.com> wrote:

>
> You can define SPARK_HOME on the interpreter settings page for each
> Spark version.
>
> [image: screenshot of the interpreter settings page]
>

Re: Multiple Spark versions / groups in Zeppelin

Posted by Jeff Zhang <zj...@gmail.com>.
You can define SPARK_HOME on the interpreter settings page for each
Spark version.

[image: screenshot of the interpreter settings page]
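
Since the screenshot is not preserved in the plain-text archive, here is a
rough sketch of the kind of setup it illustrates. The interpreter names and
install paths below are illustrative assumptions, not values from the
thread: on the Interpreter page, create one interpreter per Spark build
(all in group "spark") and set SPARK_HOME as a property on each.

    Interpreter "spark21" (group: spark)
      SPARK_HOME = /opt/spark-2.1.0    <- hypothetical install path

    Interpreter "spark16" (group: spark)
      SPARK_HOME = /opt/spark-1.6.3    <- hypothetical install path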


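A notebook paragraph can then pick a Spark build by interpreter name
(again using the hypothetical names from the sketch above):

    %spark21
    sc.version    // expected to report the 2.1.x version string

    %spark16
    sc.version    // expected to report the 1.6.x version string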