Posted to users@zeppelin.apache.org by Alex Dzhagriev <dz...@gmail.com> on 2018/10/19 19:12:34 UTC

Can not connect to a remote spark master

Hello,

I have a remote Spark cluster, and I'm trying to use it by setting the Spark
interpreter property master to spark://spark-cluster-master:7077. However,
I'm getting the following error:

java.lang.RuntimeException: SPARK_HOME is not specified in
interpreter-setting for non-local mode, if you specify it in
zeppelin-env.sh, please move that into interpreter setting

version: Docker Image 0.8.0

Thanks, Alex.

Re: Can not connect to a remote spark master

Posted by Alex Dzhagriev <dz...@gmail.com>.
Thanks a lot for clarifying that.

Thanks, Alex


Re: Can not connect to a remote spark master

Posted by Jhon Anderson Cardenas Diaz <jh...@gmail.com>.
Hi, you can specify it in zeppelin-env.sh or in the Dockerfile.

Zeppelin will look for that variable first in the interpreter settings, and
if it does not find it there, it will look for it in the Zeppelin environment
variables; so you can specify it in either place, but since it does not
change frequently, it is better to keep it as a Zeppelin environment variable.
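The two options above can be sketched like this (the /opt/spark path is an
assumption; use wherever Spark is actually installed in your image):

```shell
# Option 1: set it in conf/zeppelin-env.sh so the Zeppelin process exports it
export SPARK_HOME=/opt/spark   # assumed install path

# Option 2: bake it into the image instead, with a Dockerfile line such as:
#   ENV SPARK_HOME=/opt/spark
```

Either way, restart Zeppelin (or rebuild and restart the container) so the
interpreter process picks up the new value.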


Re: Can not connect to a remote spark master

Posted by Alex Dzhagriev <dz...@gmail.com>.
Thanks for the quick reply. Should I specify it for the Zeppelin process or
for the Spark interpreter?

Thanks, Alex.


Re: Can not connect to a remote spark master

Posted by Jeff Zhang <zj...@gmail.com>.
You need to specify SPARK_HOME, which is where Spark is installed.
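Since the report mentions the 0.8.0 Docker image, one hedged way to do that
is to mount a Spark distribution into the container and point SPARK_HOME at
it at startup (the image tag, port, and paths here are illustrative
assumptions, not taken from the thread):

```shell
# A sketch: expose a local Spark install to Zeppelin as SPARK_HOME.
docker run -p 8080:8080 \
  -v /opt/spark:/opt/spark \
  -e SPARK_HOME=/opt/spark \
  apache/zeppelin:0.8.0
```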

