Posted to users@zeppelin.apache.org by Stefano Ortolani <os...@gmail.com> on 2016/09/27 15:05:34 UTC

Zeppelin 0.6.1 and pyspark 1.6.1

Hi,

I have been using Zeppelin for quite a while without issues.
Lately, I have been trying to configure pyspark, but I can't seem to make
it work.
Using pyspark locally works perfectly, but regardless of the PYTHONPATH I
specify in zeppelin-env.sh, any use of pyspark from within Zeppelin results in:

Error from python worker:
  /usr/bin/python: No module named pyspark
PYTHONPATH was:
  /usr/lib/spark/python/lib/pyspark.zip:/usr/lib/spark/python/lib/py4j-0.9-src.zip:/usr/lib/spark/lib/spark-assembly-1.6.1-hadoop2.6.0.jar


Now, that pyspark.zip is actually not there (my distribution has a tar.gz
inside /usr/lib/spark/lib), but no matter what I set, the PYTHONPATH does
not change.
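
For reference, the kind of thing I am putting in zeppelin-env.sh is along
these lines (the exact values below are only an example; the right entries
depend on what the distribution actually ships under /usr/lib/spark):

  export SPARK_HOME=/usr/lib/spark
  # example only: point the interpreter at the Spark python sources and the py4j zip
  export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH"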

Any idea?

Regards,
Stefano

Re: Zeppelin 0.6.1 and pyspark 1.6.1

Posted by Mina Lee <mi...@apache.org>.
Hi Stefano,

Can you tell me which Spark distribution you are using?
If you have set CDH Spark as SPARK_HOME, could you confirm that you
installed the spark-pyspark package?
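
Just as an example of what to check (the path and package-manager commands
below are illustrative and depend on your system):

  # see what the Spark python libraries directory actually contains
  ls -l /usr/lib/spark/python/lib
  # see which spark-related packages are installed (rpm- or dpkg-based systems)
  rpm -qa | grep -i spark      # or: dpkg -l | grep -i spark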

Let me know.
