Posted to users@zeppelin.apache.org by clark djilo kuissu <dj...@yahoo.fr> on 2015/08/04 13:48:30 UTC

Launch Pyspark Interpreter

Hi,

   I'm trying to launch the pyspark interpreter without success.
I ran the server:

$ bin/zeppelin-daemon.sh start

Running a simple notebook beginning with %pyspark, I got an error about py4j not being found, so I just ran pip install py4j (ref).
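(A quick sanity check, assuming "python" on the PATH is the same interpreter Zeppelin launches, is to confirm both modules are importable:)

$ python -c "import py4j; print(py4j.__file__)"
$ python -c "import pyspark; print(pyspark.__file__)"

If the second command fails with an ImportError, the pyspark libraries are not on the PYTHONPATH seen by that interpreter.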

Now I'm getting this error:

pyspark is not responding

Traceback (most recent call last):
  File "/tmp/zeppelin_pyspark.py", line 22, in <module>
    from pyspark.conf import SparkConf
ImportError: No module named pyspark.conf
So I tried this in my .bashrc file, taken from Stack Overflow:


export SPARK_HOME=/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH
It didn't work.
What am I supposed to do?
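(A likely culprit: Zeppelin started through zeppelin-daemon.sh reads its environment from conf/zeppelin-env.sh, not from an interactive shell's .bashrc. A minimal sketch of the same two exports there, assuming Spark really is installed at /spark:)

# conf/zeppelin-env.sh (copy conf/zeppelin-env.sh.template if it doesn't exist)
# Point Zeppelin at the Spark installation so /tmp/zeppelin_pyspark.py can
# import pyspark and the py4j that ships with Spark.
export SPARK_HOME=/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH

Then restart the daemon so the new environment is picked up:

$ bin/zeppelin-daemon.sh restart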
Regards,
Clark

RE: Launch Pyspark Interpreter

Posted by "Vadla, Karthik" <ka...@intel.com>.
And I forgot to mention: change the version numbers accordingly.
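(Purely illustrative values: a build against, say, Spark 1.4 and stock Hadoop 2.6 instead of the CDH build quoted below would swap the profile and version flags like so:)

# Illustrative only: substitute the Spark/Hadoop versions that match your cluster.
mvn clean package -Pspark-1.4 -Ppyspark -Dhadoop.version=2.6.0 -Phadoop-2.6 -DskipTests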

From: clark djilo kuissu [mailto:djilokuissu@yahoo.fr]
Sent: Tuesday, August 4, 2015 9:35 AM
To: Vadla, Karthik; users@zeppelin.incubator.apache.org; Moon Soo Lee
Subject: Re: Launch Pyspark Interpreter

I hadn't done this. I'll try it and let you know.

Regards,


On Tuesday, 4 August 2015 at 18:32, "Vadla, Karthik" <ka...@intel.com> wrote:

Hi Clark,

How did you build your Zeppelin binaries? Did you configure the pyspark interpreter manually?


To configure pyspark automatically while building the binaries, use the following:

mvn clean package -Pspark-1.3 -Ppyspark -Dhadoop.version=2.6.0-cdh5.4.2 -Phadoop-2.6 -DskipTests


Try the above; if you still face the same issue, let me know.
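(One way to verify the build actually bundled pyspark support, assuming the usual layout of the binary distribution, is to look inside the Spark interpreter directory:)

$ ls interpreter/spark/pyspark
# should list py4j-*-src.zip and pyspark.zip (this layout is an assumption)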


Thanks
Karthik


From: clark djilo kuissu [mailto:djilokuissu@yahoo.fr]
Sent: Tuesday, August 4, 2015 4:49 AM
To: Users; Moon Soo Lee
Subject: Launch Pyspark Interpreter

Hi,

   I'm trying to launch the pyspark interpreter without success.

I ran the server:

$ bin/zeppelin-daemon.sh start

Running a simple notebook beginning with %pyspark, I got an error about py4j not being found, so I just ran pip install py4j (ref).

Now I'm getting this error:

pyspark is not responding

Traceback (most recent call last):
  File "/tmp/zeppelin_pyspark.py", line 22, in <module>
    from pyspark.conf import SparkConf
ImportError: No module named pyspark.conf

So I tried this in my .bashrc file, taken from Stack Overflow:


export SPARK_HOME=/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH

It didn't work.

What am I supposed to do?

Regards,

Clark


Re: Launch Pyspark Interpreter

Posted by clark djilo kuissu <dj...@yahoo.fr>.
I hadn't done this. I'll try it and let you know.
Regards,
 


On Tuesday, 4 August 2015 at 18:32, "Vadla, Karthik" <ka...@intel.com> wrote:

Hi Clark,

How did you build your Zeppelin binaries? Did you configure the pyspark interpreter manually?

To configure pyspark automatically while building the binaries, use the following:

mvn clean package -Pspark-1.3 -Ppyspark -Dhadoop.version=2.6.0-cdh5.4.2 -Phadoop-2.6 -DskipTests

Try the above; if you still face the same issue, let me know.

Thanks
Karthik

From: clark djilo kuissu [mailto:djilokuissu@yahoo.fr]
Sent: Tuesday, August 4, 2015 4:49 AM
To: Users; Moon Soo Lee
Subject: Launch Pyspark Interpreter

Hi,

   I'm trying to launch the pyspark interpreter without success.

I ran the server:

$ bin/zeppelin-daemon.sh start

Running a simple notebook beginning with %pyspark, I got an error about py4j not being found, so I just ran pip install py4j (ref).

Now I'm getting this error:

pyspark is not responding

Traceback (most recent call last):
  File "/tmp/zeppelin_pyspark.py", line 22, in <module>
    from pyspark.conf import SparkConf
ImportError: No module named pyspark.conf

So I tried this in my .bashrc file, taken from Stack Overflow:

export SPARK_HOME=/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH

It didn't work.

What am I supposed to do?

Regards,

Clark

RE: Launch Pyspark Interpreter

Posted by "Vadla, Karthik" <ka...@intel.com>.
Hi Clark,

How did you build your Zeppelin binaries? Did you configure the pyspark interpreter manually?


To configure pyspark automatically while building the binaries, use the following:

mvn clean package -Pspark-1.3 -Ppyspark -Dhadoop.version=2.6.0-cdh5.4.2 -Phadoop-2.6 -DskipTests


Try the above; if you still face the same issue, let me know.


Thanks
Karthik


From: clark djilo kuissu [mailto:djilokuissu@yahoo.fr]
Sent: Tuesday, August 4, 2015 4:49 AM
To: Users; Moon Soo Lee
Subject: Launch Pyspark Interpreter

Hi,

   I'm trying to launch the pyspark interpreter without success.

I ran the server:

$ bin/zeppelin-daemon.sh start

Running a simple notebook beginning with %pyspark, I got an error about py4j not being found, so I just ran pip install py4j (ref).

Now I'm getting this error:

pyspark is not responding

Traceback (most recent call last):
  File "/tmp/zeppelin_pyspark.py", line 22, in <module>
    from pyspark.conf import SparkConf
ImportError: No module named pyspark.conf

So I tried this in my .bashrc file, taken from Stack Overflow:


export SPARK_HOME=/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH

It didn't work.

What am I supposed to do?

Regards,

Clark