Posted to dev@zeppelin.apache.org by tog <gu...@gmail.com> on 2015/11/07 10:48:54 UTC

0.5.5 with pyspark

Hi

Before casting my vote I wanted to give pyspark a try (which I have never
done so far). Is it supposed to work out of the box in local mode, just by
untarring the -all archive, or do I have to configure something?
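
To be concrete, by "work out of the box" I mean nothing more than something
like this (the tarball name is my assumption and may not match the RC
artifact exactly):

tar xzf zeppelin-0.5.5-incubating-bin-all.tgz
cd zeppelin-0.5.5-incubating-bin-all
./bin/zeppelin-daemon.sh start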

Cheers
Guillaume


-- 
PGP KeyID: 2048R/EA31CFC9  subkeys.pgp.net

Re: 0.5.5 with pyspark

Posted by moon soo Lee <mo...@apache.org>.
You'll need to add the -Ppyspark profile to your build command.
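
For example, taking the build command from Jeff's message below and simply
adding the profile, something like this should work (untested sketch):

mvn clean package -Pspark-1.5 -Dhadoop.version=2.2.0 -Phadoop-2.2 -Ppyspark -DskipTests -P build-distr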

Thanks,
moon



Re: 0.5.5 with pyspark

Posted by Jeff Steinmetz <je...@gmail.com>.
Does pyspark require py4j to be installed (via pip install py4j)?

In the 0.5.5 branch, I ran:

mvn clean package -Pspark-1.5 -Dhadoop.version=2.2.0 -Phadoop-2.2 -DskipTests -P build-distr

The build passed, and I ran Zeppelin from the distribution directory.
I did not alter or create zeppelin-env.sh or zeppelin-site.xml (both left as templates).


./bin/zeppelin-daemon.sh start

I created a simple notebook containing:

%pyspark

i = 1


The paragraph failed with:

pyspark is not responding
Traceback (most recent call last):
  File "/tmp/zeppelin_pyspark.py", line 20, in <module>
    from py4j.java_gateway import java_import, JavaGateway, GatewayClient
ImportError: No module named py4j.java_gateway

Re: 0.5.5 with pyspark

Posted by moon soo Lee <mo...@apache.org>.
No problem. I really appreciate you verifying the RC.

Thanks,
moon


Re: 0.5.5 with pyspark

Posted by tog <gu...@gmail.com>.
OK - I tried again on a brand new Ubuntu VM ... and it worked like a charm.
Sorry, I must have been confused between my dev repo and the rc2 deployment.

Cheers
Guillaume




-- 
PGP KeyID: 2048R/EA31CFC9  subkeys.pgp.net

Re: 0.5.5 with pyspark

Posted by tog <gu...@gmail.com>.
Yes, I am puzzled - I removed py4j and it worked.
My OS is OS X 10.11.1 (El Capitan).
I will try on a fresh Ubuntu VM and let you know.

Cheers
Guillaume




-- 
PGP KeyID: 2048R/EA31CFC9  subkeys.pgp.net

Re: 0.5.5 with pyspark

Posted by moon soo Lee <mo...@apache.org>.
Yes, PYTHONPATH looks fine. If your filesystem has

/Users/tog/Downloads/zeppelin-0.5.5-incubating-bin-all/interpreter/spark/pyspark/pyspark.zip
/Users/tog/Downloads/zeppelin-0.5.5-incubating-bin-all/interpreter/spark/pyspark/py4j-0.8.2.1-src.zip

then it is supposed to work without py4j installed on your system.
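
A quick way to check is to list that directory and confirm that both zip
files are actually there, e.g. using the install path from your output:

ls /Users/tog/Downloads/zeppelin-0.5.5-incubating-bin-all/interpreter/spark/pyspark/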

Could you share your OS?

Thanks,
moon


Re: 0.5.5 with pyspark

Posted by tog <gu...@gmail.com>.
Hi Moon

Here is what I get with zeppelin-0.5.5-incubating-bin-all:
res0: String =
/Users/tog/Downloads/zeppelin-0.5.5-incubating-bin-all/interpreter/spark/pyspark/pyspark.zip:/Users/tog/Downloads/zeppelin-0.5.5-incubating-bin-all/interpreter/spark/pyspark/py4j-0.8.2.1-src.zip


This looks fine.
Cheers
Guillaume





-- 
PGP KeyID: 2048R/EA31CFC9  subkeys.pgp.net

Re: 0.5.5 with pyspark

Posted by moon soo Lee <mo...@apache.org>.
Without -Ppyspark, the interpreter/spark/pyspark/ directory is not created,
even though PYTHONPATH is populated to point at pyspark.zip and
py4j-0.8.2.1-src.zip in that directory.

With -Ppyspark, it will work.
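
So a quick sanity check on any build or binary package is to look for those
two files, e.g. from the Zeppelin home directory (adjust the path to your
layout):

ls interpreter/spark/pyspark/
# should list pyspark.zip and py4j-0.8.2.1-src.zip when pyspark support is included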

Thanks,
moon


Re: 0.5.5 with pyspark

Posted by Jeff Steinmetz <je...@gmail.com>.
It shows this (I did not build with -Ppyspark). I can rebuild again with -Ppyspark.

Either way, here is my output (I have a repo in a Vagrant VM, and the build ends up in zeppelin-distribution, which is where I run my test):

%spark
System.getenv("PYTHONPATH")


res1: String = /vagrant/incubator-zeppelin/zeppelin-distribution/target/zeppelin-0.5.6-incubating-SNAPSHOT/zeppelin-0.5.6-incubating-SNAPSHOT/interpreter/spark/pyspark/pyspark.zip:/vagrant/incubator-zeppelin/zeppelin-distribution/target/zeppelin-0.5.6-incubating-SNAPSHOT/zeppelin-0.5.6-incubating-SNAPSHOT/interpreter/spark/pyspark/py4j-0.8.2.1-src.zip






Re: 0.5.5 with pyspark

Posted by moon soo Lee <mo...@apache.org>.
Hi Guillaume,

In my test on a machine that does not have py4j installed, it worked without
problems.

Could you try to run

%spark
System.getenv("PYTHONPATH")

in a Zeppelin notebook and check the output?
Mine is:

res1: String =
/Users/moon/Downloads/zeppelin-0.5.5-incubating-bin-all/interpreter/spark/pyspark/pyspark.zip:/Users/moon/Downloads/zeppelin-0.5.5-incubating-bin-all/interpreter/spark/pyspark/py4j-0.8.2.1-src.zip


Thanks,
moon


Re: 0.5.5 with pyspark

Posted by tog <gu...@gmail.com>.
Hi moon
I had to run "pip install py4j", and then it worked.
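
(For reference, a quick way to check whether py4j is importable by a given
Python - which may or may not be the one Zeppelin picks up - is:

python -c "import py4j; print(py4j.__file__)"

It prints the module location when py4j is installed and fails with
ImportError otherwise.)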

Cheers
Guillaume



Re: 0.5.5 with pyspark

Posted by moon soo Lee <mo...@apache.org>.
Thanks for verifying.

Yes, that's right.
pyspark in the -all archive will work out of the box without any configuration.

Best,
moon
