Posted to user@spark.apache.org by Leonid Blokhin <lb...@provectus.com> on 2016/02/15 13:10:01 UTC

Single context Spark from Python and Scala

Hello

 I want to work with a single Spark context from both Python and Scala. Is it
possible?

As an extreme example, is it possible to share a context between an already
started ./bin/pyspark and ./bin/spark-shell?


Cheers,

Leonid

Re: Single context Spark from Python and Scala

Posted by Chandeep Singh <cs...@chandeep.com>.
You could consider using Zeppelin:
https://zeppelin.incubator.apache.org/docs/latest/interpreter/spark.html
https://zeppelin.incubator.apache.org/

ZeppelinContext
Zeppelin automatically injects ZeppelinContext as the variable 'z' in your Scala/Python environment. ZeppelinContext provides some additional functions and utilities.



Object exchange

ZeppelinContext extends a map and is shared between the Scala and Python environments, so you can put an object from Scala and read it from Python, and vice versa.

Put an object from Scala:

%spark
val myObject = ...
z.put("objName", myObject)

Get the object from Python:

%python
myObject = z.get("objName")
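
As a slightly fuller sketch of the round trip (the key, the sample data, and the wrapping step are illustrative, not taken from the Zeppelin docs): a DataFrame built in the Scala interpreter can be fetched from the Python interpreter, where the py4j handle is wrapped back into a PySpark DataFrame on the same context.

%spark
// Illustrative: build a small DataFrame and publish it under a key
val sharedDf = sqlContext.createDataFrame(Seq((1, "a"), (2, "b"))).toDF("id", "label")
z.put("sharedDf", sharedDf)

%python
# Illustrative: z.get returns a py4j handle to the JVM-side DataFrame;
# wrap it so it behaves like a normal PySpark DataFrame
from pyspark.sql import DataFrame
sharedDf = DataFrame(z.get("sharedDf"), sqlContext)
sharedDf.show()

Since both paragraphs run against the same SparkContext, the DataFrame itself is not copied; the Python side just holds a py4j reference to the JVM object.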

