Posted to dev@spark.apache.org by Robert C Senkbeil <rc...@us.ibm.com> on 2014/12/12 23:16:45 UTC

IBM open-sources Spark Kernel



We are happy to announce a developer preview of the Spark Kernel, which
enables remote applications to interact dynamically with Spark. You can
think of the Spark Kernel as a remote Spark Shell that uses the IPython
notebook interface to provide a common entry point for any application.
The Spark Kernel obviates the need to submit jars with spark-submit and
can replace the existing Spark Shell.
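
For example, a snippet like the following can be sent to a running kernel
and evaluated against the cluster on the spot, with no jar to package or
submit (this is ordinary Spark Shell code; the kernel provides the sc
SparkContext just as the shell does):

    // Classic Spark Shell code, evaluated remotely by the kernel;
    // `sc` is the SparkContext the kernel creates on startup.
    val rdd = sc.parallelize(1 to 1000)
    val evens = rdd.filter(_ % 2 == 0).count()
    println(s"Number of evens: $evens")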

You can try out the Spark Kernel today by installing it from our GitHub
repo at https://github.com/ibm-et/spark-kernel. To help you get a demo
environment up and running quickly, the repository also includes a
Dockerfile and a Vagrantfile to build a Spark Kernel container and connect
to it from an IPython notebook.

We have included a number of documents with the project to help explain it
and provide how-to information:

* A high-level overview of the Spark Kernel and its client library (
https://issues.apache.org/jira/secure/attachment/12683624/Kernel%20Architecture.pdf
).

* README (https://github.com/ibm-et/spark-kernel/blob/master/README.md) -
building and testing the kernel, and deployment options including building
the Docker container and packaging the kernel.

* IPython instructions (
https://github.com/ibm-et/spark-kernel/blob/master/docs/IPYTHON.md) -
setting up the development version of IPython and connecting a Spark
Kernel.

* Client library tutorial (
https://github.com/ibm-et/spark-kernel/blob/master/docs/CLIENT.md) -
building and using the client library to connect to a Spark Kernel (a
brief sketch follows this list).

* Magics documentation (
https://github.com/ibm-et/spark-kernel/blob/master/docs/MAGICS.md) - the
magics in the kernel and how to write your own.
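
To give a quick taste of the client library (the class and method names
below are illustrative only; the client library tutorial above documents
the actual API), connecting to a kernel and executing code looks roughly
like this:

    // Hypothetical sketch -- class and method names are invented for
    // illustration; docs/CLIENT.md documents the real interface.
    // The client speaks the kernel's ZeroMQ protocol on your behalf.
    val client = SparkKernelClient.connect("kernel-connection.json")
    val promise = client.execute("sc.parallelize(1 to 100).sum()")
    promise.onResult(result => println(s"Sum: ${result.data}"))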

We think the Spark Kernel will be useful for developing applications for
Spark, and we are making it available with the intention of improving these
capabilities within the Spark community (
https://issues.apache.org/jira/browse/SPARK-4605). We will continue to
develop the codebase and welcome your comments and suggestions.


Signed,

Chip Senkbeil
IBM Emerging Technology Software Engineer

Re: IBM open-sources Spark Kernel

Posted by Robert C Senkbeil <rc...@us.ibm.com>.
Hi Sam,

We developed the Spark Kernel with a focus on the newest version of the
IPython message protocol (5.0) for the upcoming IPython 3.0 release.
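
Every message in that protocol shares a header/parent_header/metadata/
content layout and travels as a signed multipart ZeroMQ message. The case
classes below only sketch that shape; they are not the kernel's actual
types:

    // Sketch of the IPython 5.0 wire message layout; types are
    // illustrative, not the Spark Kernel's real implementation.
    case class Header(msgId: String, username: String, session: String,
                      msgType: String,            // e.g. "execute_request"
                      version: String)            // protocol version, "5.0"
    case class KernelMessage(header: Header,
                             parentHeader: Option[Header], // ties replies to requests
                             metadata: Map[String, String],
                             content: Map[String, Any])    // e.g. Map("code" -> "1 + 1")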

We are building around Apache Spark's REPL, which is used in the current
Spark Shell implementation.

The Spark Kernel was designed to be extensible through magics (
https://github.com/ibm-et/spark-kernel/blob/master/docs/MAGICS.md),
providing functionality that might be needed outside the Scala interpreter.
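
As a rough sketch of the concept (the trait and method here are invented
for illustration; the magics documentation above describes the real
extension points), a custom magic might look something like this:

    // Hypothetical magic -- interface names are illustrative only.
    // A magic intercepts specially marked input (e.g. "%%sql ...")
    // and handles it instead of the Scala interpreter.
    trait Magic { def execute(input: String): String }  // assumed interface
    class SqlMagic(sqlContext: org.apache.spark.sql.SQLContext) extends Magic {
      override def execute(query: String): String =
        sqlContext.sql(query).take(10).mkString("\n")   // run query, show rows
    }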

Finally, a big part of our focus is on application development. Because of
this, we are providing a client library for applications to connect to the
Spark Kernel without needing to implement the ZeroMQ protocol.
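
To give a sense of what the library saves you, the snippet below
(JeroMQ-style API; in practice the port comes from a connection file
rather than being hard-coded) wires up just one of the kernel's five
channels, before any message signing or framing is handled:

    import org.zeromq.ZMQ

    // Raw socket wiring to a single channel (iopub); the client
    // library hides this plus signing, framing, and the remaining
    // channels (shell, stdin, control, heartbeat).
    val ctx = ZMQ.context(1)                 // one ZeroMQ I/O thread
    val iopub = ctx.socket(ZMQ.SUB)
    iopub.connect("tcp://127.0.0.1:54321")   // placeholder port
    iopub.subscribe(Array.empty[Byte])       // subscribe to all topics
    val frame = iopub.recv(0)                // one frame of a multipart message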

Signed,
Chip Senkbeil



From:	Sam Bessalah <sa...@gmail.com>
To:	Robert C Senkbeil/Austin/IBM@IBMUS
Date:	12/12/2014 04:20 PM
Subject:	Re: IBM open-sources Spark Kernel



Wow. Thanks. Can't wait to try this out.
Great job.
How is it different from IScala or ISpark?

