Posted to commits@toree.apache.org by lr...@apache.org on 2016/01/11 22:02:03 UTC

[21/50] [abbrv] incubator-toree git commit: Changes to README file.

Changes to README file.


Project: http://git-wip-us.apache.org/repos/asf/incubator-toree/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-toree/commit/f8b37cc3
Tree: http://git-wip-us.apache.org/repos/asf/incubator-toree/tree/f8b37cc3
Diff: http://git-wip-us.apache.org/repos/asf/incubator-toree/diff/f8b37cc3

Branch: refs/heads/master
Commit: f8b37cc354b2bca36dcaa889ae6959a84e945236
Parents: be758d8
Author: Gino Bustelo <pa...@us.ibm.com>
Authored: Wed Nov 18 17:10:51 2015 -0600
Committer: Gino Bustelo <pa...@us.ibm.com>
Committed: Tue Nov 24 08:49:50 2015 -0600

----------------------------------------------------------------------
 README.md | 9 +++++----
 1 file changed, 5 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-toree/blob/f8b37cc3/README.md
----------------------------------------------------------------------
diff --git a/README.md b/README.md
index c9367d2..2115304 100644
--- a/README.md
+++ b/README.md
@@ -23,24 +23,25 @@ A version of the Spark Kernel is deployed as part of the [Try Jupyter!][try-jupy
 
 Develop
 =======
-[Vagrant][vagrant] is used to simplify the development experience. It is the only requirement to be able to build and test the Spark Kernel on your development machine. 
+[Vagrant][vagrant] is used to simplify the development experience. It is the only requirement to be able to build, package and test the Spark Kernel on your development machine. 
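If you want to confirm that prerequisite before running the targets below, a quick check on the host (a hypothetical sanity step, not part of the project documentation) is:

```
# Confirm Vagrant is installed and on the host machine's PATH
vagrant --version
```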
 
-To interact with the Spark Kernel using Jupyter, run
+To build and interact with the Spark Kernel using Jupyter, run
 ```
 make dev
 ```
 
 This will start a Jupyter notebook server accessible at `http://192.168.44.44:8888`. From here you can create notebooks that use the Spark Kernel configured for local mode.
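As a quick sanity check (an assumption on my part, not part of the documented workflow), you can verify from the host that the notebook server inside the Vagrant VM is reachable:

```
# Should print HTTP response headers from the Jupyter notebook server
curl -I http://192.168.44.44:8888
```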
 
+Tests can be run by doing `make test`.
 
 Build & Package
 ===============
 To build and package up the Spark Kernel, run
 ```
-make build
+make dist
 ```
 
-The resulting package of the kernel will be located at `./kernel/target/pack`. It contains a `Makefile` that can be used to install the Spark Kernel by running `make install` within the directory. More details about building and packaging can be found [here][4].
+The resulting package of the kernel will be located at `./dist/spark-kernel-<VERSION>.tar.gz`. The uncompressed package is what Jupyter runs when doing `make dev`.
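For illustration, a minimal sketch of inspecting the generated package from the project root (the wildcard stands in for the actual `<VERSION>` string; the tarball's internal layout is not spelled out here, so the commands only list and unpack it):

```
make dist
# List the contents of the generated package without unpacking it
tar -tzf dist/spark-kernel-*.tar.gz | head
# Unpack it to a scratch directory for a closer look
tar -xzf dist/spark-kernel-*.tar.gz -C /tmp
```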
 
 
 Version