Posted to commits@spark.apache.org by jo...@apache.org on 2013/12/15 23:12:03 UTC

[1/4] git commit: Fix 'IPYTHON=1 ./pyspark' throwing 'ValueError: Cannot run multiple SparkContexts at once'

Updated Branches:
  refs/heads/master c55e69855 -> d2ced6d58


Fix 'IPYTHON=1 ./pyspark' throwing 'ValueError: Cannot run multiple SparkContexts at once'


Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/8cdfb08c
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/8cdfb08c
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/8cdfb08c

Branch: refs/heads/master
Commit: 8cdfb08c47131ce3438e5faf1222af2039424324
Parents: d2efe13
Author: Nick Pentreath <ni...@gmail.com>
Authored: Thu Dec 12 13:08:59 2013 +0200
Committer: Nick Pentreath <ni...@gmail.com>
Committed: Thu Dec 12 13:08:59 2013 +0200

----------------------------------------------------------------------
 pyspark | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-spark/blob/8cdfb08c/pyspark
----------------------------------------------------------------------
diff --git a/pyspark b/pyspark
index 4941a36..18f668e 100755
--- a/pyspark
+++ b/pyspark
@@ -59,8 +59,7 @@ if [ -n "$IPYTHON_OPTS" ]; then
 fi
 
 if [[ "$IPYTHON" = "1" ]] ; then
-  IPYTHON_OPTS=${IPYTHON_OPTS:--i}
-  exec ipython "$IPYTHON_OPTS" -c "%run $PYTHONSTARTUP"
+  exec ipython "$IPYTHON_OPTS" "$@"
 else
   exec "$PYSPARK_PYTHON" "$@"
 fi
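
In practice, this first commit just forwards whatever options and arguments the user supplies to IPython and relies on IPython 1.0.0+ reading $PYTHONSTARTUP on its own. A minimal usage sketch (the option string is illustrative, and it assumes the IPYTHON_OPTS check earlier in the script switches on IPython mode):

```
# Plain IPython shell; IPython >= 1.0.0 reads $PYTHONSTARTUP itself,
# so the startup script runs exactly once and defines sc:
IPYTHON=1 ./pyspark

# Options now pass straight through to the ipython binary
# (illustrative; --pylab is a standard IPython 1.x flag):
IPYTHON_OPTS="--pylab" ./pyspark
```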


[3/4] git commit: Making IPython PySpark compatible across versions <1.0.0. Also cleaned up '-i' option and made IPYTHON_OPTS work

Posted by jo...@apache.org.
Making IPython PySpark compatible across versions <1.0.0. Also cleaned up '-i' option and made IPYTHON_OPTS work


Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/bb5277b1
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/bb5277b1
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/bb5277b1

Branch: refs/heads/master
Commit: bb5277b10a3797c1beeca01c32b287ee79db831d
Parents: d36ee3b
Author: Nick Pentreath <ni...@gmail.com>
Authored: Sun Dec 15 09:39:45 2013 +0200
Committer: Nick Pentreath <ni...@gmail.com>
Committed: Sun Dec 15 09:39:45 2013 +0200

----------------------------------------------------------------------
 pyspark | 7 ++++++-
 1 file changed, 6 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-spark/blob/bb5277b1/pyspark
----------------------------------------------------------------------
diff --git a/pyspark b/pyspark
index 8b34c98..12cc926 100755
--- a/pyspark
+++ b/pyspark
@@ -59,7 +59,12 @@ if [ -n "$IPYTHON_OPTS" ]; then
 fi
 
 if [[ "$IPYTHON" = "1" ]] ; then
-  exec ipython "$IPYTHON_OPTS" "$@"
+  # IPython <1.0.0 doesn't honor PYTHONSTARTUP, while 1.0.0+ does. 
+  # Hence we clear PYTHONSTARTUP and use the -c "%run $IPYTHONSTARTUP" command which works on all versions
+  # We also force interactive mode with "-i"
+  IPYTHONSTARTUP=$PYTHONSTARTUP
+  PYTHONSTARTUP=
+  exec ipython "$IPYTHON_OPTS" -i -c "%run $IPYTHONSTARTUP"
 else
   exec "$PYSPARK_PYTHON" "$@"
 fi
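
The subtlety in this follow-up commit is avoiding a double execution of the startup file on newer IPython. A sketch of the reasoning, with explanatory comments added (same variables as the patch above; the example path is illustrative):

```
# IPython >= 1.0.0 honors $PYTHONSTARTUP, so if it were left set the
# startup file would run once via PYTHONSTARTUP and again via "%run",
# creating two SparkContexts. Stashing the path and clearing the
# variable makes "%run" the single execution path on every version:
IPYTHONSTARTUP=$PYTHONSTARTUP      # e.g. python/pyspark/shell.py
PYTHONSTARTUP=                     # stop IPython >= 1.0.0 running it too
exec ipython "$IPYTHON_OPTS" -i -c "%run $IPYTHONSTARTUP"
```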


[2/4] git commit: Merge remote-tracking branch 'upstream/master'

Posted by jo...@apache.org.
Merge remote-tracking branch 'upstream/master'


Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/d36ee3b1
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/d36ee3b1
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/d36ee3b1

Branch: refs/heads/master
Commit: d36ee3b159dff04b3a3222d6c55ee27ba93cd074
Parents: 8cdfb08 7db9165
Author: Nick Pentreath <ni...@gmail.com>
Authored: Sun Dec 15 08:34:05 2013 +0200
Committer: Nick Pentreath <ni...@gmail.com>
Committed: Sun Dec 15 08:34:05 2013 +0200

----------------------------------------------------------------------
 .gitignore                                      |    1 +
 README.md                                       |    5 +-
 assembly/pom.xml                                |   16 +-
 bagel/pom.xml                                   |   12 +-
 bin/compute-classpath.cmd                       |    2 +-
 bin/compute-classpath.sh                        |    2 +-
 core/pom.xml                                    |   29 +-
 .../apache/spark/network/netty/FileClient.java  |    2 -
 .../apache/spark/network/netty/FileServer.java  |    1 -
 .../org/apache/spark/MapOutputTracker.scala     |   29 +-
 .../scala/org/apache/spark/Partitioner.scala    |    8 +-
 .../scala/org/apache/spark/SparkContext.scala   |   77 +-
 .../main/scala/org/apache/spark/SparkEnv.scala  |   13 +-
 .../main/scala/org/apache/spark/TaskState.scala |    3 +-
 .../apache/spark/api/java/JavaDoubleRDD.scala   |    9 +-
 .../org/apache/spark/api/java/JavaPairRDD.scala |   77 +-
 .../org/apache/spark/api/java/JavaRDD.scala     |    7 +-
 .../org/apache/spark/api/java/JavaRDDLike.scala |   32 +-
 .../spark/api/java/JavaSparkContext.scala       |   60 +-
 .../java/JavaSparkContextVarargsWorkaround.java |    1 -
 .../api/java/function/FlatMapFunction.scala     |    4 +-
 .../api/java/function/FlatMapFunction2.scala    |    4 +-
 .../spark/api/java/function/Function.java       |    8 +-
 .../spark/api/java/function/Function2.java      |    8 +-
 .../spark/api/java/function/Function3.java      |    8 +-
 .../api/java/function/PairFlatMapFunction.java  |   12 +-
 .../spark/api/java/function/PairFunction.java   |   12 +-
 .../org/apache/spark/api/python/PythonRDD.scala |   10 +-
 .../spark/api/python/PythonWorkerFactory.scala  |    4 +-
 .../org/apache/spark/deploy/ExecutorState.scala |    3 +-
 .../apache/spark/deploy/LocalSparkCluster.scala |   13 +-
 .../org/apache/spark/deploy/client/Client.scala |   48 +-
 .../spark/deploy/master/ApplicationState.scala  |    3 +-
 .../master/FileSystemPersistenceEngine.scala    |    6 +-
 .../org/apache/spark/deploy/master/Master.scala |   69 +-
 .../spark/deploy/master/RecoveryState.scala     |    4 +-
 .../spark/deploy/master/WorkerState.scala       |    4 +-
 .../master/ZooKeeperPersistenceEngine.scala     |    6 +-
 .../deploy/master/ui/ApplicationPage.scala      |    5 +-
 .../spark/deploy/master/ui/IndexPage.scala      |    4 +-
 .../spark/deploy/master/ui/MasterWebUI.scala    |    2 +-
 .../org/apache/spark/deploy/worker/Worker.scala |   48 +-
 .../spark/deploy/worker/ui/IndexPage.scala      |    5 +-
 .../spark/deploy/worker/ui/WorkerWebUI.scala    |   13 +-
 .../executor/CoarseGrainedExecutorBackend.scala |   23 +-
 .../org/apache/spark/executor/Executor.scala    |    2 +-
 .../spark/network/ConnectionManager.scala       |    8 +-
 .../spark/network/ConnectionManagerTest.scala   |    4 +-
 .../org/apache/spark/rdd/AsyncRDDActions.scala  |    3 +-
 .../scala/org/apache/spark/rdd/BlockRDD.scala   |    4 +-
 .../org/apache/spark/rdd/CartesianRDD.scala     |    3 +-
 .../org/apache/spark/rdd/CheckpointRDD.scala    |   14 +-
 .../org/apache/spark/rdd/CoalescedRDD.scala     |    3 +-
 .../apache/spark/rdd/DoubleRDDFunctions.scala   |    5 +-
 .../scala/org/apache/spark/rdd/EmptyRDD.scala   |    5 +-
 .../org/apache/spark/rdd/FilteredRDD.scala      |    3 +-
 .../org/apache/spark/rdd/FlatMappedRDD.scala    |    3 +-
 .../scala/org/apache/spark/rdd/GlommedRDD.scala |    3 +-
 .../scala/org/apache/spark/rdd/JdbcRDD.scala    |    4 +-
 .../org/apache/spark/rdd/MapPartitionsRDD.scala |    4 +-
 .../scala/org/apache/spark/rdd/MappedRDD.scala  |    4 +-
 .../apache/spark/rdd/OrderedRDDFunctions.scala  |   10 +-
 .../org/apache/spark/rdd/PairRDDFunctions.scala |   33 +-
 .../spark/rdd/ParallelCollectionRDD.scala       |    8 +-
 .../apache/spark/rdd/PartitionPruningRDD.scala  |    6 +-
 .../scala/org/apache/spark/rdd/PipedRDD.scala   |    3 +-
 .../main/scala/org/apache/spark/rdd/RDD.scala   |   64 +-
 .../apache/spark/rdd/RDDCheckpointData.scala    |    4 +-
 .../scala/org/apache/spark/rdd/SampledRDD.scala |    5 +-
 .../spark/rdd/SequenceFileRDDFunctions.scala    |   11 +-
 .../org/apache/spark/rdd/ShuffledRDD.scala      |    6 +-
 .../org/apache/spark/rdd/SubtractedRDD.scala    |    5 +-
 .../scala/org/apache/spark/rdd/UnionRDD.scala   |    7 +-
 .../apache/spark/rdd/ZippedPartitionsRDD.scala  |    9 +-
 .../scala/org/apache/spark/rdd/ZippedRDD.scala  |    6 +-
 .../apache/spark/scheduler/DAGScheduler.scala   |   13 +-
 .../apache/spark/scheduler/SchedulingMode.scala |    2 +-
 .../apache/spark/scheduler/TaskLocality.scala   |    4 +-
 .../scheduler/cluster/ClusterScheduler.scala    |    5 +-
 .../cluster/CoarseGrainedSchedulerBackend.scala |   23 +-
 .../cluster/SimrSchedulerBackend.scala          |    2 +-
 .../cluster/SparkDeploySchedulerBackend.scala   |    2 +-
 .../scheduler/cluster/TaskResultGetter.scala    |    4 +-
 .../mesos/CoarseMesosSchedulerBackend.scala     |    2 +-
 .../org/apache/spark/storage/BlockManager.scala |    7 +-
 .../spark/storage/BlockManagerMaster.scala      |   16 +-
 .../spark/storage/BlockManagerMasterActor.scala |    7 +-
 .../apache/spark/storage/ThreadingTest.scala    |    2 +-
 .../apache/spark/ui/jobs/JobProgressUI.scala    |    2 +-
 .../spark/ui/storage/BlockManagerUI.scala       |    2 +-
 .../scala/org/apache/spark/util/AkkaUtils.scala |   79 +-
 .../spark/util/IndestructibleActorSystem.scala  |   68 +
 .../org/apache/spark/util/MetadataCleaner.scala |    3 +-
 .../apache/spark/util/TimeStampedHashMap.scala  |    2 +-
 .../scala/org/apache/spark/util/Utils.scala     |    5 +-
 .../spark/util/collection/OpenHashMap.scala     |    3 +-
 .../spark/util/collection/OpenHashSet.scala     |   11 +-
 .../collection/PrimitiveKeyOpenHashMap.scala    |    7 +-
 .../spark/util/collection/PrimitiveVector.scala |    4 +-
 .../org/apache/spark/AccumulatorSuite.scala     |   32 +-
 .../org/apache/spark/CheckpointSuite.scala      |    5 +-
 .../org/apache/spark/DistributedSuite.scala     |    3 +-
 .../scala/org/apache/spark/DriverSuite.scala    |    2 +-
 .../apache/spark/MapOutputTrackerSuite.scala    |   14 +-
 .../scala/org/apache/spark/UnpersistSuite.scala |    2 +-
 .../scala/org/apache/spark/rdd/RDDSuite.scala   |    8 +-
 .../spark/scheduler/SparkListenerSuite.scala    |    2 +-
 .../cluster/TaskResultGetterSuite.scala         |    4 +-
 .../org/apache/spark/storage/BlockIdSuite.scala |    2 +-
 .../spark/storage/BlockManagerSuite.scala       |    2 +-
 .../scala/org/apache/spark/ui/UISuite.scala     |    1 -
 .../apache/spark/util/SizeEstimatorSuite.scala  |   72 +-
 docs/_config.yml                                |    2 +-
 docs/_plugins/copy_api_dirs.rb                  |    2 +-
 docs/configuration.md                           |   23 +-
 docs/running-on-yarn.md                         |   12 +-
 examples/pom.xml                                |   26 +-
 .../org/apache/spark/examples/JavaLogQuery.java |    2 +-
 .../org/apache/spark/examples/JavaPageRank.java |    3 +-
 .../apache/spark/examples/JavaWordCount.java    |    2 +-
 .../apache/spark/mllib/examples/JavaALS.java    |    1 -
 .../streaming/examples/ActorWordCount.scala     |    7 +-
 .../streaming/examples/ZeroMQWordCount.scala    |    8 +-
 mllib/pom.xml                                   |   12 +-
 .../spark/mllib/util/MFDataGenerator.scala      |    2 +-
 .../spark/mllib/clustering/JavaKMeansSuite.java |    4 +-
 .../mllib/recommendation/JavaALSSuite.java      |    2 -
 pom.xml                                         |  169 +-
 project/SparkBuild.scala                        |  145 +-
 pyspark                                         |    2 +-
 pyspark2.cmd                                    |    2 +-
 python/pyspark/rdd.py                           |    4 +-
 repl-bin/pom.xml                                |    8 +-
 repl-bin/src/deb/bin/run                        |    2 +-
 repl/lib/scala-jline.jar                        |  Bin 158463 -> 0 bytes
 repl/pom.xml                                    |   18 +-
 .../main/scala/org/apache/spark/repl/Main.scala |    8 +-
 .../org/apache/spark/repl/SparkExprTyper.scala  |  109 ++
 .../org/apache/spark/repl/SparkILoop.scala      |  944 +++++-----
 .../org/apache/spark/repl/SparkILoopInit.scala  |  143 ++
 .../org/apache/spark/repl/SparkIMain.scala      | 1681 ++++++++++--------
 .../org/apache/spark/repl/SparkISettings.scala  |   63 -
 .../org/apache/spark/repl/SparkImports.scala    |  108 +-
 .../spark/repl/SparkJLineCompletion.scala       |  206 ++-
 .../apache/spark/repl/SparkJLineReader.scala    |   65 +-
 .../apache/spark/repl/SparkMemberHandlers.scala |  109 +-
 .../scala/org/apache/spark/repl/ReplSuite.scala |  178 +-
 run-example                                     |    2 +-
 run-example2.cmd                                |    2 +-
 spark-class                                     |    4 +-
 streaming/pom.xml                               |   22 +-
 .../org/apache/spark/streaming/DStream.scala    |   37 +-
 .../spark/streaming/DStreamCheckpointData.scala |    6 +-
 .../spark/streaming/NetworkInputTracker.scala   |    2 +-
 .../spark/streaming/PairDStreamFunctions.scala  |   63 +-
 .../spark/streaming/StreamingContext.scala      |   44 +-
 .../spark/streaming/api/java/JavaDStream.scala  |    8 +-
 .../streaming/api/java/JavaDStreamLike.scala    |   79 +-
 .../streaming/api/java/JavaPairDStream.scala    |   93 +-
 .../api/java/JavaStreamingContext.scala         |  108 +-
 .../dstream/ConstantInputDStream.scala          |    3 +-
 .../streaming/dstream/FileInputDStream.scala    |   12 +-
 .../streaming/dstream/FilteredDStream.scala     |    3 +-
 .../dstream/FlatMapValuedDStream.scala          |    3 +-
 .../streaming/dstream/FlatMappedDStream.scala   |    3 +-
 .../streaming/dstream/FlumeInputDStream.scala   |    3 +-
 .../streaming/dstream/ForEachDStream.scala      |    3 +-
 .../streaming/dstream/GlommedDStream.scala      |    3 +-
 .../spark/streaming/dstream/InputDStream.scala  |    4 +-
 .../streaming/dstream/KafkaInputDStream.scala   |   23 +-
 .../streaming/dstream/MQTTInputDStream.scala    |    3 +-
 .../dstream/MapPartitionedDStream.scala         |    3 +-
 .../streaming/dstream/MapValuedDStream.scala    |    3 +-
 .../spark/streaming/dstream/MappedDStream.scala |    3 +-
 .../streaming/dstream/NetworkInputDStream.scala |   13 +-
 .../dstream/PluggableInputDStream.scala         |    3 +-
 .../streaming/dstream/QueueInputDStream.scala   |    4 +-
 .../streaming/dstream/RawInputDStream.scala     |    4 +-
 .../dstream/ReducedWindowedDStream.scala        |    9 +-
 .../streaming/dstream/ShuffledDStream.scala     |    3 +-
 .../streaming/dstream/SocketInputDStream.scala  |    6 +-
 .../spark/streaming/dstream/StateDStream.scala  |    4 +-
 .../streaming/dstream/TransformedDStream.scala  |    3 +-
 .../spark/streaming/dstream/UnionDStream.scala  |    5 +-
 .../streaming/dstream/WindowedDStream.scala     |    7 +-
 .../streaming/receivers/ActorReceiver.scala     |   35 +-
 .../streaming/receivers/ZeroMQReceiver.scala    |   13 +-
 .../streaming/util/MasterFailureTest.scala      |   45 +-
 .../apache/spark/streaming/JavaAPISuite.java    |   88 +-
 .../apache/spark/streaming/JavaTestUtils.scala  |   22 +-
 .../spark/streaming/CheckpointSuite.scala       |   36 +-
 .../apache/spark/streaming/TestSuiteBase.scala  |   29 +-
 tools/pom.xml                                   |   12 +-
 .../tools/JavaAPICompletenessChecker.scala      |    4 +-
 yarn/pom.xml                                    |   10 +-
 .../spark/deploy/yarn/WorkerLauncher.scala      |   17 +-
 .../deploy/yarn/YarnAllocationHandler.scala     |    4 +-
 197 files changed, 3470 insertions(+), 2905 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-spark/blob/d36ee3b1/pyspark
----------------------------------------------------------------------


[4/4] git commit: Merge pull request #256 from MLnick/master

Posted by jo...@apache.org.
Merge pull request #256 from MLnick/master

Fix 'IPYTHON=1 ./pyspark' throwing ValueError

This fixes an annoying issue where running ```IPYTHON=1 ./pyspark``` resulted in:

```
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 0.8.0
      /_/

Using Python version 2.7.5 (default, Jun 20 2013 11:06:30)
Spark context available as sc.
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
/usr/local/lib/python2.7/site-packages/IPython/utils/py3compat.pyc in execfile(fname, *where)
    202             else:
    203                 filename = fname
--> 204             __builtin__.execfile(filename, *where)

/Users/Nick/workspace/scala/spark-0.8.0-incubating-bin-hadoop1/python/pyspark/shell.py in <module>()
     30 add_files = os.environ.get("ADD_FILES").split(',') if os.environ.get("ADD_FILES") != None else None
     31
---> 32 sc = SparkContext(os.environ.get("MASTER", "local"), "PySparkShell", pyFiles=add_files)
     33
     34 print """Welcome to

/Users/Nick/workspace/scala/spark-0.8.0-incubating-bin-hadoop1/python/pyspark/context.pyc in __init__(self, master, jobName, sparkHome, pyFiles, environment, batchSize)
     70         with SparkContext._lock:
     71             if SparkContext._active_spark_context:
---> 72                 raise ValueError("Cannot run multiple SparkContexts at once")
     73             else:
     74                 SparkContext._active_spark_context = self

ValueError: Cannot run multiple SparkContexts at once
```

The issue arises because older IPython releases did not respect ```$PYTHONSTARTUP```, while 1.0.0 and later do, so the startup script ended up being executed twice. Technically this might break for older versions of IPython, but most users should be able to upgrade to at least 1.0.0 (and should be encouraged to do so :).
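
To make that failure mode concrete, here is a sketch of what the pre-fix launcher effectively did under IPython >= 1.0.0 (paths illustrative):

```
# Pre-fix behaviour (illustrative): shell.py executed twice -- once
# because IPython >= 1.0.0 now honors $PYTHONSTARTUP, and once via the
# explicit "-c %run". The second SparkContext() call then hit the
# singleton guard in context.py and raised the ValueError above.
PYTHONSTARTUP=python/pyspark/shell.py \
  ipython -i -c "%run python/pyspark/shell.py"
```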

New behaviour:
```
Nicks-MacBook-Pro:incubator-spark-mlnick Nick$ IPYTHON=1 ./pyspark
Python 2.7.5 (default, Jun 20 2013, 11:06:30)
Type "copyright", "credits" or "license" for more information.

IPython 1.1.0 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/Nick/workspace/scala/incubator-spark-mlnick/tools/target/scala-2.9.3/spark-tools-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Nick/workspace/scala/incubator-spark-mlnick/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
13/12/12 13:08:15 WARN Utils: Your hostname, Nicks-MacBook-Pro.local resolves to a loopback address: 127.0.0.1; using 10.0.0.4 instead (on interface en0)
13/12/12 13:08:15 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
13/12/12 13:08:15 INFO Slf4jEventHandler: Slf4jEventHandler started
13/12/12 13:08:15 INFO SparkEnv: Registering BlockManagerMaster
13/12/12 13:08:15 INFO DiskBlockManager: Created local directory at /var/folders/_l/06wxljt13wqgm7r08jlc44_r0000gn/T/spark-local-20131212130815-0e76
13/12/12 13:08:15 INFO MemoryStore: MemoryStore started with capacity 326.7 MB.
13/12/12 13:08:15 INFO ConnectionManager: Bound socket to port 53732 with id = ConnectionManagerId(10.0.0.4,53732)
13/12/12 13:08:15 INFO BlockManagerMaster: Trying to register BlockManager
13/12/12 13:08:15 INFO BlockManagerMasterActor$BlockManagerInfo: Registering block manager 10.0.0.4:53732 with 326.7 MB RAM
13/12/12 13:08:15 INFO BlockManagerMaster: Registered BlockManager
13/12/12 13:08:15 INFO HttpBroadcast: Broadcast server started at http://10.0.0.4:53733
13/12/12 13:08:15 INFO SparkEnv: Registering MapOutputTracker
13/12/12 13:08:15 INFO HttpFileServer: HTTP File server directory is /var/folders/_l/06wxljt13wqgm7r08jlc44_r0000gn/T/spark-8f40e897-8211-4628-a7a8-755562d5244c
13/12/12 13:08:16 INFO SparkUI: Started Spark Web UI at http://10.0.0.4:4040
2013-12-12 13:08:16.337 java[56801:4003] Unable to load realm info from SCDynamicStore
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 0.9.0-SNAPSHOT
      /_/

Using Python version 2.7.5 (default, Jun 20 2013 11:06:30)
Spark context available as sc.
```


Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/d2ced6d5
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/d2ced6d5
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/d2ced6d5

Branch: refs/heads/master
Commit: d2ced6d58c5e8aea23f909c2fc4ac11aa1b55607
Parents: c55e698 bb5277b
Author: Josh Rosen <jo...@apache.org>
Authored: Sun Dec 15 14:11:34 2013 -0800
Committer: Josh Rosen <jo...@apache.org>
Committed: Sun Dec 15 14:11:34 2013 -0800

----------------------------------------------------------------------
 pyspark | 8 ++++++--
 1 file changed, 6 insertions(+), 2 deletions(-)
----------------------------------------------------------------------