Posted to issues@spark.apache.org by "Rob O'Dwyer (JIRA)" <ji...@apache.org> on 2014/08/28 00:04:58 UTC
[jira] [Created] (SPARK-3265) Allow using custom ipython executable with pyspark
Rob O'Dwyer created SPARK-3265:
----------------------------------
Summary: Allow using custom ipython executable with pyspark
Key: SPARK-3265
URL: https://issues.apache.org/jira/browse/SPARK-3265
Project: Spark
Issue Type: Improvement
Components: PySpark
Affects Versions: 1.0.2
Reporter: Rob O'Dwyer
Priority: Minor
Fix For: 1.0.2
Although you can make pyspark use IPython with IPYTHON=1, and can also change the python executable with PYSPARK_PYTHON=..., you can't use both at the same time because the launcher hardcodes the default ipython script.
The proposed change makes the IPython code path use the PYSPARK_PYTHON variable if it is set, falling back to the default otherwise, just as the plain python executable is already handled.
So you can use a custom ipython like so:
{{PYSPARK_PYTHON=./anaconda/bin/ipython IPYTHON_OPTS="notebook" pyspark}}
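The launcher logic described above can be sketched roughly as follows (a minimal sketch, not the actual {{bin/pyspark}} script; the function name {{pick_launch_command}} is hypothetical, while the environment variables IPYTHON, IPYTHON_OPTS, and PYSPARK_PYTHON are the real ones the issue refers to):

```shell
# Sketch of the proposed interpreter selection: honor PYSPARK_PYTHON in
# IPython mode instead of hardcoding the "ipython" script.
pick_launch_command() {
  # Fall back to plain "python" when PYSPARK_PYTHON is unset or empty,
  # mirroring how the default python executable is handled.
  exe="${PYSPARK_PYTHON:-python}"
  if [ "$IPYTHON" = "1" ]; then
    # Use the user-supplied interpreter (e.g. a custom ipython),
    # appending IPYTHON_OPTS only when it is non-empty.
    printf '%s\n' "$exe${IPYTHON_OPTS:+ $IPYTHON_OPTS}"
  else
    printf '%s\n' "$exe"
  fi
}
```

With this, the example invocation above would launch {{./anaconda/bin/ipython notebook}} rather than the hardcoded default ipython.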
--
This message was sent by Atlassian JIRA
(v6.2#6252)