Posted to commits@spark.apache.org by gu...@apache.org on 2019/02/08 02:43:43 UTC
[spark] branch master updated: [SPARK-26831][PYTHON] Eliminates Python version check for executor at driver side when using IPython
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new d893e3c [SPARK-26831][PYTHON] Eliminates Python version check for executor at driver side when using IPython
d893e3c is described below
commit d893e3c5d3875bcfe68d80d502d227a51a68203a
Author: Stefaan Lippens <so...@gmail.com>
AuthorDate: Fri Feb 8 10:43:17 2019 +0800
[SPARK-26831][PYTHON] Eliminates Python version check for executor at driver side when using IPython
## What changes were proposed in this pull request?
I was trying out pyspark on a system with only a `python3` command but no `python` command and got this error:
```bash
/opt/spark/bin/pyspark: line 45: python: command not found
```
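As a side note for anyone hitting the same error: on a system with only a `python3` command, a hypothetical workaround is to point both interpreter settings at `python3` before launching (the variable names are the real PySpark settings used in the script below; the launch path is illustrative):

```bash
# Hypothetical workaround for a python3-only system: point both the
# executor and driver interpreter settings at python3 before launching.
export PYSPARK_PYTHON=python3          # interpreter used by executors
export PYSPARK_DRIVER_PYTHON=python3   # interpreter used by the driver
# then launch as usual:
# /opt/spark/bin/pyspark
```

`PYSPARK_DRIVER_PYTHON` can also point at a different front end such as `ipython3` to get an IPython shell on the driver only.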
I also noticed that the bash syntax for the IPython version check is wrong: `[[ ! $WORKS_WITH_IPYTHON ]]` tests for emptiness, and since the version probe always prints the non-empty string `True` or `False`, the check always evaluates to false.
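The broken check is easy to reproduce in isolation: `[[ ! $var ]]` only succeeds when `$var` is empty, and both `True` and `False` are non-empty strings, so the guard never fires either way (presumably the intent was a string comparison like `[[ $WORKS_WITH_IPYTHON != True ]]`):

```bash
# $WORKS_WITH_IPYTHON always holds "True" or "False" -- both non-empty.
# [[ ! $v ]] only succeeds for an empty string, so the guard never fires.
for v in True False; do
  if [[ ! $v ]]; then
    echo "$v -> fires"
  else
    echo "$v -> skipped"
  fi
done
# prints:
# True -> skipped
# False -> skipped
```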
This PR simply eliminates the driver-side Python version check for executors when using IPython.
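The resulting fallback chain in the patch below can be written compactly with parameter expansion (a sketch of the equivalent logic, not the literal script): executors default to `python`, and the driver now defaults to whatever the executors use rather than the other way around.

```bash
unset PYSPARK_PYTHON PYSPARK_DRIVER_PYTHON   # start clean for the demo
# Executors fall back to `python`; the driver falls back to the
# executor interpreter, so a single override covers both.
PYSPARK_PYTHON="${PYSPARK_PYTHON:-python}"
PYSPARK_DRIVER_PYTHON="${PYSPARK_DRIVER_PYTHON:-$PYSPARK_PYTHON}"
echo "$PYSPARK_PYTHON / $PYSPARK_DRIVER_PYTHON"
# prints: python / python
```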
## How was this patch tested?
I manually tested the `pyspark` launch script and the bash syntax fix.
Closes #23736 from soxofaan/master.
Authored-by: Stefaan Lippens <so...@gmail.com>
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
bin/pyspark | 21 ++++++---------------
1 file changed, 6 insertions(+), 15 deletions(-)
diff --git a/bin/pyspark b/bin/pyspark
index 1dcddcc..44891ae 100755
--- a/bin/pyspark
+++ b/bin/pyspark
@@ -38,22 +38,15 @@ if [[ -n "$IPYTHON" || -n "$IPYTHON_OPTS" ]]; then
fi
# Default to standard python interpreter unless told otherwise
-if [[ -z "$PYSPARK_DRIVER_PYTHON" ]]; then
- PYSPARK_DRIVER_PYTHON="${PYSPARK_PYTHON:-"python"}"
-fi
-
-WORKS_WITH_IPYTHON=$(python -c 'import sys; print(sys.version_info >= (2, 7, 0))')
-
-# Determine the Python executable to use for the executors:
if [[ -z "$PYSPARK_PYTHON" ]]; then
- if [[ $PYSPARK_DRIVER_PYTHON == *ipython* && ! $WORKS_WITH_IPYTHON ]]; then
- echo "IPython requires Python 2.7+; please install python2.7 or set PYSPARK_PYTHON" 1>&2
- exit 1
- else
- PYSPARK_PYTHON=python
- fi
+ PYSPARK_PYTHON=python
+fi
+if [[ -z "$PYSPARK_DRIVER_PYTHON" ]]; then
+ PYSPARK_DRIVER_PYTHON=$PYSPARK_PYTHON
fi
export PYSPARK_PYTHON
+export PYSPARK_DRIVER_PYTHON
+export PYSPARK_DRIVER_PYTHON_OPTS
# Add the PySpark classes to the Python path:
export PYTHONPATH="${SPARK_HOME}/python/:$PYTHONPATH"
@@ -72,6 +65,4 @@ if [[ -n "$SPARK_TESTING" ]]; then
exit
fi
-export PYSPARK_DRIVER_PYTHON
-export PYSPARK_DRIVER_PYTHON_OPTS
exec "${SPARK_HOME}"/bin/spark-submit pyspark-shell-main --name "PySparkShell" "$@"