Posted to commits@spark.apache.org by sh...@apache.org on 2019/04/19 16:45:53 UTC
[spark] branch branch-2.3 updated:
[SPARK-25079][PYTHON][BRANCH-2.3] update python3 executable to 3.6.x
This is an automated email from the ASF dual-hosted git repository.
shaneknapp pushed a commit to branch branch-2.3
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-2.3 by this push:
new a85ab12 [SPARK-25079][PYTHON][BRANCH-2.3] update python3 executable to 3.6.x
a85ab12 is described below
commit a85ab120e3d29a323e6d28aa307d4c20ee5f2c6c
Author: shane knapp <in...@gmail.com>
AuthorDate: Fri Apr 19 09:45:40 2019 -0700
[SPARK-25079][PYTHON][BRANCH-2.3] update python3 executable to 3.6.x
## What changes were proposed in this pull request?
Have Jenkins test against Python 3.6 (instead of 3.4).
## How was this patch tested?
Extensive testing on both the CentOS and Ubuntu Jenkins workers revealed that 2.3 probably doesn't like Python 3.6... :(
NOTE: this is just for branch-2.3
PLEASE DO NOT MERGE
Author: shane knapp <in...@gmail.com>
Closes #24380 from shaneknapp/update-python-executable-2.3.
---
python/run-tests.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/python/run-tests.py b/python/run-tests.py
index 3539c76..d571855 100755
--- a/python/run-tests.py
+++ b/python/run-tests.py
@@ -114,7 +114,7 @@ def run_individual_python_test(test_name, pyspark_python):

 def get_default_python_executables():
-    python_execs = [x for x in ["python2.7", "python3.4", "pypy"] if which(x)]
+    python_execs = [x for x in ["python2.7", "python3.6", "pypy"] if which(x)]
     if "python2.7" not in python_execs:
         LOGGER.warning("Not testing against `python2.7` because it could not be found; falling"
                        " back to `python` instead")
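The selection logic touched by this one-line change can be sketched in isolation. The standalone version below substitutes the standard library's `shutil.which` for the script's own `which` helper and a plain `print` for its `LOGGER.warning`; it is an illustrative sketch, not the file as shipped:

```python
import shutil

def get_default_python_executables():
    # Keep only the interpreters actually installed on this worker.
    # After this patch, the branch-2.3 candidate list names python3.6.
    candidates = ["python2.7", "python3.6", "pypy"]
    python_execs = [x for x in candidates if shutil.which(x)]
    if "python2.7" not in python_execs:
        # Mirror the script's fallback: warn, then test against plain `python`.
        print("Not testing against `python2.7` because it could not be found;"
              " falling back to `python` instead")
        python_execs.insert(0, "python")
    return python_execs

print(get_default_python_executables())
```

On a worker with none of the named interpreters installed, the function degrades to testing against whatever `python` resolves to, which is the behavior the warning message describes.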