Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/08/01 18:12:00 UTC

[jira] [Commented] (SPARK-21573) Tests failing with run-tests.py SyntaxError occasionally in Jenkins

    [ https://issues.apache.org/jira/browse/SPARK-21573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16109440#comment-16109440 ] 

Sean Owen commented on SPARK-21573:
-----------------------------------

FWIW, I noticed it fails on 4 and 7 but not on 6. Go ahead with that PR when ready :) It's failing about half of the builds right now, unfortunately.

> Tests failing with run-tests.py SyntaxError occasionally in Jenkins
> -------------------------------------------------------------------
>
>                 Key: SPARK-21573
>                 URL: https://issues.apache.org/jira/browse/SPARK-21573
>             Project: Spark
>          Issue Type: Test
>          Components: Tests
>    Affects Versions: 2.3.0
>            Reporter: Hyukjin Kwon
>            Priority: Minor
>
> It looks like the default {{python}} on the path, which a few places such as {{./dev/run-tests}} rely on, is Python 2.6 in Jenkins, and it fails to execute {{run-tests.py}}:
> {code}
> python2.6 run-tests.py
>   File "run-tests.py", line 124
>     {m: set(m.dependencies).intersection(modules_to_test) for m in modules_to_test}, sort=True)
>                                                             ^
> SyntaxError: invalid syntax
> {code}
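> For context, the dict-comprehension syntax in that line was only added in Python 2.7, so Python 2.6 fails at parse time. A minimal sketch of a 2.6-compatible equivalent (only illustrating the syntax; the names just mirror the snippet above):
> {code}
> # Python 2.7+ only: dict comprehension (rejected by Python 2.6's parser)
> # {m: set(m.dependencies).intersection(modules_to_test) for m in modules_to_test}
>
> # Python 2.6-compatible: dict() over a generator expression
> dict((m, set(m.dependencies).intersection(modules_to_test))
>      for m in modules_to_test)
> {code}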
> It looks like there are quite a few places that would need fixing to support Python 2.6 in {{run-tests.py}} and the related Python scripts.
> Instead, we might just have the few scripts that run this use Python 2.7 when it is available (see the sketch at the end of this description).
> Please also see http://apache-spark-developers-list.1001551.n3.nabble.com/Tests-failing-with-run-tests-py-SyntaxError-td22030.html
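> As a rough, hypothetical sketch of the "use Python 2.7 if available" idea above (not an actual patch; the wrapper file and the lookup via {{find_executable}} are assumptions):
> {code}
> # Hypothetical wrapper: prefer python2.7 if it is on the PATH, otherwise fall
> # back to whatever "python" resolves to, and run run-tests.py with it.
> import sys
> import subprocess
> from distutils.spawn import find_executable
>
> python = find_executable("python2.7") or find_executable("python") or sys.executable
> sys.exit(subprocess.call([python, "run-tests.py"] + sys.argv[1:]))
> {code}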



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org