Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/06/02 03:47:04 UTC

[jira] [Commented] (SPARK-20149) Audit PySpark code base for 2.6 specific work arounds

    [ https://issues.apache.org/jira/browse/SPARK-20149?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16034090#comment-16034090 ] 

Hyukjin Kwon commented on SPARK-20149:
--------------------------------------

[~holdenk], I quickly looked through the Python 2.7 changes in http://svn.python.org/projects/python/tags/r27/Misc/NEWS. Most entries are about the words "backport" and "deprecate". IMHO, it is fine to resolve this issue for now. Even if some 2.6-specific workarounds remain, I guess there would not be many instances.

The notable changes I could identify are:

- Issue #2335: Backport set literals syntax from Python 3.x. (for example, {1, 2} for a set)

- Issue #2333: Backport set and dict comprehensions syntax from Python 3.x. (for example {x : 1 for x in [1, 2, 3]} for a dict)

However, I guess these do not require any code changes; a short sketch of the two syntaxes follows below.
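For reference, a minimal sketch of the two backported syntaxes next to their 2.6-compatible spellings (plain Python; the variable names are only illustrative and not tied to any PySpark code path):

{code:python}
# Set literal (Python 3 syntax backported in 2.7, issue #2335).
# The 2.6-compatible spelling is set([1, 2]).
s = {1, 2}

# Set and dict comprehensions (backported in 2.7, issue #2333).
# On 2.6 these would be written as set(x * x for x in [1, 2, 3])
# and dict((x, 1) for x in [1, 2, 3]).
squares = {x * x for x in [1, 2, 3]}
ones = {x: 1 for x in [1, 2, 3]}
{code}

Since these are syntax additions rather than removals, code written for 2.6 keeps working on 2.7; adopting the new syntax is optional cleanup, not a required change.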

> Audit PySpark code base for 2.6 specific work arounds
> -----------------------------------------------------
>
>                 Key: SPARK-20149
>                 URL: https://issues.apache.org/jira/browse/SPARK-20149
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 2.2.0
>            Reporter: holdenk
>
> We should determine which areas in PySpark have 2.6-specific workarounds and create issues for them. The audit can be started during 2.2.0, but cleaning up all the 2.6-specific code is likely too much to try to get in, so the actual fixing should probably be considered for 2.2.1 or 2.3 (unless 2.2.0 is delayed).



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org