Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2016/01/05 22:39:39 UTC

[jira] [Comment Edited] (SPARK-6883) Fork pyspark's cloudpickle as a separate dependency

    [ https://issues.apache.org/jira/browse/SPARK-6883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15083861#comment-15083861 ] 

Josh Rosen edited comment on SPARK-6883 at 1/5/16 9:39 PM:
-----------------------------------------------------------

Closing as "Later" for now. Let's file a separate issue later down the line in case we want to explore having Spark depend on the cloudpipe/cloudpickle fork.


was (Author: joshrosen):
Closing as "Later" for now. Let's file a separate issue later down the line in case we want to explore having Spark depend on the cloudpickle/cloudpickle fork.

> Fork pyspark's cloudpickle as a separate dependency
> ---------------------------------------------------
>
>                 Key: SPARK-6883
>                 URL: https://issues.apache.org/jira/browse/SPARK-6883
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: Kyle Kelley
>              Labels: fork
>
> IPython, pyspark, and picloud/multyvac/cloudpipe all rely on cloudpickle from various sources (the cloud, pyspark, and multyvac packages, respectively). It would be great to have this as a separately maintained project that can:
> * Work with Python 3
> * Add tests!
> * Use higher pickle protocols (when on Python 3)
> * Be installed with pip (see the sketch below)
> We're starting this off at the PyCon sprints under https://github.com/cloudpipe/cloudpickle. We'd like to coordinate with PySpark to make it work across all the above-mentioned projects.
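> A minimal sketch of the intended usage, assuming the fork is published to PyPI under the name cloudpickle (so {{pip install cloudpickle}} works) and keeps the stdlib-style {{dumps}}/{{loads}} entry points:
> {code:python}
> import pickle
>
> import cloudpickle  # assumed pip-installed fork from cloudpipe/cloudpickle
>
> # Ordinary pickle refuses lambdas and interactively defined functions;
> # cloudpickle serializes them by value instead of by reference.
> square = lambda x: x * x
> payload = cloudpickle.dumps(square)
>
> # On Python 3, a higher pickle protocol can be requested explicitly.
> payload_p2 = cloudpickle.dumps(square, protocol=2)
>
> # The result round-trips with the standard pickle module.
> restored = pickle.loads(payload)
> assert restored(4) == 16
> {code}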



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org