Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/05/03 14:16:04 UTC

[jira] [Commented] (SPARK-20580) Allow RDD cache with unserializable objects

    [ https://issues.apache.org/jira/browse/SPARK-20580?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15994947#comment-15994947 ] 

Sean Owen commented on SPARK-20580:
-----------------------------------

In general, such objects need to be serializable, because otherwise there's no way to move the data, and almost anything you do involves moving the data. You might be able to get away with it in scenarios where the objects are created from a data source into memory and never participate in any operation that involves serialization. Here, it sounds like you chose one of the "_SER" storage levels, which serialize the objects into memory. If so, that's your problem.
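
A minimal sketch of the distinction in the Scala API (assuming an existing rdd of objects that are not serializable; this illustration is not from the original thread):

    import org.apache.spark.storage.StorageLevel

    // Deserialized caching: the objects stay as-is on the JVM heap, so they
    // do not need to be serializable merely to be cached and re-read locally.
    rdd.persist(StorageLevel.MEMORY_ONLY)

    // A "_SER" level such as MEMORY_ONLY_SER instead stores each partition as
    // serialized byte buffers, which requires every element to be serializable.
    // (A storage level cannot be changed once assigned, so in practice this
    // would apply to a different RDD.)
    // rdd.persist(StorageLevel.MEMORY_ONLY_SER)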

> Allow RDD cache with unserializable objects
> -------------------------------------------
>
>                 Key: SPARK-20580
>                 URL: https://issues.apache.org/jira/browse/SPARK-20580
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.3.0
>            Reporter: Fernando Pereira
>            Priority: Minor
>
> In my current scenario we load complex Python objects in the worker nodes that are not completely serializable. We then apply certain map operations to the RDD, which at some point we collect. In this basic usage everything works well.
> However, if we cache() the RDD (which defaults to memory), it suddenly fails to execute the transformations after the caching step. Apparently caching serializes the RDD data and deserializes it whenever further transformations are required.
> It would be nice to avoid serialization of the objects when they are cached to memory, and keep the original objects instead.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org