Posted to issues@spark.apache.org by "Naveen Pishe (JIRA)" <ji...@apache.org> on 2016/06/20 17:31:05 UTC

[jira] [Created] (SPARK-16072) Map operation on JavaPairDStream throws Task not serializable Exception

Naveen Pishe created SPARK-16072:
------------------------------------

             Summary: Map operation on JavaPairDStream throws Task not serializable Exception
                 Key: SPARK-16072
                 URL: https://issues.apache.org/jira/browse/SPARK-16072
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 1.3.0, 1.4.0, 1.5.0
            Reporter: Naveen Pishe


There is a requirement to save "user-defined" metadata as part of the SequenceFile "header" using Spark.

To write user-defined metadata into a SequenceFile header using the regular Hadoop APIs, I pass the metadata object to the SequenceFile.Writer constructor, which ensures the metadata becomes part of the sequence file header when the file is created.
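For reference, the plain-Hadoop approach described above can be sketched as follows. This is a minimal sketch, not the reporter's actual code: the file path, keys, and metadata entries are illustrative, and it requires hadoop-common on the classpath.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class MetadataWriteSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // User-defined metadata; the key/value pairs here are illustrative.
        SequenceFile.Metadata metadata = new SequenceFile.Metadata();
        metadata.set(new Text("author"), new Text("example-user"));
        metadata.set(new Text("schema-version"), new Text("1"));

        // The metadata is passed at writer-creation time and is serialized
        // into the file header when the SequenceFile is created.
        try (SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(new Path("/tmp/with-metadata.seq")),
                SequenceFile.Writer.keyClass(IntWritable.class),
                SequenceFile.Writer.valueClass(Text.class),
                SequenceFile.Writer.metadata(metadata))) {
            writer.append(new IntWritable(1), new Text("value-1"));
        }
    }
}
```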

Currently, Spark's JavaPairRDD API provides methods to save an RDD in SequenceFile format, but I don't see any API that either exposes the SequenceFile.Writer or accepts the user-defined metadata so that it can be written as part of the sequence file header.
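The current Java save path looks roughly like the following sketch (local master, output path, and sample data are all illustrative): the SequenceFile is produced by the output format, and none of the parameters can carry a SequenceFile.Metadata object.

```java
import java.util.Arrays;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.SequenceFileOutputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

public class SaveWithoutMetadataSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("seq-save");
        JavaSparkContext jsc = new JavaSparkContext(conf);
        try {
            JavaPairRDD<Integer, String> pairs = jsc.parallelizePairs(Arrays.asList(
                    new Tuple2<>(1, "value-1"),
                    new Tuple2<>(2, "value-2")));

            // Convert to Writable types inside the task, then save.
            // Nothing in this call chain accepts header metadata.
            pairs.mapToPair(t -> new Tuple2<>(new IntWritable(t._1), new Text(t._2)))
                 .saveAsHadoopFile("/tmp/no-metadata.seq",
                         IntWritable.class, Text.class,
                         SequenceFileOutputFormat.class);
        } finally {
            jsc.stop();
        }
    }
}
```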

This enhancement requests an API that supports writing such metadata.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org