Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/12/11 10:39:00 UTC

[jira] [Resolved] (SPARK-20528) Add BinaryFileReader and Writer for DataFrames

     [ https://issues.apache.org/jira/browse/SPARK-20528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-20528.
----------------------------------
    Resolution: Won't Fix

[~josephkb], let me leave this resolved for now. Please reopen it if you think this should be fixed.

> Add BinaryFileReader and Writer for DataFrames
> ----------------------------------------------
>
>                 Key: SPARK-20528
>                 URL: https://issues.apache.org/jira/browse/SPARK-20528
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Joseph K. Bradley
>            Priority: Major
>         Attachments: part-00000-5ae00646-8400-4b45-aa6f-d6f27068972c-c000.json, stocklist.json, stocklist.pdub
>
>
> It would be very useful to have a binary data reader/writer for DataFrames, presumably called via {{spark.read.binaryFiles}}, etc.
> Currently, going through RDDs is annoying since it requires different code paths for Scala vs Python:
> Scala:
> {code}
> // Scala: sc.binaryFiles returns RDD[(String, PortableDataStream)]
> val binaryFilesRDD = sc.binaryFiles("mypath")
> val binaryFilesDF = spark.createDataFrame(binaryFilesRDD)
> {code}
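> For comparison, a minimal sketch that materializes the bytes on the Scala side too, so the resulting column is BinaryType like in the Python path below (the {{path}}/{{content}} column names are assumptions):
> {code}
> import spark.implicits._
> // PortableDataStream.toArray() reads the whole file into an Array[Byte]
> val bytesDF = sc.binaryFiles("mypath")
>   .map { case (path, stream) => (path, stream.toArray()) }
>   .toDF("path", "content")
> {code}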
> Python:
> {code}
> # Python: sc.binaryFiles returns an RDD of (path, file contents as bytes)
> binaryFilesRDD = sc.binaryFiles("mypath")
> # Recast the contents to bytearray so createDataFrame maps them to BinaryType
> binaryFilesRDD_recast = binaryFilesRDD.map(lambda x: (x[0], bytearray(x[1])))
> binaryFilesDF = spark.createDataFrame(binaryFilesRDD_recast)
> {code}
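> The Python variant can also name the columns explicitly (the names are assumptions, chosen to match the Scala sketch above):
> {code}
> binaryFilesDF = spark.createDataFrame(binaryFilesRDD_recast, ["path", "content"])
> {code}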
> This is because Scala and Python {{sc.binaryFiles}} return different types, which makes sense in RDD land but not DataFrame land.
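> With the conversions above, both languages land on the same shape (a sketch, assuming the {{path}}/{{content}} names):
> {code}
> binaryFilesDF.printSchema()
> # root
> #  |-- path: string (nullable = true)
> #  |-- content: binary (nullable = true)
> {code}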
> My motivation here is working with images in Spark.



