Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2018/09/21 06:40:00 UTC

[jira] [Commented] (SPARK-22739) Additional Expression Support for Objects

    [ https://issues.apache.org/jira/browse/SPARK-22739?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16623151#comment-16623151 ] 

Wenchen Fan commented on SPARK-22739:
-------------------------------------

I'm closing this ticket, since Avro is now a built-in data source. We can create another ticket to support Avro records in Dataset.

> Additional Expression Support for Objects
> -----------------------------------------
>
>                 Key: SPARK-22739
>                 URL: https://issues.apache.org/jira/browse/SPARK-22739
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Aleksander Eskilson
>            Priority: Major
>
> Some discussion in Spark-Avro [1] motivates additions and minor changes to the {{Objects}} Expressions API [2]. The proposed changes include:
> * a generalized form of {{initializeJavaBean}} taking a sequence of initialization expressions that can be applied to instances of varying objects
> * an object cast that performs a simple Java type cast against a value
> * making {{ExternalMapToCatalyst}} public, for use in outside libraries
> These changes would facilitate writing custom encoders for objects that cannot readily be converted to a statically typed Dataset by the JavaBean encoder (e.g. Avro records).
> [1] -- https://github.com/databricks/spark-avro/pull/217#issuecomment-342599110
> [2] --
>  https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala
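The generalized {{initializeJavaBean}} proposed above can be pictured, outside of Catalyst, as: construct an instance, then apply an arbitrary sequence of initialization steps to it. The following plain-Java sketch shows only that shape; all names here ({{InitSketch}}, {{initializeInstance}}, {{Record}}) are illustrative and are not the actual Spark expression API.

```java
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Supplier;

public class InitSketch {

    // Generalized initializer: build an instance via the supplier, then
    // apply each initialization step in order. This mirrors the idea of
    // initializeJavaBean taking a sequence of initialization expressions
    // that can target varying object types.
    static <T> T initializeInstance(Supplier<T> create, List<Consumer<T>> steps) {
        T obj = create.get();
        for (Consumer<T> step : steps) {
            step.accept(obj);
        }
        return obj;
    }

    // A plain bean-like class, used only for illustration.
    static class Record {
        String name;
        int count;
    }

    public static void main(String[] args) {
        Record rec = initializeInstance(Record::new, List.of(
                (Record r) -> { r.name = "avro-record"; },
                (Record r) -> { r.count = 3; }
        ));
        System.out.println(rec.name + ":" + rec.count);
    }
}
```

In the actual proposal the steps would be Catalyst {{Expression}} nodes evaluated during deserialization rather than Java lambdas, which is what lets an outside library (such as an Avro encoder) describe initialization for object types the built-in JavaBean encoder does not handle.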



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org