Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/05/19 17:20:00 UTC

[jira] [Assigned] (SPARK-30983) Support more than 5 typed column in typed Dataset.select API

     [ https://issues.apache.org/jira/browse/SPARK-30983?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-30983:
------------------------------------

    Assignee: Apache Spark

> Support more than 5 typed column in typed Dataset.select API
> ------------------------------------------------------------
>
>                 Key: SPARK-30983
>                 URL: https://issues.apache.org/jira/browse/SPARK-30983
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: L. C. Hsieh
>            Assignee: Apache Spark
>            Priority: Major
>
> Because Dataset only provides overloaded typed select APIs for up to 5 typed columns, a select call with more than 5 typed columns resolves to the untyped API.
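> For illustration, a minimal sketch of the overload behavior (column names made up; assumes a SparkSession named spark and import spark.implicits._ in scope):
>
>     import org.apache.spark.sql.{DataFrame, Dataset}
>
>     case class Rec(a: Int, b: Int, c: Int, d: Int, e: Int, f: Int)
>     val ds: Dataset[Rec] = Seq(Rec(1, 2, 3, 4, 5, 6)).toDS()
>
>     // 5 typed columns: a typed select overload applies, static types are kept.
>     val five: Dataset[(Int, Int, Int, Int, Int)] =
>       ds.select($"a".as[Int], $"b".as[Int], $"c".as[Int], $"d".as[Int], $"e".as[Int])
>
>     // 6 typed columns: no typed overload exists, so the call resolves to the
>     // untyped select(cols: Column*) and returns a DataFrame (Dataset[Row]).
>     val six: DataFrame =
>       ds.select($"a".as[Int], $"b".as[Int], $"c".as[Int],
>                 $"d".as[Int], $"e".as[Int], $"f".as[Int])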
> Currently users cannot call typed select with more than 5 typed columns. There are a few options:
> 1. Expose Dataset.selectUntyped (possibly renamed) to accept any number of typed columns (at most 22 in practice, due to the limit of ExpressionEncoder.tuple). Pros: little new code needs to be added to Dataset. Cons: the return type is the generic Dataset[_], not a specific one like Dataset[(U1, U2)] as with the overloaded methods.
> 2. Add more overloaded typed select APIs, up to 22 typed column inputs (sketched below). Pros: clear return types. Cons: a lot of code added to Dataset for a corner case, and it can be a breaking change for existing user code that calls the untyped select API with more than 5 typed columns, since such calls would start resolving to the new typed overloads and return differently-typed results.
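> For illustration, a sketch of one such overload under option 2, as it might appear inside Dataset[T]; the shape is assumed from the existing 2- to 5-column overloads, not actual source:
>
>     def select[U1, U2, U3, U4, U5, U6](
>         c1: TypedColumn[T, U1],
>         c2: TypedColumn[T, U2],
>         c3: TypedColumn[T, U3],
>         c4: TypedColumn[T, U4],
>         c5: TypedColumn[T, U5],
>         c6: TypedColumn[T, U6]): Dataset[(U1, U2, U3, U4, U5, U6)] =
>       // Delegate to the internal varargs selectUntyped and cast the result,
>       // as the existing typed overloads do. Option 1 would instead expose
>       // selectUntyped itself and return the generic Dataset[_].
>       selectUntyped(c1, c2, c3, c4, c5, c6)
>         .asInstanceOf[Dataset[(U1, U2, U3, U4, U5, U6)]]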



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org