Posted to issues@spark.apache.org by "Xiang Gao (JIRA)" <ji...@apache.org> on 2016/08/22 14:45:21 UTC

[jira] [Commented] (SPARK-17185) Unify naming of API for RDD and Dataset

    [ https://issues.apache.org/jira/browse/SPARK-17185?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15430897#comment-15430897 ] 

Xiang Gao commented on SPARK-17185:
-----------------------------------

Changing the existing API is a bad idea, and we should not do it.

The following changes might help (I'm not sure); a rough sketch follows the list:

* Add {{aggregateByKey}} and {{aggregateBy}} to {{Dataset}}; they would do exactly what {{groupByKey}} and {{groupBy}} do now.

* The return values of {{aggregateByKey}} and {{aggregateBy}} should be two new classes, {{KeyValueAggregatedDataset}} and {{RelationalAggregatedDataset}}, which would be copies of today's {{KeyValueGroupedDataset}} and {{RelationalGroupedDataset}}.

* Add new methods that return a key-list pair to {{KeyValueGroupedDataset}} and {{RelationalGroupedDataset}}, and maybe deprecate the aggregation methods in these two classes.
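
A rough sketch of what this could look like against the current API (the {{aggregateByKey}} name and the key-list method are hypothetical proposals, not existing API; {{mapGroups}} is only used here to emulate the proposed behavior):

{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val ds = Seq(("a", 1), ("a", 2), ("b", 3)).toDS()

// The proposed ds.aggregateByKey(_._1) would behave exactly like today's:
ds.groupByKey(_._1).count().show()   // (a, 2), (b, 1)

// The proposed key-list method on KeyValueGroupedDataset could be
// emulated today with mapGroups:
ds.groupByKey(_._1)
  .mapGroups((k, vs) => (k, vs.map(_._2).toSeq))
  .show()                            // (a, [1, 2]), (b, [3])
{code}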

> Unify naming of API for RDD and Dataset
> ---------------------------------------
>
>                 Key: SPARK-17185
>                 URL: https://issues.apache.org/jira/browse/SPARK-17185
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core, SQL
>            Reporter: Xiang Gao
>            Priority: Minor
>
> In RDD, groupByKey is used to generate a key-list pair and aggregateByKey is used to do aggregation.
> In Dataset, aggregation is done by groupBy and groupByKey, and no API for a key-list pair is provided.
> The same name "groupBy" is used for different things, which can be confusing. Besides, it would be more convenient if Dataset provided an API to generate key-list pairs.
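> For example (an illustrative sketch in a spark-shell session, not part of the original report):
> {code:scala}
> val rdd = sc.parallelize(Seq(("a", 1), ("a", 2), ("b", 3)))
> rdd.groupByKey().collect()                     // key-list: (a, [1, 2]), (b, [3])
> rdd.aggregateByKey(0)(_ + _, _ + _).collect()  // aggregation: (a, 3), (b, 3)
>
> // In Dataset, the same name starts an aggregation instead:
> val ds = Seq(("a", 1), ("a", 2), ("b", 3)).toDS()
> ds.groupByKey(_._1).count().show()             // (a, 2), (b, 1); no key-list API
> {code}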



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
