Posted to issues@spark.apache.org by "Koert Kuipers (JIRA)" <ji...@apache.org> on 2017/02/05 17:34:41 UTC
[jira] [Updated] (SPARK-19428) Ability to select first row of groupby
[ https://issues.apache.org/jira/browse/SPARK-19428?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Koert Kuipers updated SPARK-19428:
----------------------------------
If I remember correctly, a window function is like a canned secondary sort
with limited functionality, meaning this all gets pushed to the reducer. So
that is great for selecting a large number of rows, but somewhat inefficient
for taking, say, the first 10 (compared to an Aggregator).
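(For illustration only, in stdlib Python rather than the Spark API: a sketch of the bounded per-group buffer an Aggregator-style approach could maintain, so each group keeps at most n rows instead of ranking every row the way a window function does. The function name and the (key, value) row shape here are assumptions, not anything from Spark.)

```python
# Hypothetical sketch, NOT Spark code: keep only the n smallest values per
# key using a bounded buffer, instead of sorting/ranking every row per group.
import heapq
from collections import defaultdict

def first_n_per_group(rows, n):
    """rows: iterable of (key, value) pairs with comparable numeric values.
    Returns {key: the n smallest values for that key, in ascending order}."""
    buffers = defaultdict(list)  # key -> max-heap (negated values), size <= n
    for key, value in rows:
        buf = buffers[key]
        if len(buf) < n:
            heapq.heappush(buf, -value)   # store negated ints: max-heap trick
        elif -buf[0] > value:             # current max exceeds value: replace
            heapq.heapreplace(buf, -value)
    return {k: sorted(-v for v in buf) for k, buf in buffers.items()}
```

Each group's buffer never grows past n elements, which is the efficiency point above: memory and per-row work stay bounded even for very large groups.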
On Feb 5, 2017 11:47, "Herman van Hovell (JIRA)" <ji...@apache.org> wrote:
[ https://issues.apache.org/jira/browse/SPARK-19428?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15853272#comment-15853272 ]
Herman van Hovell commented on SPARK-19428:
-------------------------------------------
You could also use a window function:
{noformat}
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.row_number

val df = spark.range(10000).select($"id" % 100 as "a", $"id" as "b")
df.withColumn("rn", row_number().over(Window.partitionBy($"a").orderBy($"b")))
  .filter($"rn" <= 10)
{noformat}
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
> Ability to select first row of groupby
> --------------------------------------
>
> Key: SPARK-19428
> URL: https://issues.apache.org/jira/browse/SPARK-19428
> Project: Spark
> Issue Type: Brainstorming
> Components: SQL
> Affects Versions: 2.1.0
> Reporter: Luke Miner
> Priority: Minor
>
> It would be nice to be able to select the first row from {{GroupedData}}. Pandas has something like this:
> {{df.groupby('group').first()}}
> It's especially handy if you can order the group as well.
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org