Posted to issues@spark.apache.org by "Martin Senne (JIRA)" <ji...@apache.org> on 2015/09/29 12:05:04 UTC

[jira] [Comment Edited] (SPARK-7135) Expression for monotonically increasing IDs

    [ https://issues.apache.org/jira/browse/SPARK-7135?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14934947#comment-14934947 ] 

Martin Senne edited comment on SPARK-7135 at 9/29/15 10:04 AM:
---------------------------------------------------------------

@[~rxin]: First, great feature. Unfortunately, there seems to be no way to set an offset at which indexing starts (currently, indexing always starts at 0). Is there any chance of adding this?

As this is my first post here, I'm not sure whether this is the right place. Should I raise a feature request here in JIRA?



> Expression for monotonically increasing IDs
> -------------------------------------------
>
>                 Key: SPARK-7135
>                 URL: https://issues.apache.org/jira/browse/SPARK-7135
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Reynold Xin
>            Assignee: Reynold Xin
>              Labels: dataframe
>             Fix For: 1.4.0
>
>
> Seems like a common use case that users might want a unique ID for each row. It is more expensive to generate consecutive IDs, since that would require two passes over the data. However, many use cases can be satisfied by just having unique IDs.
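For context, the Spark implementation composes each ID from the partition ID (upper bits) and the record number within the partition (lower 33 bits), which is why the result is unique and increasing but not consecutive. The following is a pure-Python sketch of that scheme, not Spark code; the `offset` parameter models the feature being requested above and is not something Spark provides.

```python
def monotonic_ids(partition_sizes, offset=0):
    """Model of Spark's monotonically increasing ID expression:
    partition ID in the upper bits, per-partition record number in the
    lower 33 bits. `offset` is a hypothetical starting value (the
    feature requested in this thread), added to every generated ID."""
    ids = []
    for partition_id, num_rows in enumerate(partition_sizes):
        for record_number in range(num_rows):
            # Each partition's IDs start at partition_id << 33, so IDs
            # are unique across partitions without any coordination.
            ids.append(offset + ((partition_id << 33) + record_number))
    return ids

# Three partitions holding 2, 1, and 2 rows; start numbering at 100.
ids = monotonic_ids([2, 1, 2], offset=100)
print(ids)
# Note the large jumps between partitions: the IDs are unique and
# strictly increasing, but not consecutive.
```

In practice, a fixed offset can already be approximated with column arithmetic on the generated ID (adding a literal constant to the ID column), though that shifts every ID rather than making them start at the offset consecutively.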



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org