Posted to issues@spark.apache.org by "Jerry Z (JIRA)" <ji...@apache.org> on 2015/08/07 20:14:47 UTC
[jira] [Comment Edited] (SPARK-9744) Add Java RDD method to map with lag and lead
[ https://issues.apache.org/jira/browse/SPARK-9744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14662188#comment-14662188 ]
Jerry Z edited comment on SPARK-9744 at 8/7/15 6:14 PM:
--------------------------------------------------------
Let me know if I'm missing something or doing more work than necessary:
1. zip with index
1.5. flip the pair so the key is the index
2. map, subtracting 1 from the index key
3. repeat this process for however many positions back I need
4. join the RDD pairs (resulting in a chain of nested pairs, which makes the line unwieldy once the types between the <> are filled in)
5. apply another map function performing the desired operation on those values
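The steps above can be sketched in plain Java. This is not Spark code (the RDD calls zipWithIndex, mapToPair, and join are analogous); it mirrors the same index/shift/join logic with standard collections, for a single lag. The method name mapWithLag and the subtraction in step 5 are illustrative choices, not an existing API.

```java
import java.util.*;

public class LagExample {
    // Mirrors the workaround: pair each element with its index, shift the
    // index by the lag, "join" on the shifted index, then apply an operation
    // (here: current value minus lagged value, with a default at the edge).
    public static List<Integer> mapWithLag(List<Integer> data, int lag, int defaultValue) {
        // Steps 1 and 1.5: zip with index, keyed by the index.
        Map<Integer, Integer> byIndex = new HashMap<>();
        for (int i = 0; i < data.size(); i++) {
            byIndex.put(i, data.get(i));
        }
        // Steps 2-4: look up index - lag, which plays the role of the join
        // on the shifted key; missing keys fall back to the default value.
        // Step 5: the final map applying the desired operation.
        List<Integer> result = new ArrayList<>();
        for (int i = 0; i < data.size(); i++) {
            int lagged = byIndex.getOrDefault(i - lag, defaultValue);
            result.add(data.get(i) - lagged);
        }
        return result;
    }

    public static void main(String[] args) {
        // First differences of a series, default 0 before the first element.
        System.out.println(mapWithLag(Arrays.asList(10, 12, 15, 11), 1, 0));
        // prints [10, 2, 3, -4]
    }
}
```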
> Add Java RDD method to map with lag and lead
> --------------------------------------------
>
> Key: SPARK-9744
> URL: https://issues.apache.org/jira/browse/SPARK-9744
> Project: Spark
> Issue Type: Wish
> Reporter: Jerry Z
> Priority: Minor
>
> To avoid zipping with index and doing numerous maps and joins, add a single map-like method that takes two additional parameters: (1) a list of offsets (negative for lag, 0 for current, positive for lead) and (2) a default value. The other difference from map is that the supplied function takes an argument of List<T> rather than just T.
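A sketch of what the wished-for method might look like, again over a plain List rather than an RDD. The name mapWithOffsets and its signature are hypothetical (no such Spark API exists); the semantics follow the description: for each position, gather the elements at each offset, filling with the default where the offset runs off either end, then apply the user's function to that List<T>.

```java
import java.util.*;
import java.util.function.Function;

public class OffsetMapExample {
    // Hypothetical mapWithOffsets: negative offsets are lag, 0 is the
    // current element, positive offsets are lead. Out-of-range positions
    // are filled with defaultValue, so f always sees offsets.size() values.
    public static <T, R> List<R> mapWithOffsets(List<T> data, List<Integer> offsets,
                                                T defaultValue, Function<List<T>, R> f) {
        List<R> out = new ArrayList<>();
        for (int i = 0; i < data.size(); i++) {
            List<T> window = new ArrayList<>();
            for (int off : offsets) {
                int j = i + off;
                window.add(j >= 0 && j < data.size() ? data.get(j) : defaultValue);
            }
            out.add(f.apply(window));
        }
        return out;
    }

    public static void main(String[] args) {
        // Sum of [previous, current, next] with default 0 at the edges.
        List<Integer> sums = mapWithOffsets(Arrays.asList(1, 2, 3),
                Arrays.asList(-1, 0, 1), 0,
                w -> w.get(0) + w.get(1) + w.get(2));
        System.out.println(sums); // prints [3, 6, 5]
    }
}
```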
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org