Posted to issues@spark.apache.org by "Julien Champ (JIRA)" <ji...@apache.org> on 2017/07/04 09:23:00 UTC
[jira] [Commented] (SPARK-19451) Long values in Window function
[ https://issues.apache.org/jira/browse/SPARK-19451?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16073385#comment-16073385 ]
Julien Champ commented on SPARK-19451:
--------------------------------------
Any news on this bug / feature request?
Or any workaround? Maybe using streams I could efficiently do what I want?
> Long values in Window function
> ------------------------------
>
> Key: SPARK-19451
> URL: https://issues.apache.org/jira/browse/SPARK-19451
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.6.1, 2.0.2
> Reporter: Julien Champ
>
> Hi there,
> there seems to be a major limitation in Spark window functions and the rangeBetween method.
> If I have the following code:
> {code:title=Example|borderStyle=solid}
> val tw = Window.orderBy("date")
>   .partitionBy("id")
>   .rangeBetween(from, 0)
> {code}
> Everything seems fine as long as the *from* value is not too large, even though the rangeBetween() method accepts Long parameters.
> But if I set *from* to *-2160000000L*, it does not work!
> It is probably caused by this part of the between() method of the WindowSpec class, which is called by rangeBetween():
> {code:title=between() method|borderStyle=solid}
> val boundaryStart = start match {
>   case 0 => CurrentRow
>   case Long.MinValue => UnboundedPreceding
>   case x if x < 0 => ValuePreceding(-start.toInt)
>   case x if x > 0 => ValueFollowing(start.toInt)
> }
> {code}
> (note the *.toInt* calls)
> Does anybody know if there's a way to solve / patch this behavior?
> Any help would be appreciated.
> Thx
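The Int truncation the report points at can be reproduced without Spark. The sketch below (plain Scala; the object name is made up for illustration) shows what happens to the reported *-2160000000L* boundary when `.toInt` keeps only the low 32 bits, as the quoted `between()` code does:

```scala
// Minimal sketch of the truncation: -2160000000 does not fit in an Int
// (Int range is -2147483648..2147483647), so .toInt keeps the low 32 bits
// and silently flips the sign.
object ToIntTruncationDemo extends App {
  val from = -2160000000L // range start from the bug report

  // What WindowSpec.between() effectively computes for a negative start:
  val truncated = from.toInt      // low 32 bits of -2160000000
  val offset = -from.toInt        // value handed to ValuePreceding(...)

  println(truncated)              // 2134967296: positive, not -2160000000
  println(offset)                 // -2134967296: negative, not the intended 2160000000
}
```

So instead of a 2,160,000,000-unit preceding range, the window spec ends up built from a corrupted, sign-flipped offset, which matches the "does not work" behavior described above.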
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org