Posted to issues@spark.apache.org by "Kevin Zhang (JIRA)" <ji...@apache.org> on 2017/08/01 06:12:00 UTC

[jira] [Created] (SPARK-21590) Structured Streaming window start time should support negative values to adjust time zone

Kevin Zhang created SPARK-21590:
-----------------------------------

             Summary: Structured Streaming window start time should support negative values to adjust time zone
                 Key: SPARK-21590
                 URL: https://issues.apache.org/jira/browse/SPARK-21590
             Project: Spark
          Issue Type: Bug
          Components: Structured Streaming
    Affects Versions: 2.2.0, 2.1.0, 2.0.1, 2.0.0
         Environment: spark 2.2.0
            Reporter: Kevin Zhang


I want to calculate a (unique) daily access count using Structured Streaming (2.2.0).
Currently Structured Streaming's window with a 1-day duration starts at 00:00:00 UTC and ends at 23:59:59 UTC each day, but my local time zone is CST (UTC+8) and I want the date boundaries to be 00:00:00 CST (that is, 00:00:00 UTC minus 8 hours).

In Flink I can set the window offset to -8 hours to achieve this, but here in Structured Streaming, if I set the start time (the equivalent of Flink's offset) to -8 hours or any other negative value, I get the following error:

{code:shell}
Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve 'timewindow(timestamp, 86400000000, 86400000000, -28800000000)' due to data type mismatch: The start time (-28800000000) must be greater than or equal to 0.;;
{code}
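
For reference, this is roughly the query that triggers it (a minimal sketch; the events streaming DataFrame and the column names timestamp/userId are illustrative):

{code:scala}
import org.apache.spark.sql.functions.{window, approx_count_distinct, col}

// events: a streaming DataFrame with an event-time column "timestamp" and a
// user id column "userId" (both names are illustrative).
val dailyUv = events
  .withWatermark("timestamp", "1 hour")
  // A startTime of "-8 hours" is meant to shift the 1-day windows to CST day
  // boundaries, but the analyzer rejects it with the error above.
  .groupBy(window(col("timestamp"), "1 day", "1 day", "-8 hours"))
  .agg(approx_count_distinct("userId").as("uv"))
{code}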

The error occurs because the time window validates its input parameters and requires each of them to be greater than or equal to 0.

So I'm wondering whether we can remove the restriction that the start time cannot be negative.
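
For completeness, the only workaround I can think of (assuming I read the window alignment semantics correctly) is to express the -8 hour offset as a non-negative start time: since window boundaries repeat every slide duration, a start time of 16 hours should give the same alignment as -8 hours for a 1-day window:

{code:scala}
// Windows starting at 16:00:00 UTC each day correspond to 00:00:00 CST (UTC+8)
// of the following day, which is the boundary I actually want.
val dailyUvCst = events
  .withWatermark("timestamp", "1 hour")
  .groupBy(window(col("timestamp"), "1 day", "1 day", "16 hours"))
  .agg(approx_count_distinct("userId").as("uv"))
{code}

But this is not obvious to users, so supporting negative start times directly (like Flink's window offset) would still be preferable.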


