Posted to issues@spark.apache.org by "Johnny Bai (Jira)" <ji...@apache.org> on 2020/09/09 01:46:00 UTC
[jira] [Comment Edited] (SPARK-32821) cannot group by with window in sql sentence for structured streaming with watermark
[ https://issues.apache.org/jira/browse/SPARK-32821?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17192563#comment-17192563 ]
Johnny Bai edited comment on SPARK-32821 at 9/9/20, 1:45 AM:
-------------------------------------------------------------
[~kabhwan] As Structured Streaming evolves, I think it is necessary to build a relatively complete Structured Streaming SQL standard specification, similar to the ANSI SQL standard.
> cannot group by with window in sql sentence for structured streaming with watermark
> -----------------------------------------------------------------------------------
>
> Key: SPARK-32821
> URL: https://issues.apache.org/jira/browse/SPARK-32821
> Project: Spark
> Issue Type: Improvement
> Components: Structured Streaming
> Affects Versions: 2.1.0, 2.2.0, 2.3.0, 2.4.0
> Reporter: Johnny Bai
> Priority: Major
>
> Currently, only the DSL style is supported, as below:
> import spark.implicits._
> val words = ... // streaming DataFrame of schema { timestamp: Timestamp, word: String }
> // Group the data by window and word and compute the count of each group
> val windowedCounts = words.groupBy(window($"timestamp", "10 minutes", "5 minutes"), $"word").count()
>
> but group by with window is not supported in SQL style, as below (a workaround sketch follows this quoted description):
> "select ts_field, count(*) as cnt over window(ts_field, '1 minute', '1 minute') with watermark 1 minute from tableX group by ts_field"
>
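For reference, a minimal sketch of a workaround available today: the watermark still has to be declared through the DataFrame API, but the windowed GROUP BY itself can already be written in SQL with the built-in window() function. The socket source and the column names below are assumptions for illustration; the view name tableX follows the SQL in the description.

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder.appName("sql-window-sketch").getOrCreate()

  // Hypothetical streaming source for illustration; includeTimestamp adds an
  // ingest-time "timestamp" column alongside the text "value" column.
  val words = spark.readStream
    .format("socket")
    .option("host", "localhost")
    .option("port", 9999)
    .option("includeTimestamp", true)
    .load()
    .toDF("word", "timestamp")

  // The watermark can only be set via the DataFrame API; there is no SQL
  // clause for it, which is exactly the gap this ticket raises.
  words
    .withWatermark("timestamp", "1 minute")
    .createOrReplaceTempView("tableX")

  // window() is a regular SQL function, so the grouping itself can be
  // expressed in SQL over the registered view.
  val windowedCounts = spark.sql(
    """SELECT window(timestamp, '10 minutes', '5 minutes') AS w,
      |       word,
      |       count(*) AS cnt
      |FROM tableX
      |GROUP BY window(timestamp, '10 minutes', '5 minutes'), word
      |""".stripMargin)

  windowedCounts.writeStream
    .outputMode("append")
    .format("console")
    .start()
    .awaitTermination()

Because the watermark is attached to the plan before the view is registered, the SQL aggregation still honors it and rows later than one minute are dropped; only the watermark declaration itself remains DSL-only, which is what this ticket asks to lift.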