Posted to issues@spark.apache.org by "Jackey Lee (JIRA)" <ji...@apache.org> on 2018/12/18 12:40:00 UTC

[jira] [Commented] (SPARK-24630) SPIP: Support SQLStreaming in Spark

    [ https://issues.apache.org/jira/browse/SPARK-24630?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16724011#comment-16724011 ] 

Jackey Lee commented on SPARK-24630:
------------------------------------

[~jackylk] Sorry for the late reply.

I haven't considered manipulating a streaming job yet; at the moment the job runs directly after it is started and keeps running until the end of the application, similar to a Command. SQLStreaming could also support this easily if there were a way to manipulate and process the currently executing Command. Can you show me how to deal with it?
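
For context, this is roughly how such a query behaves today through the DataFrame/Dataset API; the Kafka options and names below are only placeholders for illustration, not part of the SPIP:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("sqlstreaming-sketch").getOrCreate()

    // The query starts here and, like a Command, keeps running until it is
    // stopped explicitly or the application ends.
    val query = spark.readStream
      .format("kafka")                                 // placeholder source
      .option("kafka.bootstrap.servers", "host:9092")  // placeholder broker
      .option("subscribe", "events")                   // placeholder topic
      .load()
      .writeStream
      .format("console")
      .start()

    query.awaitTermination()  // blocks until the streaming query terminates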

Currently, SQLStreaming supports the Table API, so we can use the Table API to show/describe stream tables.
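
As a rough illustration (the stream table name kafka_events is hypothetical and assumed to be already registered in the catalog), showing and describing it would look like any other table:

    // kafka_events is a hypothetical stream table assumed to exist in the catalog
    spark.sql("SHOW TABLES").show()
    spark.sql("DESCRIBE TABLE kafka_events").show()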

> SPIP: Support SQLStreaming in Spark
> -----------------------------------
>
>                 Key: SPARK-24630
>                 URL: https://issues.apache.org/jira/browse/SPARK-24630
>             Project: Spark
>          Issue Type: Improvement
>          Components: Structured Streaming
>    Affects Versions: 2.2.0, 2.2.1
>            Reporter: Jackey Lee
>            Priority: Minor
>              Labels: SQLStreaming
>         Attachments: SQLStreaming SPIP.pdf
>
>
> At present, KafkaSQL, Flink SQL (which is actually based on Calcite), SQLStream, and StormSQL all provide a stream-type SQL interface, with which users with little knowledge about streaming can easily develop a stream processing model. In Spark, we can also support a SQL API based on Structured Streaming.
> To support SQL Streaming, there are two key points:
> 1. The analysis phase should be able to parse streaming-type SQL.
> 2. The Analyzer should be able to map metadata information to the corresponding Relation.
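
To make the two key points quoted above a bit more concrete, here is a purely illustrative sketch; the table name, the Kafka options, and the use of spark.sql for the DDL are assumptions for illustration only, while the actual proposed syntax is described in the attached SPIP PDF:

    // Hypothetical stream table backed by a streaming source (options are placeholders).
    // Point 1: the analysis phase must accept this kind of streaming-type SQL.
    spark.sql(
      """CREATE TABLE kafka_events
        |USING kafka
        |OPTIONS ('kafka.bootstrap.servers' = 'host:9092', 'subscribe' = 'events')
        |""".stripMargin)

    // Point 2: the Analyzer must resolve kafka_events to a streaming Relation,
    // so that this query is planned and run as a continuous streaming job.
    spark.sql("SELECT key, value FROM kafka_events")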



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org