Posted to user@flink.apache.org by Pedro Silva <pe...@gmail.com> on 2021/09/15 15:00:03 UTC

FlinkSQL Sinks

Hello,

Is it possible to configure a sink for SQL client queries other than the
terminal/stdout?
Looking at the SQL Client Configuration
<https://ci.apache.org/projects/flink/flink-docs-master/docs/dev/table/sqlclient/#sql-client-configuration>,
it seems that the client's output is always just a visualization of the results.

If I wanted to sink a changelog stream to a database like Postgres or a Kafka
topic, would I have to create a streaming application, hardcode the SQL
query, and configure the sink in Java/Scala/Python code?

Thank you.

Re: FlinkSQL Sinks

Posted by JING ZHANG <be...@gmail.com>.
Hi,
I agree with Martijn.
Besides, there is an example in the SQL client documentation [1] that writes to
an external sink instead of the print sink.

> Flink SQL> CREATE TABLE pageviews (
>   user_id BIGINT,
>   page_id BIGINT,
>   viewtime TIMESTAMP,
>   proctime AS PROCTIME()
> ) WITH (
>   'connector' = 'kafka',
>   'topic' = 'pageviews',
>   'properties.bootstrap.servers' = '...',
>   'format' = 'avro'
> );
> [INFO] Execute statement succeed.
>
> Flink SQL> CREATE TABLE uniqueview (
>   page_id BIGINT,
>   cnt BIGINT
> ) WITH (
>   'connector' = 'jdbc',
>   'url' = 'jdbc:mysql://localhost:3306/mydatabase',
>   'table-name' = 'uniqueview'
> );
> [INFO] Execute statement succeed.
>
> Flink SQL> INSERT INTO uniqueview
> SELECT page_id, count(distinct user_id)
> FROM pageviews
> GROUP BY page_id;
>
[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/table/sqlclient/#execute-a-set-of-sql-statements
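
If the target database were Postgres rather than MySQL, essentially only the
JDBC URL changes (and the Postgres JDBC driver jar has to be on the classpath).
A minimal sketch; the table name, URL, database, and host below are placeholders:

-- Hypothetical Postgres variant of the JDBC sink above; names are placeholders.
-- The primary key puts the JDBC sink into upsert mode, so it can consume the
-- updating (changelog) result of the GROUP BY aggregation.
CREATE TABLE uniqueview_pg (
  page_id BIGINT,
  cnt BIGINT,
  PRIMARY KEY (page_id) NOT ENFORCED
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:postgresql://localhost:5432/mydatabase',
  'table-name' = 'uniqueview'
);

INSERT INTO uniqueview_pg
SELECT page_id, count(distinct user_id)
FROM pageviews
GROUP BY page_id;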

Best,
JING ZHANG

Martijn Visser <ma...@ververica.com> wrote on Wed, Sep 15, 2021 at 11:07 PM:

> Hi,
>
> You can do this directly via the SQL client by defining, for example,
> Kafka as a TABLE [1] and using an INSERT INTO [2].
>
> Best regards,
>
> Martijn
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/connectors/table/kafka/
> [2]
> https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/table/sql/insert/
>
> On Wed, 15 Sept 2021 at 17:00, Pedro Silva <pe...@gmail.com> wrote:
>
>> Hello,
>>
>> Is it possible to configure a sink for SQL client queries other than the
>> terminal/stdout?
>> Looking at the SQL Client Configuration
>> <https://ci.apache.org/projects/flink/flink-docs-master/docs/dev/table/sqlclient/#sql-client-configuration>,
>> it seems that the client's output is always just a visualization of the results.
>>
>> If I wanted to sink a changelog stream to a database like Postgres or a
>> Kafka topic, would I have to create a streaming application, hardcode the
>> SQL query, and configure the sink in Java/Scala/Python code?
>>
>> Thank you.
>>
>

Re: FlinkSQL Sinks

Posted by Martijn Visser <ma...@ververica.com>.
Hi,

You can do this directly via the SQL client by defining, for example, Kafka
as a TABLE [1] and using an INSERT INTO [2].
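
For instance, a minimal sketch of that pattern (the topic, broker address, and
table/column names below are placeholders, and it assumes a Kafka source table
like the pageviews table from the documentation example quoted elsewhere in
this thread):

-- Hypothetical append-only Kafka sink; all names and addresses are placeholders.
CREATE TABLE filtered_pageviews (
  user_id BIGINT,
  page_id BIGINT,
  viewtime TIMESTAMP
) WITH (
  'connector' = 'kafka',
  'topic' = 'filtered-pageviews',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);

-- The INSERT INTO is submitted from the SQL client and runs as a regular
-- Flink job that writes to the sink table; no Java/Scala/Python code needed.
INSERT INTO filtered_pageviews
SELECT user_id, page_id, viewtime
FROM pageviews
WHERE page_id IS NOT NULL;

Note that for an updating (changelog) result, e.g. a GROUP BY aggregation, you
would need the 'upsert-kafka' connector with a declared primary key instead of
the plain 'kafka' connector.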

Best regards,

Martijn

[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/connectors/table/kafka/
[2]
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/table/sql/insert/

On Wed, 15 Sept 2021 at 17:00, Pedro Silva <pe...@gmail.com> wrote:

> Hello,
>
> Is it possible to configure a sink for SQL client queries other than the
> terminal/stdout?
> Looking at the SQL Client Configuration
> <https://ci.apache.org/projects/flink/flink-docs-master/docs/dev/table/sqlclient/#sql-client-configuration>,
> it seems that the client's output is always just a visualization of the results.
>
> If I wanted to sink a changelog stream to a database like Postgres or a
> Kafka topic, would I have to create a streaming application, hardcode the
> SQL query, and configure the sink in Java/Scala/Python code?
>
> Thank you.
>