Posted to user-zh@flink.apache.org by chenxuying <cx...@163.com> on 2020/09/17 10:15:09 UTC

JDBC connector options not taking effect when using Flink SQL

The environment is Flink 1.11.2 + IDEA.
SQL:
CREATE TABLE sourceTable (
    platform STRING,
    game_id BIGINT
) WITH (
    ...
);
CREATE TABLE sinktable (
    platform STRING,
    game_id BIGINT
) WITH (
    'connector' = 'jdbc',
    'url' = '',
    'table-name' = '',
    'driver' = 'com.mysql.jdbc.Driver',
    'username' = '',
    'password' = '',
    'sink.buffer-flush.max-rows' = '2',
    'sink.buffer-flush.interval' = '30s'
);
INSERT INTO sinktable SELECT platform, game_id FROM sourceTable;
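
Since the environment is Flink 1.11.2 run from IDEA, the statements above would typically be submitted through a TableEnvironment. The following is only a minimal sketch of such a driver program, not taken from the original post: the class name, the Blink-planner setup, and the required dependencies (flink-table-api-java-bridge, flink-connector-jdbc, mysql-connector-java) are assumptions, and the elided connection details are kept blank exactly as above.

// Hypothetical driver class, assumed for illustration only.
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class JdbcSinkJob {
    public static void main(String[] args) {
        // Blink planner in streaming mode, the usual Table API setup in Flink 1.11.
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();
        TableEnvironment tEnv = TableEnvironment.create(settings);

        // The source's WITH clause is elided ("...") in the original post, so it stays elided here.
        tEnv.executeSql(
                "CREATE TABLE sourceTable (platform STRING, game_id BIGINT) WITH (...)");

        tEnv.executeSql(
                "CREATE TABLE sinktable (platform STRING, game_id BIGINT) WITH ("
                + " 'connector' = 'jdbc', 'url' = '', 'table-name' = '',"
                + " 'driver' = 'com.mysql.jdbc.Driver', 'username' = '', 'password' = '',"
                + " 'sink.buffer-flush.max-rows' = '2', 'sink.buffer-flush.interval' = '30s')");

        // In 1.11, executeSql() on an INSERT statement submits the job directly.
        tEnv.executeSql("INSERT INTO sinktable SELECT platform, game_id FROM sourceTable");
    }
}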


The official documentation [1] says that sink.buffer-flush.max-rows and sink.buffer-flush.interval can each be set to '0' to disable them, but when I tried it that does not work.
With the following settings:
   'sink.buffer-flush.max-rows' = '0'
   'sink.buffer-flush.interval' = '60s'
every record that arrives is inserted into the database immediately, one at a time.
With the following settings:
   'sink.buffer-flush.max-rows' = '10'
   'sink.buffer-flush.interval' = '0'
nothing can be inserted into the database at all.


[1]: https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/connectors/jdbc.html#connector-options


Re: Re: JDBC connector options not taking effect when using Flink SQL

Posted by chenxuying <cx...@163.com>.
OK, understood.




On 2020-09-17 20:29:09, "Jark Wu" <im...@gmail.com> wrote:
>>  sink.buffer-flush.max-rows = '0' causes every record received to be inserted into the database
>
>This should be a bug; I have created an issue: https://issues.apache.org/jira/browse/FLINK-19280
>
>Best,
>Jark

Re: JDBC connector options not taking effect when using Flink SQL

Posted by Jark Wu <im...@gmail.com>.
>  sink.buffer-flush.max-rows = '0' causes every record received to be inserted into the database

This should be a bug; I have created an issue: https://issues.apache.org/jira/browse/FLINK-19280

Best,
Jark
