Posted to reviews@bahir.apache.org by GitBox <gi...@apache.org> on 2020/07/06 08:35:04 UTC

[GitHub] [bahir-flink] wty4427300 opened a new pull request #87: add setnx command

wty4427300 opened a new pull request #87:
URL: https://github.com/apache/bahir-flink/pull/87


   add setnx command
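   For context, a minimal sketch of how the new command could be used through the
   connector's existing RedisSink/RedisMapper API. The enum value RedisCommand.SETNX
   is what this PR introduces; the mapper class and the Tuple2 key/value extraction
   below are illustrative, not part of the PR itself.

   // Illustrative usage sketch: write key/value pairs via SETNX, so a value is
   // only stored when the key does not already exist in Redis.
   import org.apache.flink.api.java.tuple.Tuple2;
   import org.apache.flink.streaming.api.datastream.DataStream;
   import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
   import org.apache.flink.streaming.connectors.redis.RedisSink;
   import org.apache.flink.streaming.connectors.redis.common.config.FlinkJedisPoolConfig;
   import org.apache.flink.streaming.connectors.redis.common.mapper.RedisCommand;
   import org.apache.flink.streaming.connectors.redis.common.mapper.RedisCommandDescription;
   import org.apache.flink.streaming.connectors.redis.common.mapper.RedisMapper;

   public class SetnxExample {

       // Maps each Tuple2<key, value> to a SETNX call.
       public static class SetnxMapper implements RedisMapper<Tuple2<String, String>> {
           @Override
           public RedisCommandDescription getCommandDescription() {
               return new RedisCommandDescription(RedisCommand.SETNX);
           }

           @Override
           public String getKeyFromData(Tuple2<String, String> data) {
               return data.f0;
           }

           @Override
           public String getValueFromData(Tuple2<String, String> data) {
               return data.f1;
           }
       }

       public static void main(String[] args) throws Exception {
           StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
           DataStream<Tuple2<String, String>> pairs =
                   env.fromElements(Tuple2.of("user:1", "alice"), Tuple2.of("user:1", "bob"));

           FlinkJedisPoolConfig conf =
                   new FlinkJedisPoolConfig.Builder().setHost("127.0.0.1").setPort(6379).build();

           // Only the first value per key ends up in Redis ("user:1" -> "alice").
           pairs.addSink(new RedisSink<>(conf, new SetnxMapper()));
           env.execute("redis-setnx-example");
       }
   }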


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [bahir-flink] eskabetxe commented on pull request #87: add setnx command

Posted by GitBox <gi...@apache.org>.
eskabetxe commented on pull request #87:
URL: https://github.com/apache/bahir-flink/pull/87#issuecomment-657804282


   Thank you for the contribution :)
   Can you add some testing for this command?
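   One possible shape for such a test, as a sketch: it assumes an embedded Redis
   server (redis.embedded.RedisServer) on the test classpath and reuses the
   SetnxMapper from the sketch above; class names, ports and setup are illustrative
   and may differ from the project's actual test harness.

   import org.apache.flink.api.java.tuple.Tuple2;
   import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
   import org.apache.flink.streaming.connectors.redis.RedisSink;
   import org.apache.flink.streaming.connectors.redis.common.config.FlinkJedisPoolConfig;
   import org.junit.AfterClass;
   import org.junit.Assert;
   import org.junit.BeforeClass;
   import org.junit.Test;
   import redis.clients.jedis.Jedis;
   import redis.embedded.RedisServer;

   public class RedisSetnxSinkTest {

       private static final int REDIS_PORT = 6380;
       private static RedisServer redisServer;

       @BeforeClass
       public static void startRedis() throws Exception {
           redisServer = new RedisServer(REDIS_PORT);
           redisServer.start();
       }

       @AfterClass
       public static void stopRedis() {
           redisServer.stop();
       }

       @Test
       public void setnxShouldNotOverwriteExistingKey() throws Exception {
           StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
           env.setParallelism(1); // keep element order deterministic

           FlinkJedisPoolConfig conf = new FlinkJedisPoolConfig.Builder()
                   .setHost("127.0.0.1").setPort(REDIS_PORT).build();

           env.fromElements(Tuple2.of("test:key", "first"), Tuple2.of("test:key", "second"))
                   .addSink(new RedisSink<>(conf, new SetnxExample.SetnxMapper()));
           env.execute();

           // SETNX must keep the first value and ignore the second write.
           try (Jedis jedis = new Jedis("127.0.0.1", REDIS_PORT)) {
               Assert.assertEquals("first", jedis.get("test:key"));
           }
       }
   }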





[GitHub] [bahir-flink] YikSanChan commented on pull request #87: add setnx command

Posted by GitBox <gi...@apache.org>.
YikSanChan commented on pull request #87:
URL: https://github.com/apache/bahir-flink/pull/87#issuecomment-799296646


   @eskabetxe AFAIK, there are little to no tests for the concrete operators (setnx, hset, etc.). Do you want to introduce some real tests in this PR?





[GitHub] [bahir-flink] eskabetxe commented on pull request #87: add setnx command

Posted by GitBox <gi...@apache.org>.
eskabetxe commented on pull request #87:
URL: https://github.com/apache/bahir-flink/pull/87#issuecomment-662929741


   @Anandonzy that is unrelated to this PR.
   Could you please open a new issue?





[GitHub] [bahir-flink] Anandonzy edited a comment on pull request #87: add setnx command

Posted by GitBox <gi...@apache.org>.
Anandonzy edited a comment on pull request #87:
URL: https://github.com/apache/bahir-flink/pull/87#issuecomment-673237162


   Hello, I am currently using this GAV
   <!-- https://mvnrepository.com/artifact/org.apache.bahir/flink-connector-kudu -->
   <dependency>
       <groupId>org.apache.bahir</groupId>
       <artifactId>flink-connector-kudu_2.11</artifactId>
       <version>1.0-csa1.2.0.0</version>
       <scope>compile</scope>
   </dependency>
   to connect to Kudu.
   My SQL DDL:
   
   -- source
   CREATE TABLE logs_resource (
       content VARCHAR
   ) WITH (
       'connector' = 'kafka',
       'topic' = 'test',
       'properties.bootstrap.servers' = '0.0.0.0:9092',
       'properties.group.id' = 'test_ziyu_flink_sql1',
       'scan.startup.mode' = 'latest-offset',
       'format' = 'csv',
       'csv.ignore-parse-errors' = 'true',
       'csv.allow-comments' = 'true',
       'csv.field-delimiter' = ','
   );

   -- sink
   CREATE TABLE kudu_sink (
       create_day INT,
       itime VARCHAR,
       ltype VARCHAR,
       dvid VARCHAR,
       server_ip VARCHAR
   ) WITH (
       'connector.type' = 'kudu',
       'kudu.masters' = '0.0.0.0',
       'kudu.table' = 'impala::kudu_flux.ziyu_test',
       'kudu.hash-columns' = 'itime',
       'kudu.primary-key-columns' = 'itime,create_day'
   );

   INSERT INTO kudu_sink (create_day, itime, ltype, dvid, server_ip)
   SELECT
       create_day,
       itime,
       ltype,
       dvid,
       server_ip
   FROM logs_resource,
       LATERAL TABLE(udtfOneColumnToMultiColumn(ParseLogUDF(content)))
           AS T(create_day, itime, ltype, dvid, server_ip)
   GROUP BY itime, ltype, dvid, server_ip, create_day;
   
   udtfOneColumnToMultiColumn and ParseLogUDF are my UDFs for parsing the log.
   My Flink version is 1.11.1.
   When I run this SQL there is no exception, but no data is inserted into my Kudu table.
   What should I do? Do you have an example of inserting into Kudu?
   Thanks.
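   One way to narrow this down (a sketch, not a confirmed fix): route the same
   query into Flink's built-in 'print' connector first, to check whether the
   SELECT ... GROUP BY emits any rows before looking at the Kudu sink settings.
   The class name and TableEnvironment setup below are illustrative; the table,
   column and UDF names mirror the DDL above.

   import org.apache.flink.table.api.EnvironmentSettings;
   import org.apache.flink.table.api.TableEnvironment;

   public class KuduSinkDebug {
       public static void main(String[] args) {
           TableEnvironment tEnv = TableEnvironment.create(
                   EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build());

           // logs_resource, the two UDFs and kudu_sink are assumed to be
           // registered exactly as in the DDL above (omitted here for brevity).

           tEnv.executeSql(
                   "CREATE TABLE debug_sink (" +
                   "  create_day INT, itime VARCHAR, ltype VARCHAR, dvid VARCHAR, server_ip VARCHAR" +
                   ") WITH ('connector' = 'print')");

           // If nothing is printed, the problem is upstream of the Kudu sink
           // (CSV parsing, the UDTF join, or the GROUP BY not emitting yet).
           tEnv.executeSql(
                   "INSERT INTO debug_sink " +
                   "SELECT create_day, itime, ltype, dvid, server_ip " +
                   "FROM logs_resource, LATERAL TABLE(udtfOneColumnToMultiColumn(ParseLogUDF(content))) " +
                   "  AS T(create_day, itime, ltype, dvid, server_ip) " +
                   "GROUP BY itime, ltype, dvid, server_ip, create_day");

           // Note: executeSql() submits the INSERT job asynchronously; when running
           // locally, keep the process alive (or wait via the returned TableResult's
           // JobClient) long enough to see printed rows.
       }
   }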



[GitHub] [bahir-flink] eskabetxe commented on pull request #87: add setnx command

Posted by GitBox <gi...@apache.org>.
eskabetxe commented on pull request #87:
URL: https://github.com/apache/bahir-flink/pull/87#issuecomment-799288691


   @YikSanChan would you like to finish this?





[GitHub] [bahir-flink] Anandonzy commented on pull request #87: add setnx command

Posted by GitBox <gi...@apache.org>.
Anandonzy commented on pull request #87:
URL: https://github.com/apache/bahir-flink/pull/87#issuecomment-659365357


   Hello, I can't download the dependency you provided into my local repository. What should I do? How can I use Flink SQL DDL with the Kudu connector? Thanks.

