Posted to user-zh@flink.apache.org by "hl9902@126.com" <hl...@126.com> on 2020/09/28 10:05:31 UTC

Reply: Re: sql-cli SQL execution error

I retried following your suggestion and hit a different error:
Flink SQL> CREATE TABLE tx (
>     account_id  BIGINT,
>     amount      BIGINT,
>     transaction_time TIMESTAMP(3),
>     WATERMARK FOR transaction_time AS transaction_time - INTERVAL '5' SECOND
> ) WITH (
>     'connector.type' = 'kafka',
>     'connector.version' = 'universal',
>     'connector.topic' = 'heli01',
>     'connector.properties.group.id' = 'heli-test',
>     'connector.properties.bootstrap.servers' = '10.100.51.56:9092',
>     'connector.startup-mode' = 'earliest-offset',
>     'format.type' = 'csv'
> );
[INFO] Table has been created.

Flink SQL> show tables ;
tx

Flink SQL> select * from tx ;
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.kafka.shaded.org.apache.kafka.common.KafkaException: org.apache.kafka.common.serialization.ByteArrayDeserializer is not an instance of org.apache.flink.kafka.shaded.org.apache.kafka.common.serialization.Deserializer
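
The "X is not an instance of Y" failure above is the classic symptom of one class name being defined by two different classloaders. As a minimal standalone sketch (plain Java, no Flink or Kafka involved; the `Marker` class and `IsolatingLoader` are illustrative names, not Flink code), the same effect can be reproduced like this:

```java
import java.io.InputStream;

public class ClassLoaderClash {
    public static class Marker {}

    // A child-first loader that defines Marker itself instead of delegating to its parent.
    static class IsolatingLoader extends ClassLoader {
        IsolatingLoader(ClassLoader parent) { super(parent); }

        @Override
        protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
            if (name.equals(Marker.class.getName())) {
                // Read the same .class bytes, but define the class in THIS loader.
                try (InputStream in = getParent().getResourceAsStream(name.replace('.', '/') + ".class")) {
                    byte[] bytes = in.readAllBytes();
                    return defineClass(name, bytes, 0, bytes.length);
                } catch (Exception e) {
                    throw new ClassNotFoundException(name, e);
                }
            }
            return super.loadClass(name, resolve);
        }
    }

    public static void main(String[] args) throws Exception {
        ClassLoader isolated = new IsolatingLoader(ClassLoaderClash.class.getClassLoader());
        Object foreign = isolated.loadClass(Marker.class.getName())
                .getDeclaredConstructor().newInstance();
        // Same class name, different defining classloader -> instanceof is false.
        System.out.println(foreign instanceof Marker);
    }
}
```

This prints `false`: the two `Marker` classes are distinct to the JVM. In the error above, an unshaded `ByteArrayDeserializer` from one jar is handed to the shaded Kafka client from another jar, which expects its own (shaded) `Deserializer` type.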

Attached: list of jars in lib
[test@rcx51101 lib]$ pwd
/opt/flink-1.10.2/lib

flink-csv-1.10.2.jar
flink-dist_2.12-1.10.2.jar
flink-jdbc_2.12-1.10.2.jar
flink-json-1.10.2.jar
flink-shaded-hadoop-2-uber-2.6.5-10.0.jar
flink-sql-connector-kafka_2.11-1.10.2.jar
flink-table_2.12-1.10.2.jar
flink-table-blink_2.12-1.10.2.jar
log4j-1.2.17.jar
mysql-connector-java-5.1.48.jar
slf4j-log4j12-1.7.15.jar




hl9902@126.com
 
From: Leonard Xu
Sent: 2020-09-28 16:36
To: user-zh
Subject: Re: sql-cli SQL execution error
Hi,
Benchao's reply is correct.
When using the SQL client, you do not need the DataStream connector jars; just use the corresponding SQL connector jar, flink-*sql*-connector-kafka***.jar, and remove the other jars you added.
 
 
> Related jars in lib:
> flink-connector-kafka_2.12-1.10.2.jar
> kafka-clients-0.11.0.3.jar  
 
Best,
Leonard

Re: Re: sql-cli SQL execution error

Posted by "hl9902@126.com" <hl...@126.com>.
I didn't modify Kafka at all; I used the official jars. I later retried with version 1.11.2 and it worked with no errors.
I'll stop chasing this issue.



hl9902@126.com
 
From: Benchao Li
Sent: 2020-09-29 18:17
To: user-zh
Subject: Re: Re: sql-cli SQL execution error
This error looks rather odd. Normally, every class inside flink-sql-connector-kafka_2.11-1.10.2.jar should already be shaded,
yet the exception reports an unshaded ByteArrayDeserializer.
My guess is that some custom logic you added yourself caused this. Could you describe what changes you made to the Kafka connector?
 
hl9902@126.com <hl...@126.com> wrote on Mon, Sep 28, 2020 at 18:06:
 
 
 
-- 
 
Best,
Benchao Li

Re: Re: sql-cli SQL execution error

Posted by Benchao Li <li...@apache.org>.
This error looks rather odd. Normally, every class inside flink-sql-connector-kafka_2.11-1.10.2.jar should already be shaded,
yet the exception reports an unshaded ByteArrayDeserializer.
My guess is that some custom logic you added yourself caused this. Could you describe what changes you made to the Kafka connector?
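
To check which jar a suspect class is actually being served from, as this diagnosis requires, one quick way is to ask its CodeSource. A small sketch (the class name below is a placeholder; in this thread's case you would pass the fully qualified ByteArrayDeserializer name while running on the SQL client's classpath):

```java
// Print where a class was loaded from, to spot shaded/unshaded duplicates on the classpath.
public class WhichJar {
    public static void main(String[] args) throws Exception {
        // java.util.ArrayList is only a stand-in; substitute the class you are hunting.
        String name = args.length > 0 ? args[0] : "java.util.ArrayList";
        Class<?> c = Class.forName(name);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        // JDK bootstrap classes report no CodeSource; everything else reports its jar or path.
        System.out.println(name + " <- " + (src == null ? "bootstrap (JDK)" : src.getLocation()));
    }
}
```

If the reported location is a DataStream connector jar or a raw kafka-clients jar rather than the flink-sql-connector jar, the classpath still contains an unshaded duplicate.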

hl9902@126.com <hl...@126.com> wrote on Mon, Sep 28, 2020 at 18:06:



-- 

Best,
Benchao Li

Re: Re: sql-cli SQL execution error

Posted by zhisheng <zh...@gmail.com>.
The same problem also exists on the latest master branch. I filed an issue describing the full reproduction:
https://issues.apache.org/jira/browse/FLINK-19995

hl9902@126.com <hl...@126.com> wrote on Mon, Sep 28, 2020 at 18:06:
