Posted to user@flink.apache.org by "wanglei2@geekplus.com.cn" <wa...@geekplus.com.cn> on 2020/05/22 13:34:00 UTC
java.lang.AbstractMethodError when implementing KafkaSerializationSchema
public class MyKafkaSerializationSchema implements KafkaSerializationSchema<Tuple2<String, String>> {
    @Override
    public ProducerRecord<byte[], byte[]> serialize(Tuple2<String, String> o, @Nullable Long aLong) {
        // o.f0 is the target topic, o.f1 the record value; no key is set
        return new ProducerRecord<>(o.f0, o.f1.getBytes(StandardCharsets.UTF_8));
    }
}
FlinkKafkaProducer<Tuple2<String, String>> producer = new FlinkKafkaProducer<>(
        "default", new MyKafkaSerializationSchema(), prop2, Semantic.EXACTLY_ONCE);
But there's an error when running:
java.lang.AbstractMethodError: com.geekplus.flinketl.schema.MyKafkaSerializationSchema.serialize(Ljava/lang/Object;Ljava/lang/Long;)Lorg/apache/flink/kafka/shaded/org/apache/kafka/clients/producer/ProducerRecord;
Any suggestion on this?
Thanks,
Lei
wanglei2@geekplus.com.cn
Reply: Re: java.lang.AbstractMethodError when implementing KafkaSerializationSchema
Posted by "wanglei2@geekplus.com.cn" <wa...@geekplus.com.cn>.
It was caused by a jar conflict, and I have fixed it.
I had put flink-connector-kafka_2.11-1.10.0.jar in the Flink lib directory, while my project's pom file also declared the flink-connector-kafka dependency and was built into a fat jar.
Thanks,
Lei
wanglei2@geekplus.com.cn
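For reference, one way to avoid this kind of duplication is to keep the connector in only one place: either package it into the job's fat jar and remove it from flink/lib, or leave it in flink/lib and exclude it from the fat jar via Maven's provided scope. A sketch of the second option (the artifact coordinates are taken from the version named above; treat this as an illustrative fragment, not the thread's verified fix):

```xml
<!-- flink-connector-kafka_2.11-1.10.0.jar stays in flink/lib; marking the
     dependency "provided" keeps it out of the fat jar, so only one copy
     of the connector classes is on the classpath at runtime. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.10.0</version>
    <scope>provided</scope>
</dependency>
```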
From: Leonard Xu
Sent: 2020-05-26 15:47
To: Aljoscha Krettek
Cc: user
Subject: Re: java.lang.AbstractMethodError when implementing KafkaSerializationSchema
Hi,wanglei
I think Aljoscha is right. Could you post your dependency list?
The flink-connector-kafka dependency is used in DataStream applications, which is what you should use; the flink-sql-connector-kafka dependency is used in Table API & SQL applications. You should add only one of them, because the two dependencies will conflict.
Best,
Leonard Xu
On 26 May 2020, at 15:02, Aljoscha Krettek <al...@apache.org> wrote:
I think what might be happening is that you're mixing dependencies from the flink-sql-connector-kafka and the proper flink-connector-kafka that should be used with the DataStream API. Could that be the case?
Best,
Aljoscha
On 25.05.20 19:18, Piotr Nowojski wrote:
Hi,
It would be helpful if you could provide the full stack trace, and tell us which Flink version and which Kafka connector version you are using.
It sounds like either a dependency convergence error (mixing Kafka dependencies or various versions of flink-connector-kafka inside a single job/jar) or some shading issue. Can you check your project for such issues (with the `mvn dependency:tree` command [1])?
Also, what's a bit suspicious to me is the return type:
Lorg/apache/flink/kafka/shaded/org/apache/kafka/clients/producer/ProducerRecord;
I'm not sure, but I was not aware that we shade the Kafka dependency in our connectors. Are you manually shading something?
Piotrek
[1] https://maven.apache.org/plugins/maven-dependency-plugin/examples/resolving-conflicts-using-the-dependency-tree.html <https://maven.apache.org/plugins/maven-dependency-plugin/examples/resolving-conflicts-using-the-dependency-tree.html>
On 22 May 2020, at 15:34, wanglei2@geekplus.com.cn wrote:
public class MyKafkaSerializationSchema implements KafkaSerializationSchema<Tuple2<String, String>> {
    @Override
    public ProducerRecord<byte[], byte[]> serialize(Tuple2<String, String> o, @Nullable Long aLong) {
        return new ProducerRecord<>(o.f0, o.f1.getBytes(StandardCharsets.UTF_8));
    }
}
FlinkKafkaProducer<Tuple2<String, String>> producer = new FlinkKafkaProducer<>(
        "default", new MyKafkaSerializationSchema(), prop2, Semantic.EXACTLY_ONCE);
But there's an error when running:
java.lang.AbstractMethodError: com.geekplus.flinketl.schema.MyKafkaSerializationSchema.serialize(Ljava/lang/Object;Ljava/lang/Long;)Lorg/apache/flink/kafka/shaded/org/apache/kafka/clients/producer/ProducerRecord;
Any suggestion on this?
Thanks,
Lei
wanglei2@geekplus.com.cn <ma...@geekplus.com.cn>
Re: java.lang.AbstractMethodError when implementing KafkaSerializationSchema
Posted by Leonard Xu <xb...@gmail.com>.
Hi,wanglei
I think Aljoscha is right. Could you post your dependency list?
The flink-connector-kafka dependency is used in DataStream applications, which is what you should use; the flink-sql-connector-kafka dependency is used in Table API & SQL applications. You should add only one of them, because the two dependencies will conflict.
Best,
Leonard Xu
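Leonard's advice amounts to declaring exactly one of the two connector artifacts in the pom. A sketch, assuming the Scala 2.11 / Flink 1.10.0 coordinates mentioned earlier in the thread:

```xml
<!-- For a DataStream application, depend on flink-connector-kafka only... -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.10.0</version>
</dependency>
<!-- ...and do NOT also add flink-sql-connector-kafka, which is meant
     for Table API & SQL programs and bundles its own (shaded) Kafka
     classes; having both on the classpath causes conflicts like the
     AbstractMethodError above. -->
```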
> On 26 May 2020, at 15:02, Aljoscha Krettek <al...@apache.org> wrote:
>
> I think what might be happening is that you're mixing dependencies from the flink-sql-connector-kafka and the proper flink-connector-kafka that should be used with the DataStream API. Could that be the case?
>
> Best,
> Aljoscha
>
> On 25.05.20 19:18, Piotr Nowojski wrote:
>> Hi,
>> It would be helpful if you could provide the full stack trace, and tell us which Flink version and which Kafka connector version you are using.
>> It sounds like either a dependency convergence error (mixing Kafka dependencies or various versions of flink-connector-kafka inside a single job/jar) or some shading issue. Can you check your project for such issues (with the `mvn dependency:tree` command [1])?
>> Also, what's a bit suspicious to me is the return type:
>>> Lorg/apache/flink/kafka/shaded/org/apache/kafka/clients/producer/ProducerRecord;
>> I'm not sure, but I was not aware that we shade the Kafka dependency in our connectors. Are you manually shading something?
>> Piotrek
>> [1] https://maven.apache.org/plugins/maven-dependency-plugin/examples/resolving-conflicts-using-the-dependency-tree.html <https://maven.apache.org/plugins/maven-dependency-plugin/examples/resolving-conflicts-using-the-dependency-tree.html>
>>> On 22 May 2020, at 15:34, wanglei2@geekplus.com.cn wrote:
>>>
>>>
>>> public class MyKafkaSerializationSchema implements KafkaSerializationSchema<Tuple2<String, String>> {
>>>     @Override
>>>     public ProducerRecord<byte[], byte[]> serialize(Tuple2<String, String> o, @Nullable Long aLong) {
>>>         return new ProducerRecord<>(o.f0, o.f1.getBytes(StandardCharsets.UTF_8));
>>>     }
>>> }
>>> FlinkKafkaProducer<Tuple2<String, String>> producer = new FlinkKafkaProducer<>(
>>>         "default", new MyKafkaSerializationSchema(), prop2, Semantic.EXACTLY_ONCE);
>>>
>>> But there's an error when running:
>>>
>>> java.lang.AbstractMethodError: com.geekplus.flinketl.schema.MyKafkaSerializationSchema.serialize(Ljava/lang/Object;Ljava/lang/Long;)Lorg/apache/flink/kafka/shaded/org/apache/kafka/clients/producer/ProducerRecord;
>>>
>>> Any suggestion on this?
>>>
>>> Thanks,
>>> Lei
>>> wanglei2@geekplus.com.cn <ma...@geekplus.com.cn>
>
Re: java.lang.AbstractMethodError when implementing KafkaSerializationSchema
Posted by Aljoscha Krettek <al...@apache.org>.
I think what might be happening is that you're mixing dependencies from
the flink-sql-connector-kafka and the proper flink-connector-kafka that
should be used with the DataStream API. Could that be the case?
Best,
Aljoscha
On 25.05.20 19:18, Piotr Nowojski wrote:
> Hi,
>
> It would be helpful if you could provide the full stack trace, and tell us which Flink version and which Kafka connector version you are using.
>
> It sounds like either a dependency convergence error (mixing Kafka dependencies or various versions of flink-connector-kafka inside a single job/jar) or some shading issue. Can you check your project for such issues (with the `mvn dependency:tree` command [1])?
>
> Also, what's a bit suspicious to me is the return type:
>
>> Lorg/apache/flink/kafka/shaded/org/apache/kafka/clients/producer/ProducerRecord;
>
> I'm not sure, but I was not aware that we shade the Kafka dependency in our connectors. Are you manually shading something?
>
> Piotrek
>
> [1] https://maven.apache.org/plugins/maven-dependency-plugin/examples/resolving-conflicts-using-the-dependency-tree.html <https://maven.apache.org/plugins/maven-dependency-plugin/examples/resolving-conflicts-using-the-dependency-tree.html>
>
>> On 22 May 2020, at 15:34, wanglei2@geekplus.com.cn wrote:
>>
>>
>> public class MyKafkaSerializationSchema implements KafkaSerializationSchema<Tuple2<String, String>> {
>>     @Override
>>     public ProducerRecord<byte[], byte[]> serialize(Tuple2<String, String> o, @Nullable Long aLong) {
>>         return new ProducerRecord<>(o.f0, o.f1.getBytes(StandardCharsets.UTF_8));
>>     }
>> }
>> FlinkKafkaProducer<Tuple2<String, String>> producer = new FlinkKafkaProducer<>(
>>         "default", new MyKafkaSerializationSchema(), prop2, Semantic.EXACTLY_ONCE);
>>
>> But there's an error when running:
>>
>> java.lang.AbstractMethodError: com.geekplus.flinketl.schema.MyKafkaSerializationSchema.serialize(Ljava/lang/Object;Ljava/lang/Long;)Lorg/apache/flink/kafka/shaded/org/apache/kafka/clients/producer/ProducerRecord;
>>
>> Any suggestion on this?
>>
>> Thanks,
>> Lei
>> wanglei2@geekplus.com.cn <ma...@geekplus.com.cn>
>
Re: java.lang.AbstractMethodError when implementing KafkaSerializationSchema
Posted by Piotr Nowojski <pi...@ververica.com>.
Hi,
It would be helpful if you could provide the full stack trace, and tell us which Flink version and which Kafka connector version you are using.
It sounds like either a dependency convergence error (mixing Kafka dependencies or various versions of flink-connector-kafka inside a single job/jar) or some shading issue. Can you check your project for such issues (with the `mvn dependency:tree` command [1])?
Also, what's a bit suspicious to me is the return type:
> Lorg/apache/flink/kafka/shaded/org/apache/kafka/clients/producer/ProducerRecord;
I'm not sure, but I was not aware that we shade the Kafka dependency in our connectors. Are you manually shading something?
Piotrek
[1] https://maven.apache.org/plugins/maven-dependency-plugin/examples/resolving-conflicts-using-the-dependency-tree.html <https://maven.apache.org/plugins/maven-dependency-plugin/examples/resolving-conflicts-using-the-dependency-tree.html>
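Piotr's point about the erased signature in the error is worth unpacking. The JVM dispatches the interface call through the erased method serialize(Object, Long), which the compiler normally satisfies with a synthetic bridge method in the implementing class. If the interface loaded at runtime expects a different type in the signature (here, a shaded ProducerRecord) than the one the class was compiled against, no matching method exists and the call fails with AbstractMethodError. A minimal, self-contained sketch (plain Java, no Flink or Kafka dependencies; the Schema and StringSchema names are made up for illustration) showing the bridge method javac generates:

```java
import java.lang.reflect.Method;

public class BridgeDemo {
    // Hypothetical stand-in for KafkaSerializationSchema<T>.
    interface Schema<T> {
        byte[] serialize(T value, Long timestamp);
    }

    // Hypothetical stand-in for MyKafkaSerializationSchema.
    static class StringSchema implements Schema<String> {
        @Override
        public byte[] serialize(String value, Long timestamp) {
            return value.getBytes();
        }
    }

    public static void main(String[] args) {
        // javac emits a synthetic bridge method serialize(Object, Long)
        // that casts and forwards to serialize(String, Long). Interface
        // calls resolve against this erased signature, so a signature
        // that refers to a different (e.g. shaded) class finds no
        // implementation and triggers AbstractMethodError.
        for (Method m : StringSchema.class.getDeclaredMethods()) {
            System.out.println((m.isBridge() ? "bridge:   " : "declared: ") + m);
        }
    }
}
```

Listing the declared methods shows both the hand-written serialize(String, Long) and the compiler-generated bridge serialize(Object, Long), which is exactly the erased shape appearing in the stack trace above.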
> On 22 May 2020, at 15:34, wanglei2@geekplus.com.cn wrote:
>
>
> public class MyKafkaSerializationSchema implements KafkaSerializationSchema<Tuple2<String, String>> {
>     @Override
>     public ProducerRecord<byte[], byte[]> serialize(Tuple2<String, String> o, @Nullable Long aLong) {
>         return new ProducerRecord<>(o.f0, o.f1.getBytes(StandardCharsets.UTF_8));
>     }
> }
> FlinkKafkaProducer<Tuple2<String, String>> producer = new FlinkKafkaProducer<>(
>         "default", new MyKafkaSerializationSchema(), prop2, Semantic.EXACTLY_ONCE);
>
> But there's an error when running:
>
> java.lang.AbstractMethodError: com.geekplus.flinketl.schema.MyKafkaSerializationSchema.serialize(Ljava/lang/Object;Ljava/lang/Long;)Lorg/apache/flink/kafka/shaded/org/apache/kafka/clients/producer/ProducerRecord;
>
> Any suggestion on this?
>
> Thanks,
> Lei
> wanglei2@geekplus.com.cn <ma...@geekplus.com.cn>