Posted to user@flink.apache.org by Nick Bendtner <bu...@gmail.com> on 2020/05/07 00:34:49 UTC

Using flink-connector-kafka-1.9.1 with flink-core-1.7.2

Hi guys,
I am using Flink 1.7.2. I need to deserialize data from Kafka into
ConsumerRecords, so I decided to upgrade flink-connector-kafka to 1.9.1,
which adds support for ConsumerRecord. We use child-first class loading.
However, I seem to have a compatibility issue, as I get this exception:
*Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/flink/api/common/ExecutionConfig$ClosureCleanerLevel*.

Any tricks to make this work without changing the version of flink-core?
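For reference, the mismatched setup described above would look roughly like this in a pom (a sketch; the exact artifact IDs and Scala suffix are assumed, since the original build file is not shown):

```xml
<!-- Sketch of the mismatched dependencies described above (versions from the
     thread, Scala suffix _2.11 assumed). flink-connector-kafka 1.9.1 is
     compiled against flink-core 1.9.x, which introduced
     ExecutionConfig$ClosureCleanerLevel; running it against flink-core 1.7.2,
     where that class does not exist, yields the NoClassDefFoundError. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-core</artifactId>
    <version>1.7.2</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.9.1</version>
</dependency>
```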


Best,
Nick.

Re: Using flink-connector-kafka-1.9.1 with flink-core-1.7.2

Posted by Arvid Heise <ar...@ververica.com>.
Hi Nick,

yes, you may be lucky that no involved classes have changed (much), but
there is no guarantee.
You could try to fiddle around and add the respective class
(*ClosureCleanerLevel*) from Flink 1.9 to your jar, but it's hacky at best.
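Before resorting to that, it can help to confirm that this class really is what is missing at runtime. A minimal probe (a hypothetical helper, not part of Flink):

```java
// Probes whether the class that flink-connector-kafka 1.9.x needs is
// loadable on the current classpath. It exists in flink-core 1.9+ only.
public class ClassProbe {

    static boolean hasClosureCleanerLevel() {
        try {
            Class.forName(
                "org.apache.flink.api.common.ExecutionConfig$ClosureCleanerLevel");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasClosureCleanerLevel()
            ? "flink-core provides ClosureCleanerLevel (1.9+)"
            : "ClosureCleanerLevel missing: flink-core is older than 1.9");
    }
}
```

Running this inside the job's main method (with the same classloader setup) shows whether the patched jar actually made the class visible.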

Another option is to bundle Flink 1.9 with your code if you cannot upgrade
the Flink cluster. That works, for example, when running on YARN.
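The bundling option can be sketched with the Maven Shade plugin (a sketch assuming a per-job deployment such as YARN; the key point is that the Flink 1.9 dependencies must not be marked `provided`, so they end up in the fat jar):

```xml
<!-- Sketch: build a fat jar that ships Flink 1.9.1 classes with the job, so
     child-first classloading resolves them before the cluster's 1.7.2 jars.
     Only reasonable for per-job deployments (e.g. YARN), as noted above. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals><goal>shade</goal></goals>
            <configuration>
                <filters>
                    <filter>
                        <!-- strip signature files, which break a shaded jar -->
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>META-INF/*.SF</exclude>
                            <exclude>META-INF/*.DSA</exclude>
                            <exclude>META-INF/*.RSA</exclude>
                        </excludes>
                    </filter>
                </filters>
            </configuration>
        </execution>
    </executions>
</plugin>
```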

On Thu, May 14, 2020 at 3:22 PM Nick Bendtner <bu...@gmail.com> wrote:

> Hi Arvid,
> I had no problems using the Flink Kafka connector 1.8.0 with flink-core
> 1.7.2.
>
> Best
> Nick
>
> On Thu, May 7, 2020 at 1:34 AM Arvid Heise <ar...@ververica.com> wrote:
>
>> Hi Nick,
>>
>> all Flink dependencies are only compatible with the same major version.
>>
>> You can work around it by checking out the code [1], manually setting the
>> dependency of the respective module to your flink-core version, and
>> reverting all changes that do not compile. But there is no guarantee that
>> this will ultimately work, as you are essentially backporting changes to
>> the old version.
>>
>> [1] https://github.com/AHeise/flink
>>
>> On Thu, May 7, 2020 at 2:35 AM Nick Bendtner <bu...@gmail.com> wrote:
>>
>>> Hi guys,
>>> I am using Flink 1.7.2. I need to deserialize data from Kafka into
>>> ConsumerRecords, so I decided to upgrade flink-connector-kafka to 1.9.1,
>>> which adds support for ConsumerRecord. We use child-first class loading.
>>> However, I seem to have a compatibility issue, as I get this exception:
>>> *Exception in thread "main" java.lang.NoClassDefFoundError:
>>> org/apache/flink/api/common/ExecutionConfig$ClosureCleanerLevel*.
>>>
>>> Any tricks to make this work without changing the version of flink-core?
>>>
>>>
>>> Best,
>>> Nick.
>>>
>>>
>>
>

-- 

Arvid Heise | Senior Java Developer

<https://www.ververica.com/>

Follow us @VervericaData

--

Join Flink Forward <https://flink-forward.org/> - The Apache Flink
Conference

Stream Processing | Event Driven | Real Time

--

Ververica GmbH | Invalidenstrasse 115, 10115 Berlin, Germany

--
Ververica GmbH
Registered at Amtsgericht Charlottenburg: HRB 158244 B
Managing Directors: Timothy Alexander Steinert, Yip Park Tung Jason, Ji
(Toni) Cheng
