Posted to dev@flink.apache.org by "Roel Van der Paal (JIRA)" <ji...@apache.org> on 2018/08/31 11:20:00 UTC

[jira] [Created] (FLINK-10272) using Avro's DateConversion causes ClassNotFoundException on Hadoop bundle

Roel Van der Paal created FLINK-10272:
-----------------------------------------

             Summary: using Avro's DateConversion causes ClassNotFoundException on Hadoop bundle
                 Key: FLINK-10272
                 URL: https://issues.apache.org/jira/browse/FLINK-10272
             Project: Flink
          Issue Type: Improvement
    Affects Versions: 1.6.0, 1.5.3
            Reporter: Roel Van der Paal


When using org.apache.avro.data.TimeConversions.DateConversion()
in a job on a Hadoop-bundled Flink cluster, the job fails with a ClassNotFoundException for org.joda.time.ReadablePartial.

* it only occurs on the Hadoop-bundled Flink distribution, not on the one without Hadoop
* it occurs for both versions 1.5.3 and 1.6.0 (I did not check the other versions)
* the likely cause is that org.apache.avro:avro is included in the flink-shaded-hadoop2-uber-x.x.x.jar, but joda-time is not (joda-time is an optional dependency of org.apache.avro:avro)
* adding joda-time to the Flink lib folder fixes the problem
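For reference, a minimal sketch of the kind of code that reproduces the error. It assumes Avro 1.8.x, where TimeConversions.DateConversion is backed by Joda-Time's LocalDate; the ticket does not show the reporter's exact usage, so registering the conversion with GenericData is an illustrative example, not the actual job code:

import org.apache.avro.data.TimeConversions;
import org.apache.avro.generic.GenericData;

public class DateConversionExample {
    public static void main(String[] args) {
        // Registering the conversion forces loading of Joda-Time classes
        // (the Avro 1.8.x DateConversion maps the "date" logical type to
        // org.joda.time.LocalDate). On the Hadoop-bundled distribution this
        // throws ClassNotFoundException: org.joda.time.ReadablePartial,
        // because joda-time is not part of flink-shaded-hadoop2-uber-x.x.x.jar.
        GenericData.get().addLogicalTypeConversion(
                new TimeConversions.DateConversion());
    }
}

With a joda-time jar placed in the Flink lib folder (or bundled into the job jar), the same registration succeeds.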

The proposed solution is to either add joda-time to the flink-shaded-hadoop2-uber-x.x.x.jar or remove org.apache.avro:avro from it.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)