Posted to issues@flink.apache.org by "Robert Metzger (JIRA)" <ji...@apache.org> on 2019/02/28 15:40:00 UTC

[jira] [Updated] (FLINK-10272) using Avro's DateConversion causes ClassNotFoundException on Hadoop bundle

     [ https://issues.apache.org/jira/browse/FLINK-10272?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Robert Metzger updated FLINK-10272:
-----------------------------------
    Component/s: Formats (JSON, Avro, Parquet, ORC, SequenceFile)

> using Avro's DateConversion causes ClassNotFoundException on Hadoop bundle
> --------------------------------------------------------------------------
>
>                 Key: FLINK-10272
>                 URL: https://issues.apache.org/jira/browse/FLINK-10272
>             Project: Flink
>          Issue Type: Improvement
>          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
>    Affects Versions: 1.5.3, 1.6.0
>            Reporter: Roel Van der Paal
>            Priority: Minor
>
> When using org.apache.avro.data.TimeConversions.DateConversion()
> in a job on a Hadoop-bundled Flink cluster, the job fails with a ClassNotFoundException for org.joda.time.ReadablePartial:
> * it only occurs on the Hadoop-bundled Flink cluster, not on the one without Hadoop
> * it occurs for both versions 1.5.3 and 1.6.0 (I did not check the other versions)
> * this is probably because org.apache.avro:avro is included in flink-shaded-hadoop2-uber-x.x.x.jar, but joda-time is not (joda-time is an optional dependency of org.apache.avro:avro)
> * adding joda-time to the Flink lib folder fixes the problem
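As an alternative to changing the cluster's lib folder, a user-side workaround is to bundle joda-time with the job jar itself. A minimal sketch of the Maven dependency, assuming a Maven-built job; the version 2.9.9 is illustrative and should match what the bundled Avro 1.8.x expects:

```xml
<!-- Hedged sketch: version is an example, align it with the Avro on your cluster. -->
<dependency>
  <groupId>joda-time</groupId>
  <artifactId>joda-time</artifactId>
  <version>2.9.9</version>
</dependency>
```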
> The proposed solution is either to add joda-time to flink-shaded-hadoop2-uber-x.x.x.jar or to remove org.apache.avro:avro from it.
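To confirm which of the two classpath situations a cluster is in before submitting the job, a small stdlib-only probe can check for the class named in the stack trace. This is a hedged sketch (the class and error names come from the report above; `JodaProbe` is a hypothetical helper, not part of Flink or Avro):

```java
// Hedged sketch: probes whether joda-time is on the classpath.
// org.joda.time.ReadablePartial is the class the reported
// ClassNotFoundException names; Avro 1.8.x's DateConversion
// references it, so its absence predicts the failure.
public class JodaProbe {
    static boolean jodaTimePresent() {
        try {
            Class.forName("org.joda.time.ReadablePartial");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        if (jodaTimePresent()) {
            System.out.println("joda-time found: DateConversion should load");
        } else {
            System.out.println("joda-time missing: DateConversion will fail");
        }
    }
}
```

Running this with the same classpath as the job (e.g. against flink-shaded-hadoop2-uber-x.x.x.jar) shows whether the optional dependency made it into the bundle.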



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)