Posted to issues@flink.apache.org by "Aljoscha Krettek (JIRA)" <ji...@apache.org> on 2019/06/13 12:24:00 UTC

[jira] [Updated] (FLINK-12163) Use correct ClassLoader for Hadoop Writable TypeInfo

     [ https://issues.apache.org/jira/browse/FLINK-12163?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aljoscha Krettek updated FLINK-12163:
-------------------------------------
    Summary: Use correct ClassLoader for Hadoop Writable TypeInfo  (was: Hadoop Compatibility, could not load the TypeInformation due to incorrect classloader)

> Use correct ClassLoader for Hadoop Writable TypeInfo
> ----------------------------------------------------
>
>                 Key: FLINK-12163
>                 URL: https://issues.apache.org/jira/browse/FLINK-12163
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hadoop Compatibility
>    Affects Versions: 1.7.2, 1.8.0
>         Environment: Flink 1.5.6 standalone, Flink 1.7.2 standalone, 
> Hadoop 2.9.1 standalone
>            Reporter: morvenhuang
>            Assignee: arganzheng
>            Priority: Critical
>
> For Flink 1.5.6 and 1.7.2, I keep getting the following error when using Hadoop Compatibility:
> {code:java}
> Caused by: java.lang.RuntimeException: Could not load the TypeInformation for the class 'org.apache.hadoop.io.Writable'. You may be missing the 'flink-hadoop-compatibility' dependency.
> at org.apache.flink.api.java.typeutils.TypeExtractor.createHadoopWritableTypeInfo(TypeExtractor.java:2140)
> at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1759)
> at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1701)
> at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:956)
> at org.apache.flink.api.java.typeutils.TypeExtractor.createSubTypesInfo(TypeExtractor.java:1176)
> at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:889)
> at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:839)
> at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:805)
> at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:798)
> at org.apache.flink.api.common.typeinfo.TypeHint.<init>(TypeHint.java:50)
> {code}
> Packaging the flink-hadoop-compatibility dependency into a fat jar together with my code doesn't help.
> The error doesn't go away until I copy the flink-hadoop-compatibility jar to FLINK_HOME/lib.
> Looking at TypeExtractor#createHadoopWritableTypeInfo, this appears to be a classloader issue:
> {code:java}
> Class<?> typeInfoClass;
> try {
>     typeInfoClass = Class.forName(HADOOP_WRITABLE_TYPEINFO_CLASS, false, TypeExtractor.class.getClassLoader());
> }
> catch (ClassNotFoundException e) {
>     throw new RuntimeException("Could not load the TypeInformation for the class '"
>             + HADOOP_WRITABLE_CLASS + "'. You may be missing the 'flink-hadoop-compatibility' dependency.");
> }
> {code}
>  
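The snippet above resolves the class only against TypeExtractor's own classloader, which cannot see classes shipped inside a user fat jar. The lookup order the updated summary asks for can be sketched as follows. This is an illustrative helper, not Flink's actual patch; the names ClassLoaderSketch and loadWithFallback are hypothetical. It tries a supplied user-code classloader first and falls back to the declaring class's loader:

```java
// Hypothetical sketch of the intended lookup order (not Flink's actual fix):
// prefer the user-code classloader, fall back to this class's own loader.
public class ClassLoaderSketch {

    static Class<?> loadWithFallback(String className, ClassLoader userCodeClassLoader)
            throws ClassNotFoundException {
        try {
            // First attempt: the user-code classloader, which can see
            // classes packaged into the user's fat jar.
            return Class.forName(className, false, userCodeClassLoader);
        } catch (ClassNotFoundException e) {
            // Fallback: the loader that loaded this class (e.g. the one
            // that sees FLINK_HOME/lib in a standalone setup).
            return Class.forName(className, false, ClassLoaderSketch.class.getClassLoader());
        }
    }

    public static void main(String[] args) throws Exception {
        // Demo with a class that is always on the classpath.
        Class<?> clazz = loadWithFallback(
                "java.lang.String",
                Thread.currentThread().getContextClassLoader());
        System.out.println(clazz.getName());
    }
}
```

With such an order, a flink-hadoop-compatibility jar packaged only inside the user fat jar would be found via the user-code classloader, and copying it into FLINK_HOME/lib would no longer be required.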



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)