Posted to issues@flink.apache.org by "luoyuxia (Jira)" <ji...@apache.org> on 2022/11/04 03:32:00 UTC

[jira] [Commented] (FLINK-29875) Can not find JobConf when using Hive Connector

    [ https://issues.apache.org/jira/browse/FLINK-29875?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17628646#comment-17628646 ] 

luoyuxia commented on FLINK-29875:
----------------------------------

I think it's expected. When using YARN to start the cluster, the MapReduce classes need to be added to the classpath, e.g. via yarn-site.xml.

But it would be better to have a doc for it.
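For reference, one possible fix (a sketch only; paths assume the default Apache Hadoop layout under the usual HADOOP_* home variables, adjust to your install) is to make sure {{yarn.application.classpath}} in yarn-site.xml includes the MapReduce share directories, which contain {{hadoop-mapreduce-client-core}} and thus {{org.apache.hadoop.mapred.JobConf}}:

{code:xml}
<!-- yarn-site.xml: example classpath including the MapReduce share dirs.
     Paths assume the standard Apache Hadoop distribution layout. -->
<property>
  <name>yarn.application.classpath</name>
  <value>
    $HADOOP_CONF_DIR,
    $HADOOP_COMMON_HOME/share/hadoop/common/*,
    $HADOOP_COMMON_HOME/share/hadoop/common/lib/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/lib/*,
    $HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,
    $HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*
  </value>
</property>
{code}

Alternatively, exporting the full Hadoop classpath on the client before submitting the Flink job usually works too:

{code:bash}
# Make the Hadoop (including MapReduce) classes visible to Flink
export HADOOP_CLASSPATH=`hadoop classpath`
{code}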

> Can not find JobConf when using Hive Connector 
> -----------------------------------------------
>
>                 Key: FLINK-29875
>                 URL: https://issues.apache.org/jira/browse/FLINK-29875
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hive
>    Affects Versions: 1.17.0, 1.16.1
>            Reporter: Shengkai Fang
>            Priority: Major
>
> {code:java}
> org.apache.flink.runtime.JobException: Cannot instantiate the coordinator for operator Source: src[1] -> Sink: Collect table sink
>   at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.initialize(ExecutionJobVertex.java:229)
>   at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.initializeJobVertex(DefaultExecutionGraph.java:901)
>   at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.initializeJobVertices(DefaultExecutionGraph.java:891)
>   at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.attachJobGraph(DefaultExecutionGraph.java:848)
>   at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.attachJobGraph(DefaultExecutionGraph.java:830)
>   at org.apache.flink.runtime.executiongraph.DefaultExecutionGraphBuilder.buildGraph(DefaultExecutionGraphBuilder.java:203)
>   at org.apache.flink.runtime.scheduler.DefaultExecutionGraphFactory.createAndRestoreExecutionGraph(DefaultExecutionGraphFactory.java:156)
>   at org.apache.flink.runtime.scheduler.SchedulerBase.createAndRestoreExecutionGraph(SchedulerBase.java:361)
>   at org.apache.flink.runtime.scheduler.SchedulerBase.<init>(SchedulerBase.java:206)
>   at org.apache.flink.runtime.scheduler.DefaultScheduler.<init>(DefaultScheduler.java:134)
>   at org.apache.flink.runtime.scheduler.DefaultSchedulerFactory.createInstance(DefaultSchedulerFactory.java:152)
>   at org.apache.flink.runtime.jobmaster.DefaultSlotPoolServiceSchedulerFactory.createScheduler(DefaultSlotPoolServiceSchedulerFactory.java:119)
>   at org.apache.flink.runtime.jobmaster.JobMaster.createScheduler(JobMaster.java:369)
>   at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:346)
>   at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.internalCreateJobMasterService(DefaultJobMasterServiceFactory.java:123)
>   at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.lambda$createJobMasterService$0(DefaultJobMasterServiceFactory.java:95)
>   at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedSupplier$4(FunctionUtils.java:112)
>   ... 4 more
> Caused by: java.lang.NoClassDefFoundError: Lorg/apache/hadoop/mapred/JobConf;
>   at java.lang.Class.getDeclaredFields0(Native Method)
>   at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
>   at java.lang.Class.getDeclaredField(Class.java:2068)
>   at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1871)
>   at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:79)
>   at java.io.ObjectStreamClass$3.run(ObjectStreamClass.java:506)
>   at java.io.ObjectStreamClass$3.run(ObjectStreamClass.java:494)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:494)
>   at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:391)
>   at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:681)
>   at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1941)
>   at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1807)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2098)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1624)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2343)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2267)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2125)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1624)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2343)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2267)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2125)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1624)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2343)
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2267)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2125)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1624)
>   at java.io.ObjectInputStream.readObject(ObjectInputStream.java:464)
>   at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
>   at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:617)
>   at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:602)
>   at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:589)
>   at org.apache.flink.util.SerializedValue.deserializeValue(SerializedValue.java:67)
>   at org.apache.flink.runtime.operators.coordination.OperatorCoordinatorHolder.create(OperatorCoordinatorHolder.java:488)
>   at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.createOperatorCoordinatorHolder(ExecutionJobVertex.java:286)
>   at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.initialize(ExecutionJobVertex.java:223)
>   ... 20 more
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.JobConf
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>   ... 56 more (state=,code=0)
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)