Posted to issues@flink.apache.org by "Rong Rong (Jira)" <ji...@apache.org> on 2020/04/29 14:16:00 UTC

[jira] [Comment Edited] (FLINK-17386) Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.

    [ https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17094595#comment-17094595 ] 

Rong Rong edited comment on FLINK-17386 at 4/29/20, 2:15 PM:
-------------------------------------------------------------

I just created a quick fix for this and attached a PR. If you could help test the patch later, that would be super great! Thx



was (Author: rongr):
Hmm, that's not what I expected. I have 2 theories:

1. In the 1.10 implementation the SecurityContext was overridden using the same logic. The difference, however, is that the classloader used in 1.10 could've been the runtime classloader (SecurityUtils.class.getClassLoader()),
while in 1.11 the classloader is actually the service provider classloader (HadoopSecurityContextFactory.class.getClassLoader()).
  - This may be an issue, but I am not exactly sure (see the classloader sketch below).

2. In the 1.10 implementation the context installation catches {{LinkageError}} as well as {{ClassNotFoundException}}, which is a much broader catch clause, and {{NoClassDefFoundError}} actually extends {{LinkageError}} (see the catch-clause sketch below).
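
To make theory #1 concrete, here is a minimal standalone sketch (plain Java, not Flink code; {{ClassLoaderProbe}} is a hypothetical name) showing that whether a class resolves depends on which classloader you ask:

{code:java}
// Minimal sketch, not Flink code: class resolution is relative to the classloader
// you ask, which is the gist of theory #1.
public class ClassLoaderProbe {

    static boolean isLoadable(String className, ClassLoader cl) {
        try {
            // initialize = false: only resolve the class, do not run its static initializers
            Class.forName(className, false, cl);
            return true;
        } catch (ClassNotFoundException | LinkageError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String ugi = "org.apache.hadoop.security.UserGroupInformation";
        // In 1.10 the lookup would effectively go through something like
        // SecurityUtils.class.getClassLoader(); in 1.11 it is the factory's own loader
        // (HadoopSecurityContextFactory.class.getClassLoader()). Here we just probe two
        // loaders that exist in any JVM to show the answers can differ per loader.
        System.out.println(isLoadable(ugi, ClassLoaderProbe.class.getClassLoader()));
        System.out.println(isLoadable(ugi, Thread.currentThread().getContextClassLoader()));
    }
}
{code}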

I am more convinced that #2 is the root cause; I would try to create a quick fix for this and attach a PR. If you could help test the patch later, that would be super great!
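
For reference, a minimal standalone sketch (plain Java, not the actual HadoopSecurityContextFactory code; {{CatchClauseDemo}} is a hypothetical name) of why the catch clause matters: {{NoClassDefFoundError}} extends {{LinkageError}}, which extends {{Error}}, so a {{catch (Exception e)}} never sees it.

{code:java}
// Minimal sketch, not Flink code: catch (Exception e) does not catch
// NoClassDefFoundError, while the broader catch (LinkageError e) does.
public class CatchClauseDemo {
    public static void main(String[] args) {
        try {
            // Simulate what the JVM does once a class's static initializer has failed:
            // any later reference to that class surfaces as NoClassDefFoundError.
            throw new NoClassDefFoundError(
                    "Could not initialize class org.apache.hadoop.security.UserGroupInformation");
        } catch (Exception e) {
            // Never reached: NoClassDefFoundError extends LinkageError extends Error.
            System.out.println("caught as Exception: " + e);
        } catch (LinkageError e) {
            // Reached: this is the broader clause the 1.10 code path had; catching
            // Throwable, as suggested in the issue description, would also cover it.
            System.out.println("caught as LinkageError: " + e);
        }
    }
}
{code}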


> Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.
> --------------------------------------------------------------------------------------------
>
>                 Key: FLINK-17386
>                 URL: https://issues.apache.org/jira/browse/FLINK-17386
>             Project: Flink
>          Issue Type: Bug
>            Reporter: Wenlong Lyu
>            Assignee: Rong Rong
>            Priority: Major
>              Labels: pull-request-available
>
> java.io.IOException: Process execution failed due error. Error output:java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
>     at org.apache.flink.runtime.security.contexts.HadoopSecurityContextFactory.createContext(HadoopSecurityContextFactory.java:59)
>     at org.apache.flink.runtime.security.SecurityUtils.installContext(SecurityUtils.java:92)
>     at org.apache.flink.runtime.security.SecurityUtils.install(SecurityUtils.java:60)
>     at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:964)
>
>     at com.alibaba.flink.vvr.util.AutoClosableProcess$AutoClosableProcessBuilder.runBlocking(AutoClosableProcess.java:144)
>     at com.alibaba.flink.vvr.util.AutoClosableProcess$AutoClosableProcessBuilder.runBlocking(AutoClosableProcess.java:126)
>     at com.alibaba.flink.vvr.VVRCompileTest.runSingleJobCompileCheck(VVRCompileTest.java:173)
>     at com.alibaba.flink.vvr.VVRCompileTest.lambda$runJobsCompileCheck$0(VVRCompileTest.java:101)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1147)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:622)
>     at java.lang.Thread.run(Thread.java:834)
> I think it is because an exception is thrown in the static code block of UserGroupInformation. Should we catch Throwable instead of Exception in HadoopSecurityContextFactory#createContext?
> [~rongr] what do you think?



--
This message was sent by Atlassian Jira
(v8.3.4#803005)