Posted to issues@flink.apache.org by "Robert Metzger (JIRA)" <ji...@apache.org> on 2019/02/28 15:21:01 UTC

[jira] [Updated] (FLINK-9029) Getting write permission error from HDFS after updating flink-1.4.0 to flink-1.4.2

     [ https://issues.apache.org/jira/browse/FLINK-9029?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Robert Metzger updated FLINK-9029:
----------------------------------
    Component/s: FileSystems

> Getting write permission error from HDFS after updating flink-1.4.0 to flink-1.4.2
> ----------------------------------------------------------------------------------
>
>                 Key: FLINK-9029
>                 URL: https://issues.apache.org/jira/browse/FLINK-9029
>             Project: Flink
>          Issue Type: Bug
>          Components: FileSystems
>    Affects Versions: 1.4.1, 1.4.2
>         Environment: * Flink-1.4.2 (Flink-1.4.1)
>  * Hadoop 2.6.0-cdh5.13.0 with 4 nodes in service; security is off
>  * Ubuntu 16.04.3 LTS
>  * Java 8
>            Reporter: Mohammad Abareghi
>            Priority: Major
>
> *Environment*
>  * Flink-1.4.2
>  * Hadoop 2.6.0-cdh5.13.0 with 4 nodes in service; security is off
>  * Ubuntu 16.04.3 LTS
>  * Java 8
>  
> *Description*
> I have a Java job in flink-1.4.0 which writes to a specific path on HDFS. After updating to flink-1.4.2, I get the following error from Hadoop complaining that the user doesn't have write permission to the given path:
> {code:java}
> WARN org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:xng (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=user1, access=WRITE, inode="/user":hdfs:hadoop:drwxr-xr-x
> {code}
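> A minimal diagnostic sketch (not part of the failing job; the class name and target path below are just placeholders) that prints which user and filesystem the Hadoop client on the classpath resolves, so the behaviour of the 1.4.0 and 1.4.2 uber jars can be compared:
> {code:java}
> // Diagnostic sketch, assuming it is run with the same lib/ classpath as the cluster.
> // It prints the user identity and default filesystem the Hadoop client resolves,
> // then attempts a small write similar to the failing job. The path is a placeholder.
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.security.UserGroupInformation;
>
> public class HdfsUserCheck {
>     public static void main(String[] args) throws Exception {
>         Configuration conf = new Configuration();
>         System.out.println("current user : " + UserGroupInformation.getCurrentUser());
>         System.out.println("login user   : " + UserGroupInformation.getLoginUser());
>         System.out.println("fs.defaultFS : " + conf.get("fs.defaultFS"));
>
>         // Try a small write; this should fail the same way as the job if the
>         // resolved user lacks write access to the target directory.
>         FileSystem fs = FileSystem.get(conf);
>         Path target = new Path("/tmp/flink-write-check");  // placeholder path
>         fs.mkdirs(target);
>         System.out.println("created " + target + " owned by "
>                 + fs.getFileStatus(target).getOwner());
>     }
> }
> {code}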
> *NOTE*:
>  * If I run the same job on flink-1.4.0, the error disappears regardless of which Flink version (1.4.0 or 1.4.2) the job's dependencies are built against.
>  * Also, if I run the job's main method from my IDE with the same parameters, I don't get the above error.
> *NOTE*:
> It seems the problem is somehow in {{flink-1.4.2/lib/flink-shaded-hadoop2-uber-1.4.2.jar}}: if I replace it with {{flink-1.4.0/lib/flink-shaded-hadoop2-uber-1.4.0.jar}}, restart the cluster, and run my job (Flink topology), the error does not appear.
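> With SIMPLE authentication the Hadoop client also honours the HADOOP_USER_NAME environment variable (or system property) before falling back to the OS account, so, as an assumption about the cause rather than a confirmed fix, printing it next to the resolved user may show whether the 1.4.2 uber jar ends up with a different identity than the 1.4.0 one:
> {code:java}
> // Hedged check, assuming the user resolution differs between the two uber jars
> // (SIMPLE auth): UserGroupInformation consults HADOOP_USER_NAME before the OS user.
> System.out.println("HADOOP_USER_NAME (env)  : " + System.getenv("HADOOP_USER_NAME"));
> System.out.println("HADOOP_USER_NAME (prop) : " + System.getProperty("HADOOP_USER_NAME"));
> System.out.println("resolved HDFS user      : "
>         + org.apache.hadoop.security.UserGroupInformation.getCurrentUser().getUserName());
> {code}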



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)