Posted to hdfs-dev@hadoop.apache.org by "zhiyong zhang (JIRA)" <ji...@apache.org> on 2009/07/09 23:26:22 UTC

[jira] Created: (HDFS-481) hadoop-version is not recognized if the ant command is run from src/contrib/ or from src/contrib/hdfsproxy

hadoop-version is not recognized if the ant command is run from src/contrib/ or from src/contrib/hdfsproxy
-----------------------------------------------------------------------------------------------------

                 Key: HDFS-481
                 URL: https://issues.apache.org/jira/browse/HDFS-481
             Project: Hadoop HDFS
          Issue Type: Bug
          Components: contrib/hdfsproxy
    Affects Versions: 0.21.0
            Reporter: zhiyong zhang
            Assignee: zhiyong zhang


If the ant command is run from $HADOOP_HDFS_HOME, hadoop-version is passed to the contrib builds through subant. But if it is run from src/contrib or src/contrib/hdfsproxy, hadoop-version is not recognized.
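
As a workaround (assuming the build property keeps the name used in this report), the version can be passed explicitly when invoking ant from the contrib tree, e.g.:

    cd src/contrib/hdfsproxy
    ant -Dhadoop-version=0.21.0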

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


[jira] Resolved: (HDFS-481) Bug Fixes + HdfsProxy to use proxy user to impersonate the real user

Posted by "Tsz Wo (Nicholas), SZE (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HDFS-481?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tsz Wo (Nicholas), SZE resolved HDFS-481.
-----------------------------------------

    Resolution: Fixed

I have also tested it locally.  It worked fine.

I have committed this.  Thanks, Srikanth!

> Bug Fixes + HdfsProxy to use proxy user to impersonate the real user
> --------------------------------------------------------------------
>
>                 Key: HDFS-481
>                 URL: https://issues.apache.org/jira/browse/HDFS-481
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: contrib/hdfsproxy
>    Affects Versions: 0.21.0
>            Reporter: zhiyong zhang
>            Assignee: Srikanth Sundarrajan
>         Attachments: HDFS-481-bp-y20.patch, HDFS-481-bp-y20s.patch, HDFS-481.out, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch, HDFS-481.patch
>
>
> Bugs:
> 1. hadoop-version is not recognized if the ant command is run from src/contrib/ or from src/contrib/hdfsproxy
> If the ant command is run from $HADOOP_HDFS_HOME, hadoop-version is passed to the contrib builds through subant. But if it is run from src/contrib or src/contrib/hdfsproxy, hadoop-version is not recognized.
> 2. LdapIpDirFilter.java is not thread-safe. userName, group, and paths are per-request values and cannot be class members (see the filter sketch below).
> 3. Addressed the following StackOverflowError:
> ERROR [org.apache.catalina.core.ContainerBase.[Catalina].[localhost].[/].[proxyForward]] Servlet.service() for servlet proxyForward threw exception
> java.lang.StackOverflowError
>         at org.apache.catalina.core.ApplicationHttpRequest.getAttribute(ApplicationHttpRequest.java:229)
>      This occurs when the target war (/target.war) does not exist: the forwarding war forwards to its parent context path /, which resolves to the forwarding war itself, causing an infinite loop. Added "HDFS Proxy Forward".equals(dstContext.getServletContextName()) to the if logic to break the loop (see the guard sketch below).
> 4. Kerberos credentials of the remote user are not available. HdfsProxy needs to act on behalf of the real user to service the requests (see the impersonation sketch below).
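
For item 2, a minimal sketch of the thread-safety fix, assuming a standard servlet Filter: per-request values move out of instance fields and into locals inside doFilter(). Only the class name comes from the issue; the helper and its logic are illustrative, not the actual patch.

    import java.io.IOException;
    import javax.servlet.*;

    public class LdapIpDirFilter implements Filter {
        // BROKEN (before): fields like this are shared by every thread
        // servicing a request, so concurrent requests clobber each other.
        // private String userName;

        public void init(FilterConfig conf) throws ServletException {
            // Configuration is read-only after init, so it may live in fields.
        }

        public void doFilter(ServletRequest req, ServletResponse resp,
                             FilterChain chain)
                throws IOException, ServletException {
            // FIXED (after): per-request state is a local variable.
            String userName = resolveUser(req); // hypothetical helper
            chain.doFilter(req, resp);
        }

        public void destroy() {
        }

        // Hypothetical stand-in for the per-request LDAP lookup.
        private String resolveUser(ServletRequest req) {
            return req.getRemoteAddr(); // placeholder only
        }
    }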
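
For item 3, a sketch of the loop guard. The quoted getServletContextName() check is from the issue; the servlet around it, including the target path, is illustrative.

    import java.io.IOException;
    import javax.servlet.RequestDispatcher;
    import javax.servlet.ServletContext;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ProxyForwardServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // When /target.war is not deployed, this lookup falls back to
            // the parent context "/", i.e. the forwarding webapp itself.
            ServletContext dstContext =
                    getServletContext().getContext("/target"); // illustrative
            if (dstContext == null
                    || "HDFS Proxy Forward".equals(dstContext.getServletContextName())) {
                // Forwarding to ourselves would recurse until the
                // StackOverflowError above; fail the request instead.
                resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                return;
            }
            RequestDispatcher dispatcher =
                    dstContext.getRequestDispatcher(req.getServletPath());
            dispatcher.forward(req, resp);
        }
    }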
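
For item 4, a sketch of proxy-user impersonation with Hadoop's UserGroupInformation API (createProxyUser plus doAs). Whether the committed patch wires HdfsProxy up exactly this way is an assumption; the user name and path are placeholders, and the proxy must be authorized on the NameNode via the hadoop.proxyuser.* keys.

    import java.security.PrivilegedExceptionAction;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class ImpersonationSketch {
        public static void main(String[] args) throws Exception {
            final Configuration conf = new Configuration();

            // The proxy's own authenticated (e.g. Kerberos) identity.
            UserGroupInformation loginUser = UserGroupInformation.getLoginUser();

            // Act on behalf of the real end user ("alice" is a placeholder).
            UserGroupInformation ugi =
                    UserGroupInformation.createProxyUser("alice", loginUser);

            boolean exists = ugi.doAs(new PrivilegedExceptionAction<Boolean>() {
                public Boolean run() throws Exception {
                    // File system calls in here execute as "alice".
                    FileSystem fs = FileSystem.get(conf);
                    return fs.exists(new Path("/user/alice"));
                }
            });
            System.out.println("path exists: " + exists);
        }
    }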

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.