Posted to common-user@hadoop.apache.org by jiang licht <li...@yahoo.com> on 2010/06/21 18:58:41 UTC

Re: user authentication: protect hdfs/job web interface from public

I've finally got time to work on this. Is there a sample illustrating this, and where should the filter class be deployed? Thanks for your help!

--- On Fri, 3/5/10, Jakob Homan <jh...@yahoo-inc.com> wrote:

From: Jakob Homan <jh...@yahoo-inc.com>
Subject: Re: user authentication: protect hdfs/job web interface from public
To: common-user@hadoop.apache.org
Date: Friday, March 5, 2010, 4:38 PM

Jiang-
   Hadoop has support for this via the hadoop.http.filter.initializers property, which allows you to set the name of a class to add as a standard servlet filter for the public-facing websites, such as:

<property>
  <name>hadoop.http.filter.initializers</name>
  <value>com.widgetcorp.HadoopFilter</value>
</property>

Each public-facing page will then be routed through this filter, which can reject the request. This was designed to be pluggable so it can work with different organizations' authentication schemes. Other web servers are not currently secured, but will be secured via Kerberos in the in-progress security release.
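
For illustration, here is a minimal sketch of what such a filter might look like. The class name matches the config above, and the isAuthorized() check is just a placeholder for whatever scheme your organization uses (SSO cookie, LDAP lookup, client certificate, IP allowlist, etc.):

import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Sketch of a servlet filter for the Hadoop web UIs.  Requests that
// fail the check are rejected before they reach the JSPs.
public class HadoopFilter implements Filter {

  public void init(FilterConfig filterConfig) throws ServletException {
    // No per-filter setup needed for this sketch.
  }

  public void doFilter(ServletRequest request, ServletResponse response,
      FilterChain chain) throws IOException, ServletException {
    HttpServletRequest req = (HttpServletRequest) request;
    HttpServletResponse resp = (HttpServletResponse) response;
    if (isAuthorized(req)) {
      chain.doFilter(request, response);  // let the request through
    } else {
      resp.sendError(HttpServletResponse.SC_UNAUTHORIZED);
    }
  }

  public void destroy() {
  }

  // Placeholder check: accept only requests the container has already
  // authenticated.  Replace with your organization's real scheme.
  private boolean isAuthorized(HttpServletRequest req) {
    return req.getRemoteUser() != null;
  }
}

The compiled class would need to be on the classpath of the daemons serving the web UIs (NameNode, JobTracker, etc.). Note that depending on your Hadoop release, the property may expect the name of an org.apache.hadoop.http.FilterInitializer subclass that registers the actual filter, rather than the filter class itself; check the FilterInitializer class in your version.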

-Jakob
Hadoop @ Yahoo!

jiang licht wrote:
> I guess I might need to check the Jetty documentation as well. Anyway, here is my question.
> 
> hdfs and map/reduce can be tracked via web interfaces, e.g. at ports 50070, 50030, etc. We don't want just anyone to be able to access these pages; only authorized people should be able to reach them, from anywhere, at any time. One way might be to wrap the JSPs, or rename the JSPs and put an access page in front of them (being careful not to break any links). But what is the best way to protect such web interfaces by adding some user authentication? Any suggestions?
> 
> Thanks,
> --
> 
> Michael