Posted to user@spark.apache.org by Jhon Anderson Cardenas Diaz <jh...@gmail.com> on 2018/01/09 22:23:50 UTC

How to create a security filter for the Spark UI in Spark on YARN

*Environment*:
AWS EMR, yarn cluster.

*Description*:
I am trying to use a Java servlet filter to protect access to the Spark UI by
setting the property spark.ui.filters. The problem is that when Spark runs in
YARN mode, that property is always overridden by Hadoop with the filter
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter:

spark.ui.filters: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter

And these properties are added automatically:


spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS: ip-x-x-x-226.eu-west-1.compute.internal
spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES: http://ip-x-x-x-226.eu-west-1.compute.internal:20888/proxy/application_xxxxxxxxxxxxx_xxxx
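
For context, below is a minimal sketch of the kind of servlet filter I am
trying to plug in. The class name (com.example.SimpleAuthFilter), the header
name (X-Auth-Token) and the "token" init parameter are all hypothetical; in
Spark 2.x the filter class is registered through spark.ui.filters and its init
parameters through spark.<filter class>.param.<name>=<value>, the same
mechanism the AmIpFilter parameters above use:

    import java.io.IOException;

    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical filter: rejects any request that does not carry the
    // expected token in the X-Auth-Token header.
    public class SimpleAuthFilter implements Filter {

        private String expectedToken;

        @Override
        public void init(FilterConfig filterConfig) throws ServletException {
            // Populated by Spark from
            // spark.com.example.SimpleAuthFilter.param.token
            expectedToken = filterConfig.getInitParameter("token");
        }

        @Override
        public void doFilter(ServletRequest request, ServletResponse response,
                FilterChain chain) throws IOException, ServletException {
            String token = ((HttpServletRequest) request).getHeader("X-Auth-Token");
            if (expectedToken != null && expectedToken.equals(token)) {
                // Authorized: continue down the filter chain.
                chain.doFilter(request, response);
            } else {
                ((HttpServletResponse) response)
                        .sendError(HttpServletResponse.SC_FORBIDDEN);
            }
        }

        @Override
        public void destroy() {
        }
    }

And this is roughly what I pass on submit (which then gets replaced as
described above):

    spark-submit \
      --conf spark.ui.filters=com.example.SimpleAuthFilter \
      --conf spark.com.example.SimpleAuthFilter.param.token=<secret> \
      ...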

Any suggestions on how to add a Java security filter so that Hadoop does not
override it, or alternatively how to configure security on the Hadoop side?

Thanks.