Posted to issues@spark.apache.org by "YUBI LEE (Jira)" <ji...@apache.org> on 2022/10/29 04:08:00 UTC

[jira] [Updated] (SPARK-40964) Cannot run spark history server with shaded hadoop jar

     [ https://issues.apache.org/jira/browse/SPARK-40964?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

YUBI LEE updated SPARK-40964:
-----------------------------
    Description: 
Since SPARK-33212, Spark uses the shaded client jars from Hadoop 3.x+.
In this situation, the Spark History Server fails to start if you run it with the shaded client jars and enable security using org.apache.hadoop.security.authentication.server.AuthenticationFilter.
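For reference, such a filter is typically enabled through `spark.ui.filters`, roughly as in the sketch below (the `type` parameter value is an illustrative assumption, not my actual configuration):

{code}
# spark-defaults.conf (sketch); the filter parameter is illustrative only
spark.ui.filters org.apache.hadoop.security.authentication.server.AuthenticationFilter
spark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.param.type simple
{code}

With the filter enabled, startup fails with the following exception: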


{code}
22/10/27 15:29:48 INFO AbstractConnector: Started ServerConnector@5ca1f591{HTTP/1.1, (http/1.1)}{0.0.0.0:18081}
22/10/27 15:29:48 INFO Utils: Successfully started service 'HistoryServerUI' on port 18081.
22/10/27 15:29:48 INFO ServerInfo: Adding filter to /: org.apache.hadoop.security.authentication.server.AuthenticationFilter
22/10/27 15:29:48 ERROR HistoryServer: Failed to bind HistoryServer
java.lang.IllegalStateException: class org.apache.hadoop.security.authentication.server.AuthenticationFilter is not a javax.servlet.Filter
        at org.sparkproject.jetty.servlet.FilterHolder.doStart(FilterHolder.java:103)
        at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
        at org.sparkproject.jetty.servlet.ServletHandler.lambda$initialize$0(ServletHandler.java:730)
        at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
        at java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:742)
        at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:647)
        at org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:755)
        at org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:379)
        at org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:910)
        at org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:288)
        at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
        at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:491)
        at org.apache.spark.ui.WebUI.$anonfun$bind$3(WebUI.scala:148)
        at org.apache.spark.ui.WebUI.$anonfun$bind$3$adapted(WebUI.scala:148)
        at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
        at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:148)
        at org.apache.spark.deploy.history.HistoryServer.bind(HistoryServer.scala:164)
        at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:310)
        at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
{code}


I think "AuthenticationFilter" in the shaded jar imports "org.apache.hadoop.shaded.javax.servlet.Filter", not "javax.servlet.Filter".

{code}
❯ grep -r org.apache.hadoop.shaded.javax.servlet.Filter *
Binary file hadoop-client-runtime-3.3.1.jar matches
{code}
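One way to confirm this is to print the class declaration with javap and check which Filter interface the bundled class implements (a sketch; the jar names/versions are assumptions, use whichever shaded client jars are actually on your classpath):

{code}
# Sketch: show the declaration of the bundled AuthenticationFilter.
❯ javap -cp hadoop-client-api-3.3.1.jar:hadoop-client-runtime-3.3.1.jar \
    org.apache.hadoop.security.authentication.server.AuthenticationFilter
{code}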

This mismatch is what causes the exception above.

I'm not sure what the best answer is.
A workaround is to avoid the Spark distribution pre-built for Apache Hadoop and instead set `HADOOP_HOME` or `SPARK_DIST_CLASSPATH` in spark-env.sh for the Spark History Server.
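For example, a minimal spark-env.sh sketch (the HADOOP_HOME path is an assumption; point it at your own Hadoop installation):

{code}
# spark-env.sh (sketch): run the History Server against a full Hadoop
# installation instead of the bundled shaded hadoop-client-* jars.
export HADOOP_HOME=/opt/hadoop                          # example path
export SPARK_DIST_CLASSPATH="$("${HADOOP_HOME}/bin/hadoop" classpath)"
{code}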

Possible options might be:
- Do not relocate "javax.servlet.Filter" in the Hadoop shaded jars, or
- Also relocate "javax.servlet.Filter" when Spark shades Jetty.


> Cannot run spark history server with shaded hadoop jar
> ------------------------------------------------------
>
>                 Key: SPARK-40964
>                 URL: https://issues.apache.org/jira/browse/SPARK-40964
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 3.2.2
>            Reporter: YUBI LEE
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org