Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2019/03/10 23:49:00 UTC

[jira] [Created] (SPARK-27122) YARN test failures in Java 9+

Sean Owen created SPARK-27122:
---------------------------------

             Summary: YARN test failures in Java 9+
                 Key: SPARK-27122
                 URL: https://issues.apache.org/jira/browse/SPARK-27122
             Project: Spark
          Issue Type: Sub-task
          Components: YARN
    Affects Versions: 3.0.0
            Reporter: Sean Owen


Currently, on Java 11, the YARN tests fail like this:

{code}
YarnSchedulerBackendSuite:
- RequestExecutors reflects node blacklist and is serializable
- Respect user filters when adding AM IP filter *** FAILED ***
  java.lang.ClassCastException: org.spark_project.jetty.servlet.ServletContextHandler cannot be cast to org.eclipse.jetty.servlet.ServletContextHandler
  at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
  at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
  at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
  at scala.collection.TraversableLike.map(TraversableLike.scala:237)
  at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
  at scala.collection.AbstractTraversable.map(Traversable.scala:108)
  at org.apache.spark.scheduler.cluster.YarnSchedulerBackend.$anonfun$addWebUIFilter$2(YarnSchedulerBackend.scala:183)
  at org.apache.spark.scheduler.cluster.YarnSchedulerBackend.$anonfun$addWebUIFilter$2$adapted(YarnSchedulerBackend.scala:174)
  at scala.Option.foreach(Option.scala:274)
  ...
{code}

This looks like a classpath issue: per the trace, the handler object is the shaded org.spark_project.jetty.servlet.ServletContextHandler, but addWebUIFilter casts it to the unshaded org.eclipse.jetty.servlet.ServletContextHandler, and the JVM treats the two as unrelated classes. It is probably ultimately related to the same classloader issues as https://issues.apache.org/jira/browse/SPARK-26839
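
For illustration, a minimal sketch of the failure mode (not the actual code at YarnSchedulerBackend.scala:183; the method name and signature here are assumptions):

{code}
// Sketch only: casting from the relocated (shaded) Jetty class to the
// unshaded one fails, because the JVM sees them as two unrelated classes.
import org.eclipse.jetty.servlet.ServletContextHandler

def toUnshadedHandlers(handlers: Seq[AnyRef]): Seq[ServletContextHandler] = {
  // If an element was instantiated as
  // org.spark_project.jetty.servlet.ServletContextHandler, this cast throws
  // the ClassCastException shown in the test output above.
  handlers.map(_.asInstanceOf[ServletContextHandler])
}
{code}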


