Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2020/05/08 08:43:00 UTC

[jira] [Resolved] (SPARK-30385) WebUI occasionally throw IOException on stop()

     [ https://issues.apache.org/jira/browse/SPARK-30385?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-30385.
---------------------------------
    Fix Version/s: 3.1.0
         Assignee: Kousuke Saruta
       Resolution: Fixed

> WebUI occasionally throw IOException on stop()
> ----------------------------------------------
>
>                 Key: SPARK-30385
>                 URL: https://issues.apache.org/jira/browse/SPARK-30385
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 3.0.0
>         Environment: MacOS 10.14.6
> Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_231
> Scala version 2.12.10
>            Reporter: wuyi
>            Assignee: Kousuke Saruta
>            Priority: Major
>             Fix For: 3.1.0
>
>
> Recently, while using ./bin/spark-shell, I have occasionally seen an IOException when trying to quit:
> {code:java}
> 19/12/30 17:33:21 WARN AbstractConnector:
> java.io.IOException: No such file or directory
>  at sun.nio.ch.NativeThread.signal(Native Method)
>  at sun.nio.ch.ServerSocketChannelImpl.implCloseSelectableChannel(ServerSocketChannelImpl.java:292)
>  at java.nio.channels.spi.AbstractSelectableChannel.implCloseChannel(AbstractSelectableChannel.java:234)
>  at java.nio.channels.spi.AbstractInterruptibleChannel.close(AbstractInterruptibleChannel.java:115)
>  at org.eclipse.jetty.server.ServerConnector.close(ServerConnector.java:368)
>  at org.eclipse.jetty.server.AbstractNetworkConnector.shutdown(AbstractNetworkConnector.java:105)
>  at org.eclipse.jetty.server.Server.doStop(Server.java:439)
>  at org.eclipse.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89) 
>  at org.apache.spark.ui.ServerInfo.stop(JettyUtils.scala:499)
>  at org.apache.spark.ui.WebUI.$anonfun$stop$2(WebUI.scala:173)
>  at org.apache.spark.ui.WebUI.$anonfun$stop$2$adapted(WebUI.scala:173)
>  at scala.Option.foreach(Option.scala:407)
>  at org.apache.spark.ui.WebUI.stop(WebUI.scala:173)
>  at org.apache.spark.ui.SparkUI.stop(SparkUI.scala:101)
>  at org.apache.spark.SparkContext.$anonfun$stop$6(SparkContext.scala:1972)
>  at org.apache.spark.SparkContext.$anonfun$stop$6$adapted(SparkContext.scala:1972)
>  at scala.Option.foreach(Option.scala:407)
>  at org.apache.spark.SparkContext.$anonfun$stop$5(SparkContext.scala:1972)
>  at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1357)
>  at org.apache.spark.SparkContext.stop(SparkContext.scala:1972)
>  at org.apache.spark.repl.Main$.$anonfun$doMain$3(Main.scala:79)
>  at org.apache.spark.repl.Main$.$anonfun$doMain$3$adapted(Main.scala:79)
>  at scala.Option.foreach(Option.scala:407)
>  at org.apache.spark.repl.Main$.doMain(Main.scala:79)
>  at org.apache.spark.repl.Main$.main(Main.scala:58)
>  at org.apache.spark.repl.Main.main(Main.scala)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498) 
>  at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>  at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
>  at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
>  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
>  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90) 
>  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
>  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
>  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
> I haven't found a way to reproduce it reliably, but the likelihood seems to increase the longer the spark-shell session stays open.
> A possible way to reproduce it: start ./bin/spark-shell, wait about 5 minutes, then quit with :q or :quit.
>  
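> Not the actual SPARK-30385 fix, only an illustrative sketch: one way to keep this benign shutdown-time IOException from propagating out of WebUI.stop() would be to swallow it when stopping the Jetty server (the helper name below is hypothetical):
> {code:java}
> // Hypothetical helper -- NOT the SPARK-30385 patch, only a sketch.
> // Stops a Jetty Server and logs (rather than rethrows) an IOException,
> // which can surface when the server socket channel is already being
> // torn down while the JVM shuts down.
> import java.io.IOException
> import org.eclipse.jetty.server.Server
>
> object SafeJettyStop {
>   def stopQuietly(server: Server): Unit =
>     try server.stop()
>     catch {
>       case e: IOException =>
>         System.err.println(s"Ignoring IOException while stopping Jetty: ${e.getMessage}")
>     }
> }
> {code}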



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org