Posted to issues@spark.apache.org by "Jean-Baptiste Onofré (JIRA)" <ji...@apache.org> on 2015/12/02 14:23:11 UTC

[jira] [Commented] (SPARK-11065) IOException thrown at job submit shutdown

    [ https://issues.apache.org/jira/browse/SPARK-11065?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15035773#comment-15035773 ] 

Jean-Baptiste Onofré commented on SPARK-11065:
----------------------------------------------

I just tested this and it's now fixed.

Can you close this Jira?

> IOException thrown at job submit shutdown
> -----------------------------------------
>
>                 Key: SPARK-11065
>                 URL: https://issues.apache.org/jira/browse/SPARK-11065
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.6.0
>            Reporter: Jean-Baptiste Onofré
>            Priority: Minor
>             Fix For: 1.6.0
>
>
> When submitting a job (for instance the JavaWordCount example), even if the job completes successfully, we can see the following at the end of execution:
> {code}
> checkForCorruptJournalFiles="true": 1
> 15/10/12 16:31:12 INFO SparkUI: Stopped Spark web UI at http://192.168.134.10:4040
> 15/10/12 16:31:12 INFO DAGScheduler: Stopping DAGScheduler
> 15/10/12 16:31:12 INFO SparkDeploySchedulerBackend: Shutting down all executors
> 15/10/12 16:31:12 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
> 15/10/12 16:31:12 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
> 15/10/12 16:31:12 INFO MemoryStore: MemoryStore cleared
> 15/10/12 16:31:12 INFO BlockManager: BlockManager stopped
> 15/10/12 16:31:12 INFO BlockManagerMaster: BlockManagerMaster stopped
> 15/10/12 16:31:12 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
> 15/10/12 16:31:12 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from localhost/127.0.0.1:7077 is closed
> 15/10/12 16:31:12 ERROR NettyRpcEnv: Exception when sending RequestMessage(192.168.134.10:40548,NettyRpcEndpointRef(spark://Master@localhost:7077),UnregisterApplication(app-20151012163109-0000),false)
> java.io.IOException: Connection from localhost/127.0.0.1:7077 closed
>         at org.apache.spark.network.client.TransportResponseHandler.channelUnregistered(TransportResponseHandler.java:104)
>         at org.apache.spark.network.server.TransportChannelHandler.channelUnregistered(TransportChannelHandler.java:91)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>         at io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>         at io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>         at io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>         at io.netty.channel.DefaultChannelPipeline.fireChannelUnregistered(DefaultChannelPipeline.java:739)
>         at io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:659)
>         at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>         at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>         at java.lang.Thread.run(Thread.java:745)
> 15/10/12 16:31:12 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
> 15/10/12 16:31:12 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
> 15/10/12 16:31:12 INFO SparkContext: Successfully stopped SparkContext
> 15/10/12 16:31:12 INFO ShutdownHookManager: Shutdown hook called
> 15/10/12 16:31:12 INFO ShutdownHookManager: Deleting directory /tmp/spark-81bc4324-1268-4e54-bdd2-f7a2a36dafd4
> {code}
> I'm going to investigate this and will submit a PR.
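
The failure mode in the stack trace above is a write on a connection whose remote end (the master at 7077) has already gone away: the shutdown hook still tries to send UnregisterApplication after the channel is closed. A minimal sketch of that situation with plain java.net sockets (an illustration only, not Spark's actual NettyRpcEnv code; class name WriteAfterClose is made up):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Sketch: once the peer has closed its end, continued writes eventually
// surface an IOException, analogous to NettyRpcEnv sending a message
// after the master connection is already closed.
public class WriteAfterClose {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0);
             Socket client = new Socket("localhost", server.getLocalPort());
             Socket accepted = server.accept()) {
            accepted.close(); // "remote" side shuts down first
            OutputStream out = client.getOutputStream();
            try {
                // Early writes may land in the OS send buffer; once the
                // peer's RST arrives, a subsequent write throws.
                for (int i = 0; i < 100; i++) {
                    out.write(new byte[1024]);
                    out.flush();
                    Thread.sleep(10);
                }
                System.out.println("no error");
            } catch (IOException e) {
                System.out.println("IOException raised: " + e.getMessage());
            }
        }
    }
}
```

The usual fix for this class of problem is ordering: send (and await) the unregister message before tearing the connection down, or treat an IOException during shutdown as expected and log it at a lower level.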



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org