Posted to issues@spark.apache.org by "Jong Yoon Lee (JIRA)" <ji...@apache.org> on 2017/07/05 18:30:00 UTC

[jira] [Updated] (SPARK-21321) Spark very verbose on shutdown confusing users

     [ https://issues.apache.org/jira/browse/SPARK-21321?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jong Yoon Lee updated SPARK-21321:
----------------------------------
    Description: 
On shutdown, Spark can be very verbose and emit errors that confuse users.

If possible we should not print those out and just ignore them.
This happens more often with dynamic allocation enabled.

I am suggesting to change the log level when shutdown is in progress and the RPC connections are closed (RpcEnvStoppedException).
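As a minimal sketch of the proposed behavior (Spark core is Scala; this standalone Java illustration uses a stand-in `RpcEnvStoppedException` class and a hypothetical `logFailure` helper, not Spark's actual logging code): when the RPC environment has already been stopped during shutdown, demote the message to debug level instead of error.

```java
// Illustrative sketch only: RpcEnvStoppedException stands in for
// org.apache.spark.rpc.RpcEnvStoppedException, and logFailure mimics
// the kind of log-level guard being proposed.
public class ShutdownAwareLogging {
    static class RpcEnvStoppedException extends IllegalStateException {
        RpcEnvStoppedException() { super("RpcEnv already stopped."); }
    }

    static String logFailure(Throwable e, boolean shuttingDown) {
        if (e instanceof RpcEnvStoppedException && shuttingDown) {
            // Expected during shutdown and not actionable, so keep it quiet.
            return "DEBUG: ignoring RPC send failure during shutdown: " + e.getMessage();
        }
        // Anything else is still surfaced at error level.
        return "ERROR: " + e.getMessage();
    }

    public static void main(String[] args) {
        System.out.println(logFailure(new RpcEnvStoppedException(), true));
        System.out.println(logFailure(new RuntimeException("real failure"), false));
    }
}
```

The idea is that the same exception still logs at error level outside of shutdown, so genuine RPC failures remain visible.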

  was:
On shutdown, Spark can be very verbose and emit errors that confuse users.

If possible we should not print those out and just ignore them.
This happens more often with dynamic allocation enabled.


> Spark very verbose on shutdown confusing users
> ----------------------------------------------
>
>                 Key: SPARK-21321
>                 URL: https://issues.apache.org/jira/browse/SPARK-21321
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.1.1
>            Reporter: Jong Yoon Lee
>            Priority: Minor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> On shutdown, Spark can be very verbose and emit errors that confuse users.
> If possible we should not print those out and just ignore them.
> This happens more often with dynamic allocation enabled.
> I am suggesting to change the log level when shutdown is in progress and the RPC connections are closed (RpcEnvStoppedException).



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org