Posted to reviews@spark.apache.org by NiharS <gi...@git.apache.org> on 2018/10/08 22:20:16 UTC
[GitHub] spark pull request #22504: [SPARK-25118][Submit] Persist Driver Logs in Yarn...
Github user NiharS commented on a diff in the pull request:
https://github.com/apache/spark/pull/22504#discussion_r223512824
--- Diff: core/src/main/scala/org/apache/spark/internal/Logging.scala ---
@@ -192,7 +211,15 @@ private[spark] object Logging {
defaultSparkLog4jConfig = false
LogManager.resetConfiguration()
} else {
- LogManager.getRootLogger().setLevel(defaultRootLevel)
+ val rootLogger = LogManager.getRootLogger()
+ rootLogger.setLevel(defaultRootLevel)
+ rootLogger.getAllAppenders().asScala.foreach { tmp =>
--- End diff --
Any reason not to just iterate through `consoleAppenderToThreshold.keys()`? Not a huge deal, but it cuts down on a bit of work.
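For context, a minimal sketch of what the suggestion amounts to. This assumes `consoleAppenderToThreshold` is the map the PR introduces, keyed by the `ConsoleAppender`s whose thresholds were overridden; the exact surrounding code is from the PR under review and the snippet below only illustrates the pattern of restoring from the saved map instead of scanning every appender on the root logger:

```scala
import java.util.concurrent.ConcurrentHashMap
import org.apache.log4j.{ConsoleAppender, Priority}
import scala.collection.JavaConverters._

// Hypothetical stand-in for the map saved when the log level was first
// overridden: console appender -> its original threshold.
val consoleAppenderToThreshold = new ConcurrentHashMap[ConsoleAppender, Priority]()

// Restore pass: iterate only the appenders that were actually modified,
// rather than walking rootLogger.getAllAppenders() and checking membership.
// Iterating entries (rather than keys()) also yields the saved threshold
// directly, avoiding a second map lookup per appender.
consoleAppenderToThreshold.asScala.foreach { case (appender, threshold) =>
  appender.setThreshold(threshold)
}
```

The trade-off is small either way, as the comment notes: the root logger typically has few appenders, but the map-driven loop touches only what was changed.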
---
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org