Posted to reviews@spark.apache.org by "warrenzhu25 (via GitHub)" <gi...@apache.org> on 2023/09/19 19:46:35 UTC

[GitHub] [spark] warrenzhu25 opened a new pull request, #42999: [SPARK-45217][CORE] Support change log level of specific package or class

warrenzhu25 opened a new pull request, #42999:
URL: https://github.com/apache/spark/pull/42999

   ### What changes were proposed in this pull request?
   Add `SparkContext.setLogLevel(loggerName: String, logLevel: String)` to support changing the log level of a specific package or class.
   
   ### Why are the changes needed?
   Currently, only the root logger's level can be changed; we may want to change the log level of a specific package or class.
   
   ### Does this PR introduce _any_ user-facing change?
   Yes, the following methods were added to `SparkContext`:
   ```
   def setLogLevel(loggerName: String, logLevel: String): Unit
   def getLoggerLevel(name: String): Option[Level]
   def removeLogger(name: String): Unit
   ```
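   
   For example, a user could make a single package verbose at runtime and later revert it (an illustrative sketch; the package name is arbitrary and `sc` is an active `SparkContext`):
   ```
   sc.setLogLevel("org.apache.spark.scheduler", "DEBUG")  // only this package becomes verbose
   sc.getLoggerLevel("org.apache.spark.scheduler")        // Some(DEBUG)
   sc.removeLogger("org.apache.spark.scheduler")          // fall back to the parent/root logger's level
   ```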
   
   ### How was this patch tested?
   Added tests in `UtilsSuite`.
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No
   




[GitHub] [spark] mridulm commented on pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "mridulm (via GitHub)" <gi...@apache.org>.
mridulm commented on PR #42999:
URL: https://github.com/apache/spark/pull/42999#issuecomment-1735830065

   +CC @srowen 




[GitHub] [spark] srowen commented on a diff in pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on code in PR #42999:
URL: https://github.com/apache/spark/pull/42999#discussion_r1337449084


##########
core/src/main/scala/org/apache/spark/SparkContext.scala:
##########
@@ -401,6 +402,23 @@ class SparkContext(config: SparkConf) extends Logging {
     }
   }
 
+  /** Change logLevel of specific package or class name.
+   *  This overrides any user-defined log settings.
+   *
+   * @param loggerName package or class name such as "org.apache.spark" or
+   *                   "org.apache.spark.SparkContext"
+   * @param logLevel The desired log level as a string.
+   *                 Valid log levels include: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
+   */

Review Comment:
   Need the "since 4.0.0" on the new methods
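   
   A minimal sketch of what that could look like (tag placement assumed to follow the usual Spark scaladoc convention):
   ```
   /**
    * Change the log level of a specific package or class.
    *
    * @since 4.0.0
    */
   def setLogLevel(loggerName: String, logLevel: String): Unit
   ```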



##########
core/src/main/scala/org/apache/spark/SparkContext.scala:
##########
@@ -401,6 +402,23 @@ class SparkContext(config: SparkConf) extends Logging {
     }
   }
 
+  /** Change logLevel of specific package or class name.
+   *  This overrides any user-defined log settings.
+   *
+   * @param loggerName package or class name such as "org.apache.spark" or
+   *                   "org.apache.spark.SparkContext"
+   * @param logLevel The desired log level as a string.
+   *                 Valid log levels include: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
+   */

Review Comment:
   We need this in Pyspark, no?





[GitHub] [spark] dongjoon-hyun commented on a diff in pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "dongjoon-hyun (via GitHub)" <gi...@apache.org>.
dongjoon-hyun commented on code in PR #42999:
URL: https://github.com/apache/spark/pull/42999#discussion_r1330789111


##########
core/src/main/scala/org/apache/spark/util/Utils.scala:
##########
@@ -1013,7 +1013,7 @@ private[spark] object Utils
    * In case of IPv6, getHostAddress may return '0:0:0:0:0:0:0:1'.
    */
   def localHostName(): String = {
-    addBracketsIfNeeded(customHostname.getOrElse(localIpAddress.getHostAddress))
+    addBracketsIfNeeded(customHostname.getOrElse(localIpAddress.getHostName))

Review Comment:
   This looks irrelevant to me. Why do you change this?





[GitHub] [spark] srowen commented on a diff in pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on code in PR #42999:
URL: https://github.com/apache/spark/pull/42999#discussion_r1339350784


##########
core/src/main/scala/org/apache/spark/util/Utils.scala:
##########
@@ -2323,31 +2323,75 @@ private[spark] object Utils
    * configure a new log4j level
    */
   def setLogLevel(l: Level): Unit = {
-    val (ctx, loggerConfig) = getLogContext
+    val (ctx, loggerConfig) = getLogContext()
     loggerConfig.setLevel(l)
     ctx.updateLoggers()
 
     // Setting threshold to null as rootLevel will define log level for spark-shell
     Logging.sparkShellThresholdLevel = null
   }
 
+  /**
+   * configure a new log4j level for specific package or class
+   *
+   * @param name package or class name such as "org.apache.spark" or
+   *             "org.apache.spark.SparkContext"
+   * @param level new level of [[org.apache.log4j.Level]]
+   */
+  def setLogLevel(name: String, level: Level): Unit = {

Review Comment:
   Oh, never mind, there were already two overloads? Then two new overloads make sense.
   But yeah, they should call each other in a logical way.





Re: [PR] [SPARK-45217][CORE] Support change log level of specific package or class [spark]

Posted by "github-actions[bot] (via GitHub)" <gi...@apache.org>.
github-actions[bot] commented on PR #42999:
URL: https://github.com/apache/spark/pull/42999#issuecomment-1890804433

   We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.
   If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!




[GitHub] [spark] warrenzhu25 commented on pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "warrenzhu25 (via GitHub)" <gi...@apache.org>.
warrenzhu25 commented on PR #42999:
URL: https://github.com/apache/spark/pull/42999#issuecomment-1736163664

   > Nah, let's address Python and R here
   
   Added.




[GitHub] [spark] warrenzhu25 commented on a diff in pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "warrenzhu25 (via GitHub)" <gi...@apache.org>.
warrenzhu25 commented on code in PR #42999:
URL: https://github.com/apache/spark/pull/42999#discussion_r1339338775


##########
core/src/main/scala/org/apache/spark/util/Utils.scala:
##########
@@ -2323,31 +2323,75 @@ private[spark] object Utils
    * configure a new log4j level
    */
   def setLogLevel(l: Level): Unit = {
-    val (ctx, loggerConfig) = getLogContext
+    val (ctx, loggerConfig) = getLogContext()
     loggerConfig.setLevel(l)
     ctx.updateLoggers()
 
     // Setting threshold to null as rootLevel will define log level for spark-shell
     Logging.sparkShellThresholdLevel = null
   }
 
+  /**
+   * configure a new log4j level for specific package or class
+   *
+   * @param name package or class name such as "org.apache.spark" or
+   *             "org.apache.spark.SparkContext"
+   * @param level new level of [[org.apache.log4j.Level]]
+   */
+  def setLogLevel(name: String, level: Level): Unit = {
+    val (ctx, loggerConfig) = getLogContext(name)
+    if (loggerConfig != null) {
+      loggerConfig.setLevel(level)
+      logInfo(s"Logger ${loggerConfig.getName} level changed into $level")
+    } else {
+      val newLoggerConfig = new LoggerConfig(name, level, true)
+      ctx.getConfiguration.addLogger(name, newLoggerConfig)
+      logInfo(s"Added new logger $name = $level")
+    }
+    ctx.updateLoggers()
+  }
+
+  /**
+   * get logger level for specific package or class
+   *
+   * @param name  package or class name such as "org.apache.spark" or
+   *              "org.apache.spark.SparkContext"
+   */
+  def getLoggerLevel(name: String): Option[Level] = {
+    val (ctx, loggerConfig) = getLogContext(name)
+    Option(loggerConfig).map(c => c.getLevel)
+  }
+
+  /**
+   * remove logger for specific package or class
+   *
+   * @param name package or class name such as "org.apache.spark" or
+   *             "org.apache.spark.SparkContext"
+   */
+  def removeLogger(name: String): Unit = {

Review Comment:
   Currently, get/remove in `Utils` are only used by tests, but we might consider exposing them in `SparkContext` in the future.





[GitHub] [spark] srowen commented on pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on PR #42999:
URL: https://github.com/apache/spark/pull/42999#issuecomment-1735899356

   Nah, let's address Python and R here




[GitHub] [spark] warrenzhu25 commented on a diff in pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "warrenzhu25 (via GitHub)" <gi...@apache.org>.
warrenzhu25 commented on code in PR #42999:
URL: https://github.com/apache/spark/pull/42999#discussion_r1337472499


##########
core/src/main/scala/org/apache/spark/SparkContext.scala:
##########
@@ -401,6 +402,23 @@ class SparkContext(config: SparkConf) extends Logging {
     }
   }
 
+  /** Change logLevel of specific package or class name.
+   *  This overrides any user-defined log settings.
+   *
+   * @param loggerName package or class name such as "org.apache.spark" or
+   *                   "org.apache.spark.SparkContext"
+   * @param logLevel The desired log level as a string.
+   *                 Valid log levels include: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
+   */

Review Comment:
   Added `since`. I'll add this to PySpark in a follow-up PR.





[GitHub] [spark] srowen commented on a diff in pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on code in PR #42999:
URL: https://github.com/apache/spark/pull/42999#discussion_r1339350129


##########
core/src/main/scala/org/apache/spark/util/Utils.scala:
##########
@@ -2323,31 +2323,75 @@ private[spark] object Utils
    * configure a new log4j level
    */
   def setLogLevel(l: Level): Unit = {
-    val (ctx, loggerConfig) = getLogContext
+    val (ctx, loggerConfig) = getLogContext()
     loggerConfig.setLevel(l)
     ctx.updateLoggers()
 
     // Setting threshold to null as rootLevel will define log level for spark-shell
     Logging.sparkShellThresholdLevel = null
   }
 
+  /**
+   * configure a new log4j level for specific package or class
+   *
+   * @param name package or class name such as "org.apache.spark" or
+   *             "org.apache.spark.SparkContext"
+   * @param level new level of [[org.apache.log4j.Level]]
+   */
+  def setLogLevel(name: String, level: Level): Unit = {
+    val (ctx, loggerConfig) = getLogContext(name)
+    if (loggerConfig != null) {
+      loggerConfig.setLevel(level)
+      logInfo(s"Logger ${loggerConfig.getName} level changed into $level")
+    } else {
+      val newLoggerConfig = new LoggerConfig(name, level, true)
+      ctx.getConfiguration.addLogger(name, newLoggerConfig)
+      logInfo(s"Added new logger $name = $level")
+    }
+    ctx.updateLoggers()
+  }
+
+  /**
+   * get logger level for specific package or class
+   *
+   * @param name  package or class name such as "org.apache.spark" or
+   *              "org.apache.spark.SparkContext"
+   */
+  def getLoggerLevel(name: String): Option[Level] = {
+    val (ctx, loggerConfig) = getLogContext(name)
+    Option(loggerConfig).map(c => c.getLevel)
+  }
+
+  /**
+   * remove logger for specific package or class
+   *
+   * @param name package or class name such as "org.apache.spark" or
+   *             "org.apache.spark.SparkContext"
+   */
+  def removeLogger(name: String): Unit = {

Review Comment:
   They should be private to Spark then and just make sure they're annotated as for testing only
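   
   A hedged sketch of that shape (the object name is hypothetical and the body queries log4j2 directly; it is not the PR's code):
   ```
   package org.apache.spark.util
   
   import org.apache.logging.log4j.{Level, LogManager}
   import org.apache.logging.log4j.core.LoggerContext
   
   private[spark] object LoggingTestUtils {
     /** Exposed for testing only. */
     def getLoggerLevel(name: String): Option[Level] = {
       // Look up the named LoggerConfig in the active log4j2 configuration, if any.
       val ctx = LogManager.getContext(false).asInstanceOf[LoggerContext]
       Option(ctx.getConfiguration.getLoggers.get(name)).map(_.getLevel)
     }
   }
   ```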





Re: [PR] [SPARK-45217][CORE] Support change log level of specific package or class [spark]

Posted by "github-actions[bot] (via GitHub)" <gi...@apache.org>.
github-actions[bot] closed pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class
URL: https://github.com/apache/spark/pull/42999




[GitHub] [spark] warrenzhu25 commented on pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "warrenzhu25 (via GitHub)" <gi...@apache.org>.
warrenzhu25 commented on PR #42999:
URL: https://github.com/apache/spark/pull/42999#issuecomment-1726690587

   > Why don't we use `log4j2.properties`?
   
   Added use cases in the description.
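   
   For reference, the static alternative named in the question pins a per-package level in `log4j2.properties` at JVM startup (a sketch; the logger id and package are illustrative), which cannot be adjusted at runtime the way the proposed API allows:
   ```
   logger.scheduler.name = org.apache.spark.scheduler
   logger.scheduler.level = debug
   ```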




[GitHub] [spark] warrenzhu25 commented on a diff in pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "warrenzhu25 (via GitHub)" <gi...@apache.org>.
warrenzhu25 commented on code in PR #42999:
URL: https://github.com/apache/spark/pull/42999#discussion_r1339337836


##########
core/src/main/scala/org/apache/spark/util/Utils.scala:
##########
@@ -2323,31 +2323,75 @@ private[spark] object Utils
    * configure a new log4j level
    */
   def setLogLevel(l: Level): Unit = {
-    val (ctx, loggerConfig) = getLogContext
+    val (ctx, loggerConfig) = getLogContext()
     loggerConfig.setLevel(l)
     ctx.updateLoggers()
 
     // Setting threshold to null as rootLevel will define log level for spark-shell
     Logging.sparkShellThresholdLevel = null
   }
 
+  /**
+   * configure a new log4j level for specific package or class
+   *
+   * @param name package or class name such as "org.apache.spark" or
+   *             "org.apache.spark.SparkContext"
+   * @param level new level of [[org.apache.log4j.Level]]
+   */
+  def setLogLevel(name: String, level: Level): Unit = {

Review Comment:
   Do you want me to remove the old method `def setLogLevel(l: Level): Unit` and call the new method instead?





[GitHub] [spark] srowen commented on a diff in pull request #42999: [SPARK-45217][CORE] Support change log level of specific package or class

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on code in PR #42999:
URL: https://github.com/apache/spark/pull/42999#discussion_r1337714285


##########
core/src/main/scala/org/apache/spark/util/Utils.scala:
##########
@@ -2323,31 +2323,75 @@ private[spark] object Utils
    * configure a new log4j level
    */
   def setLogLevel(l: Level): Unit = {
-    val (ctx, loggerConfig) = getLogContext
+    val (ctx, loggerConfig) = getLogContext()
     loggerConfig.setLevel(l)
     ctx.updateLoggers()
 
     // Setting threshold to null as rootLevel will define log level for spark-shell
     Logging.sparkShellThresholdLevel = null
   }
 
+  /**
+   * configure a new log4j level for specific package or class
+   *
+   * @param name package or class name such as "org.apache.spark" or
+   *             "org.apache.spark.SparkContext"
+   * @param level new level of [[org.apache.log4j.Level]]
+   */
+  def setLogLevel(name: String, level: Level): Unit = {

Review Comment:
   I don't think we need this override, do we? The original method didn't support this.
   I'd also imagine that the old method would just call this one with some value representing the root logger.
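   
   A rough sketch of that delegation (assuming the name-based overload from this PR resolves the root logger's config; `LogManager.ROOT_LOGGER_NAME` is log4j2's name for the root logger):
   ```
   def setLogLevel(l: Level): Unit = {
     setLogLevel(org.apache.logging.log4j.LogManager.ROOT_LOGGER_NAME, l)
     // Setting threshold to null as rootLevel will define log level for spark-shell
     Logging.sparkShellThresholdLevel = null
   }
   ```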



##########
python/pyspark/context.py:
##########
@@ -534,6 +534,27 @@ def setLogLevel(self, logLevel: str) -> None:
         """
         self._jsc.setLogLevel(logLevel)
 
+    def setLogLevel(self, logName: str, logLevel: str) -> None:

Review Comment:
   Yeah, I mean, this is all you've exposed in R and Python, so not sure you should/need to expose more in Java/Scala



##########
core/src/main/scala/org/apache/spark/util/Utils.scala:
##########
@@ -2323,31 +2323,75 @@ private[spark] object Utils
    * configure a new log4j level
    */
   def setLogLevel(l: Level): Unit = {
-    val (ctx, loggerConfig) = getLogContext
+    val (ctx, loggerConfig) = getLogContext()
     loggerConfig.setLevel(l)
     ctx.updateLoggers()
 
     // Setting threshold to null as rootLevel will define log level for spark-shell
     Logging.sparkShellThresholdLevel = null
   }
 
+  /**
+   * configure a new log4j level for specific package or class
+   *
+   * @param name package or class name such as "org.apache.spark" or
+   *             "org.apache.spark.SparkContext"
+   * @param level new level of [[org.apache.log4j.Level]]
+   */
+  def setLogLevel(name: String, level: Level): Unit = {
+    val (ctx, loggerConfig) = getLogContext(name)
+    if (loggerConfig != null) {
+      loggerConfig.setLevel(level)
+      logInfo(s"Logger ${loggerConfig.getName} level changed into $level")
+    } else {
+      val newLoggerConfig = new LoggerConfig(name, level, true)
+      ctx.getConfiguration.addLogger(name, newLoggerConfig)
+      logInfo(s"Added new logger $name = $level")
+    }
+    ctx.updateLoggers()
+  }
+
+  /**
+   * get logger level for specific package or class
+   *
+   * @param name  package or class name such as "org.apache.spark" or
+   *              "org.apache.spark.SparkContext"
+   */
+  def getLoggerLevel(name: String): Option[Level] = {
+    val (ctx, loggerConfig) = getLogContext(name)
+    Option(loggerConfig).map(c => c.getLevel)
+  }
+
+  /**
+   * remove logger for specific package or class
+   *
+   * @param name package or class name such as "org.apache.spark" or
+   *             "org.apache.spark.SparkContext"
+   */
+  def removeLogger(name: String): Unit = {

Review Comment:
   Do you really need get/remove? not sure it's necessary





Re: [PR] [SPARK-45217][CORE] Support change log level of specific package or class [spark]

Posted by "warrenzhu25 (via GitHub)" <gi...@apache.org>.
warrenzhu25 commented on PR #42999:
URL: https://github.com/apache/spark/pull/42999#issuecomment-1747917240

   > The change looks about right, but there are some test failures, eg https://github.com/warrenzhu25/spark/actions/runs/6332926200/job/17202261106
   
   The example test should be skipped as I added `# doctest: +SKIP`. Any ideas why this happens?

