Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/10/12 06:32:06 UTC

[GitHub] [spark] pan3793 commented on a diff in pull request #38205: [SPARK-40747][CORE] Support setting driver log url using env vars on other resource managers

pan3793 commented on code in PR #38205:
URL: https://github.com/apache/spark/pull/38205#discussion_r993046862


##########
core/src/main/scala/org/apache/spark/scheduler/SchedulerBackend.scala:
##########
@@ -73,7 +75,12 @@ private[spark] trait SchedulerBackend {
    * Executors tab for the driver.
    * @return Map containing the log names and their respective URLs
    */
-  def getDriverLogUrls: Option[Map[String, String]] = None
+  def getDriverLogUrls: Option[Map[String, String]] = {

Review Comment:
   It makes sense if we want to keep YARN as-is; another direction is to make YARN support it as well, so that it works on all resource managers.
   
   The pseudo-code would look like:
   ```
   override def getDriverLogUrls: Option[Map[String, String]] = {
     // Merge the YARN-derived log URLs with whatever the base implementation
     // (e.g. the new env-var based mechanism) returns.
     Some(
       YarnContainerInfoHelper.getLogUrls(sc.hadoopConfiguration, container = None).getOrElse(Map.empty) ++
         super.getDriverLogUrls.getOrElse(Map.empty))
   }
   ```
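   
   For context, here is a minimal sketch of how a base-trait default could derive driver log URLs from environment variables. This is an illustration only, not the PR's actual implementation; the env-var prefix `SPARK_DRIVER_LOG_URL_` and the filtering logic are assumptions.
   
   ```
   import java.util.Locale
   
   // Illustrative sketch only, not the PR's actual implementation.
   // Assumes driver log URLs are published as env vars such as
   // SPARK_DRIVER_LOG_URL_STDOUT=https://host/logs/stdout
   object DriverLogUrlsFromEnv {
     private val Prefix = "SPARK_DRIVER_LOG_URL_"
   
     def getDriverLogUrls: Option[Map[String, String]] = {
       // "SPARK_DRIVER_LOG_URL_STDOUT" becomes the key "stdout"; the value is kept as the URL.
       val urls = sys.env.collect {
         case (key, value) if key.startsWith(Prefix) =>
           key.stripPrefix(Prefix).toLowerCase(Locale.ROOT) -> value
       }
       if (urls.nonEmpty) Some(urls) else None
     }
   }
   ```
   
   With a default along these lines in `SchedulerBackend`, the YARN override above could merge both sources, which is the direction suggested in this comment.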



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

