Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/11/03 18:11:31 UTC

[GitHub] [spark] xkrogen commented on a change in pull request #30096: [SPARK-33185][YARN][WIP] Set up yarn.Client to print direct links to driver stdout/stderr

xkrogen commented on a change in pull request #30096:
URL: https://github.com/apache/spark/pull/30096#discussion_r516863291



##########
File path: resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
##########
@@ -1172,6 +1181,31 @@ private[spark] class Client(
     }.mkString("")
   }
 
+  private def getDriverLogsLink(appId: String): Option[(String, String)] = {
+    val baseRmUrl = WebAppUtils.getRMWebAppURLWithScheme(hadoopConf)
+    val response = ClientBuilder.newClient()
+      .target(baseRmUrl)
+      .path("ws").path("v1").path("cluster").path("apps").path(appId).path("appattempts")
+      .request(MediaType.APPLICATION_JSON)
+      .get()
+    response.getStatusInfo.getFamily match {
+      case Family.SUCCESSFUL =>
+        val objectMapper = new ObjectMapper()
+        // If an expected field is missing anywhere along the way, `path` returns a
+        // MissingNode, which allows the chain to continue safely. Its `elements()`
+        // call will then be empty, and None will be returned.
+        objectMapper.readTree(response.readEntity(classOf[String]))
+            .path("appAttempts").path("appAttempt")

Review comment:
       Thanks, will make sure to fix this and double-check any style issues when I prepare a final PR.
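For illustration, the MissingNode behavior discussed in the diff can be sketched as a small self-contained Scala snippet. The JSON below mirrors the shape of the YARN RM `appattempts` REST response implied by the `path("appAttempts").path("appAttempt")` chain; the `logsLink` value and the object/method names are made up for this example and are not part of the PR:

```scala
import com.fasterxml.jackson.databind.ObjectMapper

object MissingNodeSketch {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper()
    // Hypothetical response shaped like the RM's appattempts endpoint.
    val json =
      """{"appAttempts": {"appAttempt": [{"logsLink": "http://example-nm:8042/node/containerlogs/..."}]}}"""
    // Jackson's path() never throws or returns null for an absent field:
    // it returns a MissingNode, so the whole chain is safe to call.
    val attempts = mapper.readTree(json).path("appAttempts").path("appAttempt")
    val it = attempts.elements()
    // On a malformed or empty response, elements() is empty and we fall through to None.
    val link = if (it.hasNext) Some(it.next().path("logsLink").asText()) else None
    println(link)
  }
}
```

This is only a sketch of the safe-chaining pattern under the assumptions stated above, not the method in the PR, which additionally issues the HTTP request and checks the status family before parsing.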




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org