Posted to reviews@spark.apache.org by vanzin <gi...@git.apache.org> on 2018/10/04 22:03:16 UTC

[GitHub] spark pull request #22571: [SPARK-25392][Spark Job History]Inconsistent beha...

Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22571#discussion_r222841579
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -2434,8 +2434,15 @@ class SparkContext(config: SparkConf) extends Logging {
           val schedulingMode = getSchedulingMode.toString
           val addedJarPaths = addedJars.keys.toSeq
           val addedFilePaths = addedFiles.keys.toSeq
    +      // SPARK-25392 pool Information should be stored in the event
    +      val poolInformation = getAllPools.map { it =>
    +        val xmlString = ("<pool><item PoolName=\"%s\" MinimumShare=\"%d\"" +
    --- End diff --
    
    Hmm, I really don't like this kind of ad-hoc serialization format.
    
    Instead, create a proper pool class for the REST API, and use that class to transfer pool information around.
    
    Because this is a legacy event (i.e. not serialized by Jackson), you'll need some custom serialization in `JsonProtocol.scala`, but that's still cleaner than what you have in this change.
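    
    To make the suggestion concrete, here is a hedged sketch of what a proper pool class plus hand-written JSON serialization might look like. All names (`PoolInfo`, `poolInfoToJson`, the field set) are illustrative assumptions, not the actual Spark API; the json4s DSL is used because `JsonProtocol.scala` builds `JValue`s by hand for legacy events.
    
    ```scala
    import org.json4s.JValue
    import org.json4s.JsonDSL._
    
    // Hypothetical REST API representation of a scheduler pool.
    // Field names are assumptions for illustration only.
    case class PoolInfo(
        name: String,
        minShare: Int,
        weight: Int,
        schedulingMode: String)
    
    // Custom serialization in the style of JsonProtocol.scala: build the
    // JValue explicitly instead of concatenating an ad-hoc XML string.
    def poolInfoToJson(pool: PoolInfo): JValue = {
      ("Pool Name" -> pool.name) ~
      ("Minimum Share" -> pool.minShare) ~
      ("Weight" -> pool.weight) ~
      ("Scheduling Mode" -> pool.schedulingMode)
    }
    ```
    
    A matching `poolInfoFromJson` in the same file would then round-trip the event, keeping both directions of the custom serialization in one place.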


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org