Posted to reviews@spark.apache.org by devaraj-kavali <gi...@git.apache.org> on 2018/08/04 00:54:00 UTC

[GitHub] spark pull request #21996: [SPARK-24888][CORE] spark-submit --master spark:/...

GitHub user devaraj-kavali opened a pull request:

    https://github.com/apache/spark/pull/21996

    [SPARK-24888][CORE] spark-submit --master spark://host:port --status driver-id does not work

    ## What changes were proposed in this pull request?
    
    In `SparkSubmit.scala` (`val uninitLog = initializeLogIfNecessary(true, silent = true)`) -> `Logging.scala` (`val replLevel = Option(replLogger.getLevel()).getOrElse(Level.WARN)`), the log level for `rootLogger` is overridden to `WARN`, but the responses to the driver status and kill commands are logged at the `INFO` level, so nothing is printed on the console for the status and kill commands. This PR overrides `logInfo()` for the `RestSubmissionClient` used by these commands and redirects the message to `printStream`.
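
    The following is a minimal standalone sketch (added for illustration, not part of the PR) of why the response text disappears: the effective level defaults to WARN, so anything logged at INFO is dropped. The object and values here are simplified stand-ins, not the real Spark or log4j types.
    
    ```scala
    // Illustration only: simplified stand-ins for the real Logging/log4j machinery.
    object LogLevelDemo {
      sealed abstract class Level(val severity: Int)
      case object INFO extends Level(1)
      case object WARN extends Level(2)
    
      // Mirrors Logging.scala falling back to WARN when the REPL logger has no
      // explicit level: Option(replLogger.getLevel()).getOrElse(Level.WARN)
      val effectiveLevel: Level = WARN
    
      def logInfo(msg: => String): Unit =
        if (INFO.severity >= effectiveLevel.severity) println(msg)
    
      def main(args: Array[String]): Unit = {
        // Dropped, because INFO (1) is below WARN (2): the user sees nothing.
        logInfo("Server responded with SubmissionStatusResponse: ...")
      }
    }
    ```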
    
    
    ## How was this patch tested?
    
    I verified this manually by running the status and kill commands; these are the results with and without the PR change.
    
    - Without the PR change
    
    ```
    [user1@user1-work-pc bin]$ ./spark-submit --master spark://user1-work-pc:6066 --status driver-20180803165641-0000
    [user1@user1-work-pc bin]$ ./spark-submit --master spark://user1-work-pc:6066 --kill driver-20180803165641-0000
    ```
    
    
    - With the PR change
    
    ```
    [user1@user1-work-pc bin]$ ./spark-submit --master spark://user1-work-pc:6066 --kill driver-20180803165641-0000
    Submitting a request to kill submission driver-20180803165641-0000 in spark://user1-work-pc:6066.
    Server responded with KillSubmissionResponse:
    {
      "action" : "KillSubmissionResponse",
      "message" : "Driver driver-20180803165641-0000 has already finished or does not exist",
      "serverSparkVersion" : "2.4.0-SNAPSHOT",
      "submissionId" : "driver-20180803165641-0000",
      "success" : false
    }
    
    [user1@user1-work-pc bin]$ ./spark-submit --master spark://user1-work-pc:6066 --status driver-20180803165641-0000
    Submitting a request for the status of submission driver-20180803165641-0000 in spark://user1-work-pc:6066.
    Server responded with SubmissionStatusResponse:
    {
      "action" : "SubmissionStatusResponse",
      "driverState" : "FINISHED",
      "serverSparkVersion" : "2.4.0-SNAPSHOT",
      "submissionId" : "driver-20180803165641-0000",
      "success" : true,
      "workerHostPort" : "xx.x.xx.xxx:42040",
      "workerId" : "worker-20180803165615-10.3.66.149-42040"
    }
    ```


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/devaraj-kavali/spark SPARK-24888

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/21996.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #21996
    
----
commit 1e8000ffacabac742d16efee72ec1e421225a272
Author: Devaraj K <de...@...>
Date:   2018-08-04T00:47:23Z

    [SPARK-24888][CORE] spark-submit --master spark://host:port --status
    driver-id does not work

----


---



[GitHub] spark issue #21996: [SPARK-24888][CORE] spark-submit --master spark://host:p...

Posted by devaraj-kavali <gi...@git.apache.org>.
Github user devaraj-kavali commented on the issue:

    https://github.com/apache/spark/pull/21996
  
    > I'm not sure how the PR title is related to the change here?
    
    From a user's perspective, when they see no output for the status/kill commands, they would probably assume the commands are not working, but the actual problem is that the command response message is not shown to the user. I used the JIRA title as the PR title here.


---



[GitHub] spark issue #21996: [SPARK-24888][CORE] spark-submit --master spark://host:p...

Posted by devaraj-kavali <gi...@git.apache.org>.
Github user devaraj-kavali commented on the issue:

    https://github.com/apache/spark/pull/21996
  
    @felixcheung can you check this PR? Please let me know if there is anything that needs to be updated.


---



[GitHub] spark issue #21996: [SPARK-24888][CORE] spark-submit --master spark://host:p...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/21996
  
    Can one of the admins verify this patch?


---





[GitHub] spark pull request #21996: [SPARK-24888][CORE] spark-submit --master spark:/...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21996#discussion_r207717051
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -98,17 +98,24 @@ private[spark] class SparkSubmit extends Logging {
        * Kill an existing submission using the REST protocol. Standalone and Mesos cluster mode only.
        */
       private def kill(args: SparkSubmitArguments): Unit = {
    -    new RestSubmissionClient(args.master)
    -      .killSubmission(args.submissionToKill)
    +    createRestSubmissionClient(args).killSubmission(args.submissionToKill)
       }
     
       /**
        * Request the status of an existing submission using the REST protocol.
        * Standalone and Mesos cluster mode only.
        */
       private def requestStatus(args: SparkSubmitArguments): Unit = {
    -    new RestSubmissionClient(args.master)
    -      .requestSubmissionStatus(args.submissionToRequestStatusFor)
    +    createRestSubmissionClient(args).requestSubmissionStatus(args.submissionToRequestStatusFor)
    +  }
    +
    +  /**
    +   * Creates RestSubmissionClient with overridden logInfo()
    +   */
    +  private def createRestSubmissionClient(args: SparkSubmitArguments): RestSubmissionClient = {
    +    new RestSubmissionClient(args.master) {
    +      override protected def logInfo(msg: => String): Unit = printMessage(msg)
    --- End diff --
    
    this is not necessarily always the case - the user can configure the log level easily?


---



[GitHub] spark pull request #21996: [SPARK-24888][CORE] spark-submit --master spark:/...

Posted by devaraj-kavali <gi...@git.apache.org>.
Github user devaraj-kavali commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21996#discussion_r207763630
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -98,17 +98,24 @@ private[spark] class SparkSubmit extends Logging {
        * Kill an existing submission using the REST protocol. Standalone and Mesos cluster mode only.
        */
       private def kill(args: SparkSubmitArguments): Unit = {
    -    new RestSubmissionClient(args.master)
    -      .killSubmission(args.submissionToKill)
    +    createRestSubmissionClient(args).killSubmission(args.submissionToKill)
       }
     
       /**
        * Request the status of an existing submission using the REST protocol.
        * Standalone and Mesos cluster mode only.
        */
       private def requestStatus(args: SparkSubmitArguments): Unit = {
    -    new RestSubmissionClient(args.master)
    -      .requestSubmissionStatus(args.submissionToRequestStatusFor)
    +    createRestSubmissionClient(args).requestSubmissionStatus(args.submissionToRequestStatusFor)
    +  }
    +
    +  /**
    +   * Creates RestSubmissionClient with overridden logInfo()
    +   */
    +  private def createRestSubmissionClient(args: SparkSubmitArguments): RestSubmissionClient = {
    +    new RestSubmissionClient(args.master) {
    +      override protected def logInfo(msg: => String): Unit = printMessage(msg)
    --- End diff --
    
    I agree, the user can change the log level. But if users configure the log level as WARN or above (WARN is the default), they can't see any status or update from the status and kill commands, and I don't think we can expect users to set the log level to INFO just to get output from these commands. Please let me know if you have any thoughts on a better fix; I can make the changes. Thanks


---





[GitHub] spark issue #21996: [SPARK-24888][CORE] spark-submit --master spark://host:p...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/21996
  
    I think we generally describe the change in the PR title; what the user sees can go in the JIRA title.


---



[GitHub] spark pull request #21996: [SPARK-24888][CORE] spark-submit --master spark:/...

Posted by devaraj-kavali <gi...@git.apache.org>.
Github user devaraj-kavali commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21996#discussion_r207702588
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -98,17 +98,24 @@ private[spark] class SparkSubmit extends Logging {
        * Kill an existing submission using the REST protocol. Standalone and Mesos cluster mode only.
        */
       private def kill(args: SparkSubmitArguments): Unit = {
    -    new RestSubmissionClient(args.master)
    -      .killSubmission(args.submissionToKill)
    +    createRestSubmissionClient(args).killSubmission(args.submissionToKill)
       }
     
       /**
        * Request the status of an existing submission using the REST protocol.
        * Standalone and Mesos cluster mode only.
        */
       private def requestStatus(args: SparkSubmitArguments): Unit = {
    -    new RestSubmissionClient(args.master)
    -      .requestSubmissionStatus(args.submissionToRequestStatusFor)
    +    createRestSubmissionClient(args).requestSubmissionStatus(args.submissionToRequestStatusFor)
    +  }
    +
    +  /**
    +   * Creates RestSubmissionClient with overridden logInfo()
    +   */
    +  private def createRestSubmissionClient(args: SparkSubmitArguments): RestSubmissionClient = {
    +    new RestSubmissionClient(args.master) {
    +      override protected def logInfo(msg: => String): Unit = printMessage(msg)
    --- End diff --
    
    When `isInterpreter = true` (REPL shell), the logger is initialized and working, but the log level is set to WARN, so the `RestSubmissionClient` `logInfo` messages that carry the response are not shown. This PR change only affects the status/kill commands when `isInterpreter = true`, and doesn't change any other behavior.
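
    For reference, a minimal standalone sketch (not the actual Spark classes) of the approach in the diff above: only the client created for the status/kill path gets the overridden `logInfo`, which writes to a print stream and therefore bypasses the configured log level, while everything else keeps the normal logger. The class and helper names here are simplified stand-ins.
    
    ```scala
    // Sketch with simplified stand-ins; only the shape of logInfo matches the real code.
    class RestClient {
      // Stand-in for the Logging trait: at WARN, INFO messages are dropped.
      protected def logInfo(msg: => String): Unit = ()
    
      def requestSubmissionStatus(id: String): Unit =
        logInfo(s"Submitting a request for the status of submission $id")
    }
    
    object StatusCommandSketch {
      private def printMessage(msg: String): Unit = Console.err.println(msg)
    
      // Anonymous subclass used only for the status/kill commands:
      // the response text always reaches the user, regardless of log level.
      private def createRestClient(): RestClient = new RestClient {
        override protected def logInfo(msg: => String): Unit = printMessage(msg)
      }
    
      def main(args: Array[String]): Unit =
        createRestClient().requestSubmissionStatus("driver-20180803165641-0000")
    }
    ```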


---



[GitHub] spark pull request #21996: [SPARK-24888][CORE] spark-submit --master spark:/...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21996#discussion_r207702200
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -98,17 +98,24 @@ private[spark] class SparkSubmit extends Logging {
        * Kill an existing submission using the REST protocol. Standalone and Mesos cluster mode only.
        */
       private def kill(args: SparkSubmitArguments): Unit = {
    -    new RestSubmissionClient(args.master)
    -      .killSubmission(args.submissionToKill)
    +    createRestSubmissionClient(args).killSubmission(args.submissionToKill)
       }
     
       /**
        * Request the status of an existing submission using the REST protocol.
        * Standalone and Mesos cluster mode only.
        */
       private def requestStatus(args: SparkSubmitArguments): Unit = {
    -    new RestSubmissionClient(args.master)
    -      .requestSubmissionStatus(args.submissionToRequestStatusFor)
    +    createRestSubmissionClient(args).requestSubmissionStatus(args.submissionToRequestStatusFor)
    +  }
    +
    +  /**
    +   * Creates RestSubmissionClient with overridden logInfo()
    +   */
    +  private def createRestSubmissionClient(args: SparkSubmitArguments): RestSubmissionClient = {
    +    new RestSubmissionClient(args.master) {
    +      override protected def logInfo(msg: => String): Unit = printMessage(msg)
    --- End diff --
    
    doesn't this change the behavior even when the logger is initialized/working?
    



---
