Posted to reviews@spark.apache.org by kalvinnchau <gi...@git.apache.org> on 2017/03/23 23:30:03 UTC

[GitHub] spark pull request #17404: [SPARK-20078] [MESOS] Mesos executor configurabil...

GitHub user kalvinnchau opened a pull request:

    https://github.com/apache/spark/pull/17404

    [SPARK-20078] [MESOS] Mesos executor configurability for task name and labels

    ## What changes were proposed in this pull request?
    
    Adding configurable mesos executor names and labels using `spark.mesos.task.name` and `spark.mesos.task.labels`.
    
    Labels are specified in the form `k1:v1,k2:v2`.
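    As an illustrative sketch only (this is not the actual patch code, and `parseLabels` is a hypothetical helper), a `k1:v1,k2:v2` label string could be parsed along these lines, silently dropping malformed entries:

    ```scala
    // Hypothetical sketch: parse a "k1:v1,k2:v2"-style label string into
    // key/value pairs, ignoring malformed entries (e.g. "k1" or "k1:").
    object LabelParser {
      def parseLabels(spec: String): Seq[(String, String)] =
        spec.split(",").toSeq
          .map(_.split(":"))
          .collect { case Array(k, v) if k.nonEmpty && v.nonEmpty => (k, v) }
    }
    ```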
    
    @mgummelt 
    
    ## How was this patch tested?
    
    Added unit tests verifying that labels are added correctly and that malformed labels are ignored, plus a test for the executor task name.
    
    Tested with: `./build/sbt -Pmesos mesos/test`
    
    Please review http://spark.apache.org/contributing.html before opening a pull request.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/kalvinnchau/spark mesos-config

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/17404.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #17404
    
----
commit 84656e3b74058f8e736cba880bf522109a77b0c6
Author: Kalvin Chau <ka...@viasat.com>
Date:   2017-03-23T23:16:41Z

    add configurable mesos.task.name and mesos.task.labels

commit bc57ba0c59d3f997f2df9859077b72a3c3f96b46
Author: Kalvin Chau <ka...@viasat.com>
Date:   2017-03-23T23:27:48Z

    add test to verify incorrect labels are not added

commit 7bf49f8c19bfc76e802334d40a5854bb915aa05f
Author: Kalvin Chau <ka...@viasat.com>
Date:   2017-03-23T23:29:00Z

    fix typo

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark issue #17404: [SPARK-20078] [MESOS] Mesos executor configurability for...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/17404
  
    **[Test build #3609 has started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3609/testReport)** for PR 17404 at commit [`dc5974c`](https://github.com/apache/spark/commit/dc5974cafc7227140a6a924f55c51eca08479a8b).




[GitHub] spark issue #17404: [SPARK-20078] [MESOS] Mesos executor configurability for...

Posted by mgummelt <gi...@git.apache.org>.
Github user mgummelt commented on the issue:

    https://github.com/apache/spark/pull/17404
  
    RE: task names.  Great, so let's drop the configuration and just use `spark.app.name`.
    
    RE: labels.  I don't understand why you can't link tasks and drivers.  The driver is a framework.  Its framework_id is not Marathon's framework_id.  It has its own framework_id, which is in fact configurable via `spark.mesos.driver.frameworkId`.
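    For illustration only (the master URL, framework id, and jar name below are made up), that framework id can be pinned at submission time with a standard `--conf` flag:

    ```shell
    # Hypothetical example: pin the driver's Mesos framework id so that
    # executor tasks can be correlated back to this driver externally.
    spark-submit \
      --master mesos://zk://zk1:2181/mesos \
      --conf spark.mesos.driver.frameworkId=my-streaming-job \
      my_app.jar
    ```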




[GitHub] spark pull request #17404: [SPARK-20078] [MESOS] Mesos executor configurabil...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/17404




[GitHub] spark issue #17404: [SPARK-20078] [MESOS] Mesos executor configurability for...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/17404
  
    **[Test build #3609 has finished](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3609/testReport)** for PR 17404 at commit [`dc5974c`](https://github.com/apache/spark/commit/dc5974cafc7227140a6a924f55c51eca08479a8b).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #17404: [SPARK-20078] [MESOS] Mesos executor configurability for...

Posted by mgummelt <gi...@git.apache.org>.
Github user mgummelt commented on the issue:

    https://github.com/apache/spark/pull/17404
  
    I agree that "Task X" is a bad naming scheme, but I'm not convinced we need to make it configurable.  What if we just changed it to "<driver-name> X", so that tasks are distinguishable across drivers?
    
    Also, what problem are you trying to solve by making labels configurable?  Even if we decide that's a good idea, please make a new JIRA and PR for that.  It's a different task.




[GitHub] spark pull request #17404: [SPARK-20078] [MESOS] Mesos executor configurabil...

Posted by kalvinnchau <gi...@git.apache.org>.
Github user kalvinnchau commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17404#discussion_r107997084
  
    --- Diff: resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosCoarseGrainedSchedulerBackendSuite.scala ---
    @@ -464,6 +464,17 @@ class MesosCoarseGrainedSchedulerBackendSuite extends SparkFunSuite
         assert(!uris.asScala.head.getCache)
       }
     
    +  test("mesos sets configurable task name") {
    +    setBackend()
    +
    +    val offers = List(Resources(backend.executorMemory(sc), 1))
    +    offerResources(offers)
    +    val launchedTasks = verifyTaskLaunched(driver, "o1")
    +
    +    // Add " 0" to the taskName to match the executor number that is appended
    +    assert(launchedTasks.head.getName == s"${sc.appName} 0")
    --- End diff --
    
    done!




[GitHub] spark issue #17404: [SPARK-20078] [MESOS] Mesos executor configurability for...

Posted by kalvinnchau <gi...@git.apache.org>.
Github user kalvinnchau commented on the issue:

    https://github.com/apache/spark/pull/17404
  
    Yeah, the driver does have its own framework id, which we can find in the list of frameworks, but when we look at all the running tasks, each task's framework id is its parent's framework id.
    
    So the driver task shows up with Marathon's framework id, and the executors show up with the framework id of the driver. So when we get the driver logs and look at the ExecutorInfo, it doesn't have the newly registered framework id.
    
    Didn't realize it was configurable though.
    
    Is there a reason not to add label support?




[GitHub] spark issue #17404: [SPARK-20078] [MESOS] Mesos executor configurability for...

Posted by mgummelt <gi...@git.apache.org>.
Github user mgummelt commented on the issue:

    https://github.com/apache/spark/pull/17404
  
    Thanks for the contribution!
    
    @srowen LGTM.  Please merge.




[GitHub] spark issue #17404: [SPARK-20078] [MESOS] Mesos executor configurability for...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/17404
  
    Can one of the admins verify this patch?




[GitHub] spark pull request #17404: [SPARK-20078] [MESOS] Mesos executor configurabil...

Posted by mgummelt <gi...@git.apache.org>.
Github user mgummelt commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17404#discussion_r108000050
  
    --- Diff: resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosCoarseGrainedSchedulerBackendSuite.scala ---
    @@ -464,6 +464,17 @@ class MesosCoarseGrainedSchedulerBackendSuite extends SparkFunSuite
         assert(!uris.asScala.head.getCache)
       }
     
    +  test("mesos sets task name to spark.app.name") {
    +    setBackend()
    +
    +    val offers = List(Resources(backend.executorMemory(sc), 1))
    +    offerResources(offers)
    +    val launchedTasks = verifyTaskLaunched(driver, "o1")
    +
    +    // Add " 0" to the taskName to match the executor number that is appended
    +    assert(launchedTasks.head.getName == "test-mesos-dynamic-alloc 0")
    --- End diff --
    
    Thanks. As a side note, we need to update this task name, since it's not general to all the tests here.  But we can do that later.




[GitHub] spark pull request #17404: [SPARK-20078] [MESOS] Mesos executor configurabil...

Posted by mgummelt <gi...@git.apache.org>.
Github user mgummelt commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17404#discussion_r107995799
  
    --- Diff: resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosCoarseGrainedSchedulerBackendSuite.scala ---
    @@ -464,6 +464,17 @@ class MesosCoarseGrainedSchedulerBackendSuite extends SparkFunSuite
         assert(!uris.asScala.head.getCache)
       }
     
    +  test("mesos sets configurable task name") {
    +    setBackend()
    +
    +    val offers = List(Resources(backend.executorMemory(sc), 1))
    +    offerResources(offers)
    +    val launchedTasks = verifyTaskLaunched(driver, "o1")
    +
    +    // Add " 0" to the taskName to match the executor number that is appended
    +    assert(launchedTasks.head.getName == s"${sc.appName} 0")
    --- End diff --
    
    Let's use the actual literal string here, rather than a parameterized string, so we can directly assert the task name we want.




[GitHub] spark issue #17404: [SPARK-20078] [MESOS] Mesos executor configurability for...

Posted by kalvinnchau <gi...@git.apache.org>.
Github user kalvinnchau commented on the issue:

    https://github.com/apache/spark/pull/17404
  
    Agreed, I made it configurable just to avoid changing what the default was, but we were just configuring it with the app name anyway. Are you thinking of using `spark.app.name` in place of Task?
    
    We run a lot of our streaming jobs in client mode, and as we move to having more and more tenants in our cluster, we'd like to be able to label which job belongs to who, and we can use those labels to tag the logs coming from each job for centralized logging.
    
    We're trying to link the driver logs with the executor logs. When we query the Mesos API for all the running tasks/frameworks, there's no easy way to link drivers to executors. Looking at the framework_id doesn't work, since the driver's framework_id is its parent's (Marathon's, in this case). And the executors have no information through that API that we can use to link them to the driver.
     
    Should I make a new JIRA/PR and discuss it there? Or should we discuss it here before I open those up?




[GitHub] spark issue #17404: [SPARK-20078] [MESOS] Mesos executor configurability for...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/17404
  
    Merged to master




[GitHub] spark issue #17404: [SPARK-20078] [MESOS] Mesos executor configurability for...

Posted by mgummelt <gi...@git.apache.org>.
Github user mgummelt commented on the issue:

    https://github.com/apache/spark/pull/17404
  
    Oh, I see.  OK, I think labels is a fine feature.  Please submit in a different JIRA/PR.

