Posted to reviews@spark.apache.org by rvesse <gi...@git.apache.org> on 2018/01/05 17:33:15 UTC

[GitHub] spark pull request #20167: Allow providing Mesos principal & secret via file...

GitHub user rvesse opened a pull request:

    https://github.com/apache/spark/pull/20167

    Allow providing Mesos principal & secret via files (SPARK-16501)

    ## What changes were proposed in this pull request?
    
    This commit modifies the Mesos submission client to allow the principal
    and secret to be provided indirectly via files.  The path to these files
    can be specified either via Spark configuration or via environment
    variable.
    
    Assuming these files are appropriately protected by FS/OS permissions,
    this means we never leak the actual values in process information such
    as `ps` output.
    
    Environment variable specification is useful because it allows you to
    interpolate the location of these files when using per-user Mesos
    credentials.
    
    For some background on why we have taken this approach, I will briefly describe our set-up. On our systems we provide each authorised user account with its own Mesos credentials in order to provide certain security and audit guarantees to our customers. These credentials are managed by a central secret management service. In our `spark-env.sh` we determine the appropriate secret and principal files to use depending on the user who is invoking Spark, hence the need to inject these via environment variables as well as via configuration properties. We set these environment variables accordingly and Spark reads in the contents of those files to authenticate itself with Mesos.
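    
    For illustration, here is a minimal sketch of how such a set-up might look. The paths and the use of `$USER` are hypothetical; only the `spark.mesos.principal.file`/`spark.mesos.secret.file` properties and the corresponding `SPARK_MESOS_PRINCIPAL_FILE`/`SPARK_MESOS_SECRET_FILE` environment variables come from this change.
    
    ```sh
    # spark-env.sh: select per-user credential files (hypothetical layout)
    export SPARK_MESOS_PRINCIPAL_FILE="/etc/mesos-creds/${USER}/principal"
    export SPARK_MESOS_SECRET_FILE="/etc/mesos-creds/${USER}/secret"
    ```
    
    Equivalently, the file locations could be passed as configuration properties (example values only):
    
    ```sh
    spark-submit \
      --conf spark.mesos.principal.file=/etc/mesos-creds/alice/principal \
      --conf spark.mesos.secret.file=/etc/mesos-creds/alice/secret \
      ...
    ```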
    
    ## How was this patch tested?
    
    This is functionality we have been using in production across multiple customer sites for some time. It has been in the field for around 18 months with no reported issues. These changes have been sufficient to meet our customers' security and audit requirements.
    
    We have been building and deploying custom builds of Apache Spark with various minor tweaks like this, which we are now looking to contribute back to the community so that we can rely upon stock Apache Spark builds and stop maintaining our own internal fork.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/rvesse/spark SPARK-16501

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20167.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20167
    
----
commit d1bd57d29a79bfcfde33e37a6935e761456f438c
Author: Rob Vesse <rv...@...>
Date:   2018-01-05T17:19:44Z

    Allow providing Mesos principal & secret via files (SPARK-16501)
    
    This commit modifies the Mesos submission client to allow the principal
    and secret to be provided indirectly via files.  The path to these files
    can be specified either via Spark configuration or via environment
    variable.
    
    Assuming these files are appropriately protected by FS/OS permissions
    this means we don't ever leak the actual values in process info like ps
    
    Environment variable specification is useful because it allows you to
    interpolate the location of this file when using per-user Mesos
    credentials.

----


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167061319
  
    --- Diff: docs/running-on-mesos.md ---
    @@ -427,15 +437,30 @@ See the [configuration page](configuration.html) for information on Spark config
       <td><code>spark.mesos.principal</code></td>
       <td>(none)</td>
       <td>
    -    Set the principal with which Spark framework will use to authenticate with Mesos.
    +    Set the principal with which Spark framework will use to authenticate with Mesos.  You can also specify this via the environment variable `SPARK_MESOS_PRINCIPAL`
    --- End diff --
    
    nit: add period


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    ok to test


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167321486
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -106,6 +99,40 @@ trait MesosSchedulerUtils extends Logging {
         }
       }
     
    +  def buildCredentials(
    +      conf: SparkConf,
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    +      .orElse(
    +        conf.getOption("spark.mesos.principal.file")
    +          .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))
    +          .map { principalFile =>
    +            Files.toString(new File(principalFile), StandardCharsets.UTF_8)
    +          }
    +      ).foreach { principal =>
    +          fwInfoBuilder.setPrincipal(principal)
    +          credBuilder.setPrincipal(principal)
    +    }
    --- End diff --
    
    nit: this needs to be indented further


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by rvesse <gi...@git.apache.org>.
Github user rvesse commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r161484346
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -80,10 +80,27 @@ trait MesosSchedulerUtils extends Logging {
         }
         fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
           conf.get(DRIVER_HOST_ADDRESS)))
    +    conf.getOption("spark.mesos.principal.file")
    --- End diff --
    
    We have customers who operate secure multi-tenant environments who consider even leaking the principals of users from other tenants to be a security issue, i.e. they want to minimise what users from one tenant can learn about users from another.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by rvesse <gi...@git.apache.org>.
Github user rvesse commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    I've fixed up the Scala style issues, so I think this is ready to merge.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167061652
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -106,6 +99,40 @@ trait MesosSchedulerUtils extends Logging {
         }
       }
     
    +  def buildCredentials(
    +      conf: SparkConf,
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    +      .orElse(
    +        conf.getOption("spark.mesos.principal.file")
    +          .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))
    +          .map { principalFile =>
    +              Files.toString(new File(principalFile), Charset.forName("UTF-8"))
    --- End diff --
    
    nit: indented too far. Also, we generally use `StandardCharsets.UTF_8`.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r165497310
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -71,40 +74,64 @@ trait MesosSchedulerUtils extends Logging {
           failoverTimeout: Option[Double] = None,
           frameworkId: Option[String] = None): SchedulerDriver = {
         val fwInfoBuilder = FrameworkInfo.newBuilder().setUser(sparkUser).setName(appName)
    -    val credBuilder = Credential.newBuilder()
    +    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    +      conf.get(DRIVER_HOST_ADDRESS)))
         webuiUrl.foreach { url => fwInfoBuilder.setWebuiUrl(url) }
         checkpoint.foreach { checkpoint => fwInfoBuilder.setCheckpoint(checkpoint) }
         failoverTimeout.foreach { timeout => fwInfoBuilder.setFailoverTimeout(timeout) }
         frameworkId.foreach { id =>
           fwInfoBuilder.setId(FrameworkID.newBuilder().setValue(id).build())
         }
    -    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    -      conf.get(DRIVER_HOST_ADDRESS)))
    -    conf.getOption("spark.mesos.principal").foreach { principal =>
    -      fwInfoBuilder.setPrincipal(principal)
    -      credBuilder.setPrincipal(principal)
    -    }
    -    conf.getOption("spark.mesos.secret").foreach { secret =>
    -      credBuilder.setSecret(secret)
    -    }
    -    if (credBuilder.hasSecret && !fwInfoBuilder.hasPrincipal) {
    -      throw new SparkException(
    -        "spark.mesos.principal must be configured when spark.mesos.secret is set")
    -    }
    +
         conf.getOption("spark.mesos.role").foreach { role =>
           fwInfoBuilder.setRole(role)
         }
         val maxGpus = conf.getInt("spark.mesos.gpus.max", 0)
         if (maxGpus > 0) {
           fwInfoBuilder.addCapabilities(Capability.newBuilder().setType(Capability.Type.GPU_RESOURCES))
         }
    +    val credBuilder = buildCredentials(conf, fwInfoBuilder)
         if (credBuilder.hasPrincipal) {
           new MesosSchedulerDriver(
             scheduler, fwInfoBuilder.build(), masterUrl, credBuilder.build())
         } else {
           new MesosSchedulerDriver(scheduler, fwInfoBuilder.build(), masterUrl)
         }
       }
    +  
    +  def buildCredentials(
    +      conf: SparkConf, 
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    --- End diff --
    
    So have you guys reached a conclusion here? I'm not familiar with mesos deployments so I don't really know whether there's a problem here.


---



[GitHub] spark pull request #20167: Allow providing Mesos principal & secret via file...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r161075477
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -80,10 +80,27 @@ trait MesosSchedulerUtils extends Logging {
         }
         fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
           conf.get(DRIVER_HOST_ADDRESS)))
    +    conf.getOption("spark.mesos.principal.file")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE"))
    +      .foreach { principalFile =>
    +        val file = io.Source.fromFile(principalFile)
    +        val principal = file.getLines.next()
    +        file.close
    +        fwInfoBuilder.setPrincipal(principal)
    +        credBuilder.setPrincipal(principal)
    +      }
         conf.getOption("spark.mesos.principal").foreach { principal =>
           fwInfoBuilder.setPrincipal(principal)
           credBuilder.setPrincipal(principal)
         }
    +    conf.getOption("spark.mesos.secret.file")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_SECRET_FILE"))
    +      .foreach { secretFile =>
    +       val file = io.Source.fromFile(secretFile)
    --- End diff --
    
    You should use `Files.toString()` (from Guava) here (simpler and also handles errors properly).
    
    The whole block is also indented incorrectly.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by susanxhuynh <gi...@git.apache.org>.
Github user susanxhuynh commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r166086691
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -71,40 +74,64 @@ trait MesosSchedulerUtils extends Logging {
           failoverTimeout: Option[Double] = None,
           frameworkId: Option[String] = None): SchedulerDriver = {
         val fwInfoBuilder = FrameworkInfo.newBuilder().setUser(sparkUser).setName(appName)
    -    val credBuilder = Credential.newBuilder()
    +    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    +      conf.get(DRIVER_HOST_ADDRESS)))
         webuiUrl.foreach { url => fwInfoBuilder.setWebuiUrl(url) }
         checkpoint.foreach { checkpoint => fwInfoBuilder.setCheckpoint(checkpoint) }
         failoverTimeout.foreach { timeout => fwInfoBuilder.setFailoverTimeout(timeout) }
         frameworkId.foreach { id =>
           fwInfoBuilder.setId(FrameworkID.newBuilder().setValue(id).build())
         }
    -    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    -      conf.get(DRIVER_HOST_ADDRESS)))
    -    conf.getOption("spark.mesos.principal").foreach { principal =>
    -      fwInfoBuilder.setPrincipal(principal)
    -      credBuilder.setPrincipal(principal)
    -    }
    -    conf.getOption("spark.mesos.secret").foreach { secret =>
    -      credBuilder.setSecret(secret)
    -    }
    -    if (credBuilder.hasSecret && !fwInfoBuilder.hasPrincipal) {
    -      throw new SparkException(
    -        "spark.mesos.principal must be configured when spark.mesos.secret is set")
    -    }
    +
         conf.getOption("spark.mesos.role").foreach { role =>
           fwInfoBuilder.setRole(role)
         }
         val maxGpus = conf.getInt("spark.mesos.gpus.max", 0)
         if (maxGpus > 0) {
           fwInfoBuilder.addCapabilities(Capability.newBuilder().setType(Capability.Type.GPU_RESOURCES))
         }
    +    val credBuilder = buildCredentials(conf, fwInfoBuilder)
         if (credBuilder.hasPrincipal) {
           new MesosSchedulerDriver(
             scheduler, fwInfoBuilder.build(), masterUrl, credBuilder.build())
         } else {
           new MesosSchedulerDriver(scheduler, fwInfoBuilder.build(), masterUrl)
         }
       }
    +  
    +  def buildCredentials(
    +      conf: SparkConf, 
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    --- End diff --
    
    I think it's fine to set that env var, SPARK_MESOS_PRINCIPAL.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by ArtRand <gi...@git.apache.org>.
Github user ArtRand commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r165994809
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -71,40 +74,64 @@ trait MesosSchedulerUtils extends Logging {
           failoverTimeout: Option[Double] = None,
           frameworkId: Option[String] = None): SchedulerDriver = {
         val fwInfoBuilder = FrameworkInfo.newBuilder().setUser(sparkUser).setName(appName)
    -    val credBuilder = Credential.newBuilder()
    +    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    +      conf.get(DRIVER_HOST_ADDRESS)))
         webuiUrl.foreach { url => fwInfoBuilder.setWebuiUrl(url) }
         checkpoint.foreach { checkpoint => fwInfoBuilder.setCheckpoint(checkpoint) }
         failoverTimeout.foreach { timeout => fwInfoBuilder.setFailoverTimeout(timeout) }
         frameworkId.foreach { id =>
           fwInfoBuilder.setId(FrameworkID.newBuilder().setValue(id).build())
         }
    -    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    -      conf.get(DRIVER_HOST_ADDRESS)))
    -    conf.getOption("spark.mesos.principal").foreach { principal =>
    -      fwInfoBuilder.setPrincipal(principal)
    -      credBuilder.setPrincipal(principal)
    -    }
    -    conf.getOption("spark.mesos.secret").foreach { secret =>
    -      credBuilder.setSecret(secret)
    -    }
    -    if (credBuilder.hasSecret && !fwInfoBuilder.hasPrincipal) {
    -      throw new SparkException(
    -        "spark.mesos.principal must be configured when spark.mesos.secret is set")
    -    }
    +
         conf.getOption("spark.mesos.role").foreach { role =>
           fwInfoBuilder.setRole(role)
         }
         val maxGpus = conf.getInt("spark.mesos.gpus.max", 0)
         if (maxGpus > 0) {
           fwInfoBuilder.addCapabilities(Capability.newBuilder().setType(Capability.Type.GPU_RESOURCES))
         }
    +    val credBuilder = buildCredentials(conf, fwInfoBuilder)
         if (credBuilder.hasPrincipal) {
           new MesosSchedulerDriver(
             scheduler, fwInfoBuilder.build(), masterUrl, credBuilder.build())
         } else {
           new MesosSchedulerDriver(scheduler, fwInfoBuilder.build(), masterUrl)
         }
       }
    +  
    +  def buildCredentials(
    +      conf: SparkConf, 
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    --- End diff --
    
    I would want to make sure that @susanxhuynh and/or @skonto agree, but I think this is probably fine. 


---



[GitHub] spark pull request #20167: Allow providing Mesos principal & secret via file...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r161075359
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -80,10 +80,27 @@ trait MesosSchedulerUtils extends Logging {
         }
         fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
           conf.get(DRIVER_HOST_ADDRESS)))
    +    conf.getOption("spark.mesos.principal.file")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE"))
    +      .foreach { principalFile =>
    +        val file = io.Source.fromFile(principalFile)
    +        val principal = file.getLines.next()
    +        file.close
    +        fwInfoBuilder.setPrincipal(principal)
    +        credBuilder.setPrincipal(principal)
    +      }
         conf.getOption("spark.mesos.principal").foreach { principal =>
           fwInfoBuilder.setPrincipal(principal)
           credBuilder.setPrincipal(principal)
         }
    +    conf.getOption("spark.mesos.secret.file")
    --- End diff --
    
    This should be merged with `getOption("spark.mesos.secret")` below.
    
    ```
    conf.getOption("spark.mesos.secret")
      .orElse(/* Read optional file-based config into string */)
    ```
      


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by ArtRand <gi...@git.apache.org>.
Github user ArtRand commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r164825718
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -71,40 +74,64 @@ trait MesosSchedulerUtils extends Logging {
           failoverTimeout: Option[Double] = None,
           frameworkId: Option[String] = None): SchedulerDriver = {
         val fwInfoBuilder = FrameworkInfo.newBuilder().setUser(sparkUser).setName(appName)
    -    val credBuilder = Credential.newBuilder()
    +    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    +      conf.get(DRIVER_HOST_ADDRESS)))
         webuiUrl.foreach { url => fwInfoBuilder.setWebuiUrl(url) }
         checkpoint.foreach { checkpoint => fwInfoBuilder.setCheckpoint(checkpoint) }
         failoverTimeout.foreach { timeout => fwInfoBuilder.setFailoverTimeout(timeout) }
         frameworkId.foreach { id =>
           fwInfoBuilder.setId(FrameworkID.newBuilder().setValue(id).build())
         }
    -    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    -      conf.get(DRIVER_HOST_ADDRESS)))
    -    conf.getOption("spark.mesos.principal").foreach { principal =>
    -      fwInfoBuilder.setPrincipal(principal)
    -      credBuilder.setPrincipal(principal)
    -    }
    -    conf.getOption("spark.mesos.secret").foreach { secret =>
    -      credBuilder.setSecret(secret)
    -    }
    -    if (credBuilder.hasSecret && !fwInfoBuilder.hasPrincipal) {
    -      throw new SparkException(
    -        "spark.mesos.principal must be configured when spark.mesos.secret is set")
    -    }
    +
         conf.getOption("spark.mesos.role").foreach { role =>
           fwInfoBuilder.setRole(role)
         }
         val maxGpus = conf.getInt("spark.mesos.gpus.max", 0)
         if (maxGpus > 0) {
           fwInfoBuilder.addCapabilities(Capability.newBuilder().setType(Capability.Type.GPU_RESOURCES))
         }
    +    val credBuilder = buildCredentials(conf, fwInfoBuilder)
         if (credBuilder.hasPrincipal) {
           new MesosSchedulerDriver(
             scheduler, fwInfoBuilder.build(), masterUrl, credBuilder.build())
         } else {
           new MesosSchedulerDriver(scheduler, fwInfoBuilder.build(), masterUrl)
         }
       }
    +  
    +  def buildCredentials(
    +      conf: SparkConf, 
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    --- End diff --
    
    Sorry for the delay. I have a use case where I start the Dispatcher in the Mesos cluster and then execute `spark-submit` cluster calls from within the container. Unfortunately this requires me to unset a few environment variables (`MESOS_EXECUTOR_ID MESOS_FRAMEWORK_ID MESOS_SLAVE_ID MESOS_SLAVE_PID MESOS_TASK_ID`) because they interfere with `spark-submit` due to this function in the rest client. 
    
    If the Dispatcher is started in a mode where it needs these Mesos authentication credentials, can we assume that we'll want to always forward them this same way? I realize I might be getting into the weeds here and this might be a _me_ problem. But I thought I'd bring it up.


---



[GitHub] spark issue #20167: Allow providing Mesos principal & secret via files (SPAR...

Posted by rvesse <gi...@git.apache.org>.
Github user rvesse commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    @felixcheung Well currently you are required to provide them in clear text either via `--conf` arguments or in your `spark-defaults.conf`, which was unacceptable to our security-conscious users. The former is trivially accessible via `ps` output and the latter tends to be a world-readable file, so both methods are insecure.
    
    Providing some indirection by only having to provide Spark with a path to the file that contains the secrets is an improvement. As noted in my description, if these files are properly protected this prevents other users from discovering or exposing those secrets via `ps`, `history` or other means.
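    
    For example (a rough sketch only; the paths and permissions here are assumptions about what "properly protected" could mean, not part of the patch):
    
    ```sh
    # credential files readable only by the submitting user (hypothetical layout)
    chmod 600 ~/.mesos/principal ~/.mesos/secret
    
    spark-submit \
      --conf spark.mesos.principal.file=$HOME/.mesos/principal \
      --conf spark.mesos.secret.file=$HOME/.mesos/secret \
      ...
    ```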


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by rvesse <gi...@git.apache.org>.
Github user rvesse commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r164123285
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -71,40 +74,64 @@ trait MesosSchedulerUtils extends Logging {
           failoverTimeout: Option[Double] = None,
           frameworkId: Option[String] = None): SchedulerDriver = {
         val fwInfoBuilder = FrameworkInfo.newBuilder().setUser(sparkUser).setName(appName)
    -    val credBuilder = Credential.newBuilder()
    +    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    +      conf.get(DRIVER_HOST_ADDRESS)))
         webuiUrl.foreach { url => fwInfoBuilder.setWebuiUrl(url) }
         checkpoint.foreach { checkpoint => fwInfoBuilder.setCheckpoint(checkpoint) }
         failoverTimeout.foreach { timeout => fwInfoBuilder.setFailoverTimeout(timeout) }
         frameworkId.foreach { id =>
           fwInfoBuilder.setId(FrameworkID.newBuilder().setValue(id).build())
         }
    -    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    -      conf.get(DRIVER_HOST_ADDRESS)))
    -    conf.getOption("spark.mesos.principal").foreach { principal =>
    -      fwInfoBuilder.setPrincipal(principal)
    -      credBuilder.setPrincipal(principal)
    -    }
    -    conf.getOption("spark.mesos.secret").foreach { secret =>
    -      credBuilder.setSecret(secret)
    -    }
    -    if (credBuilder.hasSecret && !fwInfoBuilder.hasPrincipal) {
    -      throw new SparkException(
    -        "spark.mesos.principal must be configured when spark.mesos.secret is set")
    -    }
    +
         conf.getOption("spark.mesos.role").foreach { role =>
           fwInfoBuilder.setRole(role)
         }
         val maxGpus = conf.getInt("spark.mesos.gpus.max", 0)
         if (maxGpus > 0) {
           fwInfoBuilder.addCapabilities(Capability.newBuilder().setType(Capability.Type.GPU_RESOURCES))
         }
    +    val credBuilder = buildCredentials(conf, fwInfoBuilder)
         if (credBuilder.hasPrincipal) {
           new MesosSchedulerDriver(
             scheduler, fwInfoBuilder.build(), masterUrl, credBuilder.build())
         } else {
           new MesosSchedulerDriver(scheduler, fwInfoBuilder.build(), masterUrl)
         }
       }
    +  
    +  def buildCredentials(
    +      conf: SparkConf, 
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    --- End diff --
    
    I am not sure I understand enough of spark internals to answer your question.
    
    These variables are only necessary for Mesos, and only on the driver, which registers the framework with Mesos. Is it actually possible to submit jobs via REST into a Mesos cluster? Even if it is, the Spark framework must already exist at that point, rendering credentials unnecessary in that scenario.
    
    Or am I missing something here?
    



---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    > updating documentation appropriately which will need to be a separate PR
    
    Actually the documentation is in `docs/running-on-mesos.md`.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by rvesse <gi...@git.apache.org>.
Github user rvesse commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    @vanzin @ArtRand Thanks for the initial reviews. I have refactored the patch based on the comments, so it should hopefully be in a better state now. I was also able to add some unit test coverage for this functionality.
    
    I will also look at updating the documentation appropriately, which will need to be a separate PR to the apache/spark-website repository.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    **[Test build #87103 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87103/testReport)** for PR 20167 at commit [`38d340c`](https://github.com/apache/spark/commit/38d340ce93158d475f1eb7443aa25f65890dc828).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/87258/
    Test PASSed.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    **[Test build #87259 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87259/testReport)** for PR 20167 at commit [`30e1cbd`](https://github.com/apache/spark/commit/30e1cbd89ca801e0c6d0ac28b5d10554921b72d9).


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/87259/
    Test PASSed.


---



[GitHub] spark issue #20167: Allow providing Mesos principal & secret via files (SPAR...

Posted by ArtRand <gi...@git.apache.org>.
Github user ArtRand commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Hello @rvesse, thanks for this patch! With Mesos now having [reference secrets](https://github.com/apache/mesos/blob/master/include/mesos/mesos.proto#L2596), I think providing this as a file is a natural evolution.
    
    @vanzin, @rvesse, why not also allow `spark.authenticate.secret` to be a file? We have a [patch](https://github.com/mesosphere/spark/commit/a897c1ca07e67935b8ae80cfc1d41370fe8ef60b) at Mesosphere that allows this. It can be in a different JIRA/PR; I just want an opinion.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Merged build finished. Test PASSed.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by ArtRand <gi...@git.apache.org>.
Github user ArtRand commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r163001901
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -71,40 +74,64 @@ trait MesosSchedulerUtils extends Logging {
           failoverTimeout: Option[Double] = None,
           frameworkId: Option[String] = None): SchedulerDriver = {
         val fwInfoBuilder = FrameworkInfo.newBuilder().setUser(sparkUser).setName(appName)
    -    val credBuilder = Credential.newBuilder()
    +    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    +      conf.get(DRIVER_HOST_ADDRESS)))
         webuiUrl.foreach { url => fwInfoBuilder.setWebuiUrl(url) }
         checkpoint.foreach { checkpoint => fwInfoBuilder.setCheckpoint(checkpoint) }
         failoverTimeout.foreach { timeout => fwInfoBuilder.setFailoverTimeout(timeout) }
         frameworkId.foreach { id =>
           fwInfoBuilder.setId(FrameworkID.newBuilder().setValue(id).build())
         }
    -    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    -      conf.get(DRIVER_HOST_ADDRESS)))
    -    conf.getOption("spark.mesos.principal").foreach { principal =>
    -      fwInfoBuilder.setPrincipal(principal)
    -      credBuilder.setPrincipal(principal)
    -    }
    -    conf.getOption("spark.mesos.secret").foreach { secret =>
    -      credBuilder.setSecret(secret)
    -    }
    -    if (credBuilder.hasSecret && !fwInfoBuilder.hasPrincipal) {
    -      throw new SparkException(
    -        "spark.mesos.principal must be configured when spark.mesos.secret is set")
    -    }
    +
         conf.getOption("spark.mesos.role").foreach { role =>
           fwInfoBuilder.setRole(role)
         }
         val maxGpus = conf.getInt("spark.mesos.gpus.max", 0)
         if (maxGpus > 0) {
           fwInfoBuilder.addCapabilities(Capability.newBuilder().setType(Capability.Type.GPU_RESOURCES))
         }
    +    val credBuilder = buildCredentials(conf, fwInfoBuilder)
         if (credBuilder.hasPrincipal) {
           new MesosSchedulerDriver(
             scheduler, fwInfoBuilder.build(), masterUrl, credBuilder.build())
         } else {
           new MesosSchedulerDriver(scheduler, fwInfoBuilder.build(), masterUrl)
         }
       }
    +  
    +  def buildCredentials(
    +      conf: SparkConf, 
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    --- End diff --
    
    If you use this environment variable then won't these always be set when using the [Rest client](https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala#L407)? Is this the desired behavior?


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    **[Test build #87259 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87259/testReport)** for PR 20167 at commit [`30e1cbd`](https://github.com/apache/spark/commit/30e1cbd89ca801e0c6d0ac28b5d10554921b72d9).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167062721
  
    --- Diff: resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtilsSuite.scala ---
    @@ -237,4 +242,157 @@ class MesosSchedulerUtilsSuite extends SparkFunSuite with Matchers with MockitoS
         val portsToUse = getRangesFromResources(resourcesToBeUsed).map{r => r._1}
         portsToUse.isEmpty shouldBe true
       }
    +
    +  test("Principal specified via spark.mesos.principal") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal", "test-principal")
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +
    +  test("Principal specified via spark.mesos.principal.file") {
    +    val pFile = File.createTempFile("MesosSchedulerUtilsSuite", ".txt");
    +    pFile.deleteOnExit()
    +    Files.write("test-principal".getBytes("UTF-8"), pFile);
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal.file", pFile.getAbsolutePath())
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +
    +  test("Principal specified via spark.mesos.principal.file that does not exist") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal.file", "/tmp/does-not-exist")
    +
    +    intercept[FileNotFoundException] {
    +      val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    }
    +  }
    +
    +  test("Principal specified via SPARK_MESOS_PRINCIPAL") {
    +    val conf = new SparkConfWithEnv(Map("SPARK_MESOS_PRINCIPAL" -> "test-principal"))
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +
    +  test("Principal specified via SPARK_MESOS_PRINCIPAL_FILE") {
    +    val pFile = File.createTempFile("MesosSchedulerUtilsSuite", ".txt");
    +    pFile.deleteOnExit()
    +    Files.write("test-principal".getBytes("UTF-8"), pFile);
    +    val conf = new SparkConfWithEnv(Map("SPARK_MESOS_PRINCIPAL_FILE" -> pFile.getAbsolutePath()))
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +
    +  test("Principal specified via SPARK_MESOS_PRINCIPAL_FILE that does not exist") {
    +    val conf = new SparkConfWithEnv(Map("SPARK_MESOS_PRINCIPAL_FILE" -> "/tmp/does-not-exist"))
    +
    +    intercept[FileNotFoundException] {
    +      val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    --- End diff --
    
    `val credBuilder = ` is unnecessary.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    **[Test build #87258 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87258/testReport)** for PR 20167 at commit [`232043f`](https://github.com/apache/spark/commit/232043f72df53c000efbc789499de05a2208cebe).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/87104/
    Test PASSed.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    **[Test build #87258 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87258/testReport)** for PR 20167 at commit [`232043f`](https://github.com/apache/spark/commit/232043f72df53c000efbc789499de05a2208cebe).


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167062808
  
    --- Diff: resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtilsSuite.scala ---
    @@ -237,4 +242,157 @@ class MesosSchedulerUtilsSuite extends SparkFunSuite with Matchers with MockitoS
         val portsToUse = getRangesFromResources(resourcesToBeUsed).map{r => r._1}
         portsToUse.isEmpty shouldBe true
       }
    +
    +  test("Principal specified via spark.mesos.principal") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal", "test-principal")
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +
    +  test("Principal specified via spark.mesos.principal.file") {
    +    val pFile = File.createTempFile("MesosSchedulerUtilsSuite", ".txt");
    +    pFile.deleteOnExit()
    +    Files.write("test-principal".getBytes("UTF-8"), pFile);
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal.file", pFile.getAbsolutePath())
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +
    +  test("Principal specified via spark.mesos.principal.file that does not exist") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal.file", "/tmp/does-not-exist")
    +
    +    intercept[FileNotFoundException] {
    +      val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    }
    +  }
    +
    +  test("Principal specified via SPARK_MESOS_PRINCIPAL") {
    +    val conf = new SparkConfWithEnv(Map("SPARK_MESOS_PRINCIPAL" -> "test-principal"))
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +
    +  test("Principal specified via SPARK_MESOS_PRINCIPAL_FILE") {
    +    val pFile = File.createTempFile("MesosSchedulerUtilsSuite", ".txt");
    +    pFile.deleteOnExit()
    +    Files.write("test-principal".getBytes("UTF-8"), pFile);
    +    val conf = new SparkConfWithEnv(Map("SPARK_MESOS_PRINCIPAL_FILE" -> pFile.getAbsolutePath()))
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +
    +  test("Principal specified via SPARK_MESOS_PRINCIPAL_FILE that does not exist") {
    +    val conf = new SparkConfWithEnv(Map("SPARK_MESOS_PRINCIPAL_FILE" -> "/tmp/does-not-exist"))
    +
    +    intercept[FileNotFoundException] {
    +      val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    }
    +  }
    +
    +  test("Secret specified via spark.mesos.secret") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal", "test-principal")
    +    conf.set("spark.mesos.secret", "my-secret")
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +    credBuilder.hasSecret shouldBe true
    +    credBuilder.getSecret shouldBe "my-secret"
    +  }
    +
    +  test("Principal specified via spark.mesos.secret.file") {
    +    val sFile = File.createTempFile("MesosSchedulerUtilsSuite", ".txt");
    +    sFile.deleteOnExit()
    +    Files.write("my-secret".getBytes("UTF-8"), sFile);
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal", "test-principal")
    +    conf.set("spark.mesos.secret.file", sFile.getAbsolutePath())
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +    credBuilder.hasSecret shouldBe true
    +    credBuilder.getSecret shouldBe "my-secret"
    +  }
    +
    +  test("Principal specified via spark.mesos.secret.file that does not exist") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal", "test-principal")
    +    conf.set("spark.mesos.secret.file", "/tmp/does-not-exist")
    +
    +    intercept[FileNotFoundException] {
    +      val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    --- End diff --
    
    `val credBuilder = ` is unnecessary.


---



[GitHub] spark pull request #20167: Allow providing Mesos principal & secret via file...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r161074854
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -80,10 +80,27 @@ trait MesosSchedulerUtils extends Logging {
         }
         fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
           conf.get(DRIVER_HOST_ADDRESS)))
    +    conf.getOption("spark.mesos.principal.file")
    --- End diff --
    
    Why is it necessary to read the principal from a file? It's not a really sensitive parameter, and can be provided with `--conf` with no issues.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r162464326
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -71,40 +74,62 @@ trait MesosSchedulerUtils extends Logging {
           failoverTimeout: Option[Double] = None,
           frameworkId: Option[String] = None): SchedulerDriver = {
         val fwInfoBuilder = FrameworkInfo.newBuilder().setUser(sparkUser).setName(appName)
    -    val credBuilder = Credential.newBuilder()
    +    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    +      conf.get(DRIVER_HOST_ADDRESS)))
         webuiUrl.foreach { url => fwInfoBuilder.setWebuiUrl(url) }
         checkpoint.foreach { checkpoint => fwInfoBuilder.setCheckpoint(checkpoint) }
         failoverTimeout.foreach { timeout => fwInfoBuilder.setFailoverTimeout(timeout) }
         frameworkId.foreach { id =>
           fwInfoBuilder.setId(FrameworkID.newBuilder().setValue(id).build())
         }
    -    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    -      conf.get(DRIVER_HOST_ADDRESS)))
    -    conf.getOption("spark.mesos.principal").foreach { principal =>
    -      fwInfoBuilder.setPrincipal(principal)
    -      credBuilder.setPrincipal(principal)
    -    }
    -    conf.getOption("spark.mesos.secret").foreach { secret =>
    -      credBuilder.setSecret(secret)
    -    }
    -    if (credBuilder.hasSecret && !fwInfoBuilder.hasPrincipal) {
    -      throw new SparkException(
    -        "spark.mesos.principal must be configured when spark.mesos.secret is set")
    -    }
    +
         conf.getOption("spark.mesos.role").foreach { role =>
           fwInfoBuilder.setRole(role)
         }
         val maxGpus = conf.getInt("spark.mesos.gpus.max", 0)
         if (maxGpus > 0) {
           fwInfoBuilder.addCapabilities(Capability.newBuilder().setType(Capability.Type.GPU_RESOURCES))
         }
    +    val credBuilder = buildCredentials(conf, fwInfoBuilder)
         if (credBuilder.hasPrincipal) {
           new MesosSchedulerDriver(
             scheduler, fwInfoBuilder.build(), masterUrl, credBuilder.build())
         } else {
           new MesosSchedulerDriver(scheduler, fwInfoBuilder.build(), masterUrl)
         }
       }
    +  
    +  def buildCredentials(
    +      conf: SparkConf, 
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(
    +        conf.getOption("spark.mesos.principal.file")
    +          .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))
    +          .map { principalFile =>
    +            Files.toString(new File(principalFile), Charset.forName("UTF-8"))
    +          }
    +      ).foreach { principal =>
    +        fwInfoBuilder.setPrincipal(principal)
    +        credBuilder.setPrincipal(principal)
    +    }
    +    conf.getOption("spark.mesos.secret")
    +      .orElse(
    +        conf.getOption("spark.mesos.secret.file")
    +         .orElse(Option(conf.getenv("SPARK_MESOS_SECRET_FILE")))
    +         .map { secretFile =>
    +           Files.toString(new File(secretFile), Charset.forName("UTF-8"))
    +         }
    +      ).foreach { secret =>
    +      credBuilder.setSecret(secret)
    --- End diff --
    
    nit: indendation is wrong


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167061786
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -106,6 +99,40 @@ trait MesosSchedulerUtils extends Logging {
         }
       }
     
    +  def buildCredentials(
    +      conf: SparkConf,
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    +      .orElse(
    +        conf.getOption("spark.mesos.principal.file")
    +          .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))
    +          .map { principalFile =>
    +              Files.toString(new File(principalFile), Charset.forName("UTF-8"))
    +          }
    +      ).foreach { principal =>
    +         fwInfoBuilder.setPrincipal(principal)
    +         credBuilder.setPrincipal(principal)
    +    }
    --- End diff --
    
    nit: indent a little more


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by rvesse <gi...@git.apache.org>.
Github user rvesse commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    @vanzin Thanks for the detailed review. I have now added the ability to also specify the credentials directly via environment variables. I have added additional unit test coverage and a new section to the documentation covering these improvements.
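
    To illustrate the direct environment variable path, a test for it might look roughly like the file-based ones quoted later in this thread (a sketch only; `SparkConfWithEnv`, `utils` and `FrameworkInfo` are the helpers already used in that suite, and the exact test body is not necessarily what was committed):

    ```
    test("Principal specified via SPARK_MESOS_PRINCIPAL") {
      // SparkConfWithEnv lets the suite fake environment variables, as in the quoted tests
      val conf = new SparkConfWithEnv(Map("SPARK_MESOS_PRINCIPAL" -> "test-principal"))

      val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
      credBuilder.hasPrincipal shouldBe true
      credBuilder.getPrincipal shouldBe "test-principal"
    }
    ```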


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167321536
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -106,6 +99,40 @@ trait MesosSchedulerUtils extends Logging {
         }
       }
     
    +  def buildCredentials(
    +      conf: SparkConf,
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    +      .orElse(
    +        conf.getOption("spark.mesos.principal.file")
    +          .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))
    +          .map { principalFile =>
    +            Files.toString(new File(principalFile), StandardCharsets.UTF_8)
    +          }
    +      ).foreach { principal =>
    +          fwInfoBuilder.setPrincipal(principal)
    +          credBuilder.setPrincipal(principal)
    +    }
    +    conf.getOption("spark.mesos.secret")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_SECRET")))
    +      .orElse(
    +        conf.getOption("spark.mesos.secret.file")
    +         .orElse(Option(conf.getenv("SPARK_MESOS_SECRET_FILE")))
    +         .map { secretFile =>
    +           Files.toString(new File(secretFile), StandardCharsets.UTF_8)
    +         }
    +      ).foreach { secret =>
    +          credBuilder.setSecret(secret)
    --- End diff --
    
    same indentation problem here.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by ArtRand <gi...@git.apache.org>.
Github user ArtRand commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r162997172
  
    --- Diff: docs/running-on-mesos.md ---
    @@ -427,15 +437,30 @@ See the [configuration page](configuration.html) for information on Spark config
       <td><code>spark.mesos.principal</code></td>
       <td>(none)</td>
       <td>
    -    Set the principal with which Spark framework will use to authenticate with Mesos.
    +    Set the principal with which Spark framework will use to authenticate with Mesos.  You can also specify this via the environment variable `SPARK_MESOS_PRINCIPAL`
    +  </td>
    +</tr>
    +<tr>
    +  <td><code>spark.mesos.principal.file</code></td>
    +  <td>(none)</td>
    +  <td>
    +    Set the file containing the principal with which Spark framework will use to authenticate with Mesos.  Allows specifying the principal indirectly in more security conscious deployments.  The file must be readable by the user launching the job.  You can also specify this via the environment variable `SPARK_MESOS_PRINCIPAL_FILE`
       </td>
     </tr>
     <tr>
       <td><code>spark.mesos.secret</code></td>
       <td>(none)</td>
       <td>
         Set the secret with which Spark framework will use to authenticate with Mesos. Used, for example, when
    -    authenticating with the registry.
    +    authenticating with the registry.  You can also specify this via the environment variable `SPARK_MESOS_SECRET`
    +  </td>
    +</tr>
    +<tr>
    +  <td><code>spark.mesos.secret.file</code></td>
    +  <td>(none)</td>
    +  <td>
    --- End diff --
    
    Maybe comment on the format of the secret? I.e. we assume this is a plaintext secret, not a binary one? Is that true? 
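
     Judging from the `buildCredentials` diffs elsewhere in this thread, the file is read with Guava's `Files.toString(..., UTF-8)`, so a plaintext UTF-8 secret seems to be assumed. A rough sketch of that read (`secretFile` is just a placeholder name here):

     ```
     // Sketch of how the quoted implementation reads the secret file; secretFile is hypothetical
     val secret = Files.toString(new File(secretFile), StandardCharsets.UTF_8)
     ```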


---



[GitHub] spark issue #20167: Allow providing Mesos principal & secret via files (SPAR...

Posted by rvesse <gi...@git.apache.org>.
Github user rvesse commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    CC @ArtRand @vanzin I would appreciate your reviews as and when you have time


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167062039
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -106,6 +99,40 @@ trait MesosSchedulerUtils extends Logging {
         }
       }
     
    +  def buildCredentials(
    +      conf: SparkConf,
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    +      .orElse(
    +        conf.getOption("spark.mesos.principal.file")
    +          .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))
    +          .map { principalFile =>
    +              Files.toString(new File(principalFile), Charset.forName("UTF-8"))
    +          }
    +      ).foreach { principal =>
    +         fwInfoBuilder.setPrincipal(principal)
    +         credBuilder.setPrincipal(principal)
    +    }
    +    conf.getOption("spark.mesos.secret")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_SECRET")))
    +      .orElse(
    +        conf.getOption("spark.mesos.secret.file")
    +         .orElse(Option(conf.getenv("SPARK_MESOS_SECRET_FILE")))
    +         .map { secretFile =>
    +           Files.toString(new File(secretFile), Charset.forName("UTF-8"))
    +         }
    +      ).foreach { secret =>
    +         credBuilder.setSecret(secret)
    --- End diff --
    
    nit: this is indented 3 spaces, not 2.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/87103/
    Test FAILed.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167321443
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -106,6 +99,40 @@ trait MesosSchedulerUtils extends Logging {
         }
       }
     
    +  def buildCredentials(
    +      conf: SparkConf,
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    +      .orElse(
    +        conf.getOption("spark.mesos.principal.file")
    +          .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))
    +          .map { principalFile =>
    +            Files.toString(new File(principalFile), StandardCharsets.UTF_8)
    +          }
    +      ).foreach { principal =>
    +          fwInfoBuilder.setPrincipal(principal)
    --- End diff --
    
    nit: this block is indented 4 spaces not 2.


---



[GitHub] spark pull request #20167: Allow providing Mesos principal & secret via file...

Posted by ArtRand <gi...@git.apache.org>.
Github user ArtRand commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r161385124
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -80,10 +80,27 @@ trait MesosSchedulerUtils extends Logging {
         }
         fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
           conf.get(DRIVER_HOST_ADDRESS)))
    +    conf.getOption("spark.mesos.principal.file")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE"))
    +      .foreach { principalFile =>
    +        val file = io.Source.fromFile(principalFile)
    +        val principal = file.getLines.next()
    +        file.close
    +        fwInfoBuilder.setPrincipal(principal)
    +        credBuilder.setPrincipal(principal)
    +      }
         conf.getOption("spark.mesos.principal").foreach { principal =>
           fwInfoBuilder.setPrincipal(principal)
           credBuilder.setPrincipal(principal)
         }
    +    conf.getOption("spark.mesos.secret.file")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_SECRET_FILE"))
    --- End diff --
    
     Environment-variable secrets (even using the `secrets` primitive up to Mesos 1.4) will be available at `/proc/<pid>/environ`, so file-based is more secure. I'm tempted to say having an insecure _option_ is worse than having less flexibility. 


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167062632
  
    --- Diff: resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtilsSuite.scala ---
    @@ -237,4 +242,157 @@ class MesosSchedulerUtilsSuite extends SparkFunSuite with Matchers with MockitoS
         val portsToUse = getRangesFromResources(resourcesToBeUsed).map{r => r._1}
         portsToUse.isEmpty shouldBe true
       }
    +
    +  test("Principal specified via spark.mesos.principal") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal", "test-principal")
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +
    +  test("Principal specified via spark.mesos.principal.file") {
    +    val pFile = File.createTempFile("MesosSchedulerUtilsSuite", ".txt");
    +    pFile.deleteOnExit()
    +    Files.write("test-principal".getBytes("UTF-8"), pFile);
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal.file", pFile.getAbsolutePath())
    +
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +
    +  test("Principal specified via spark.mesos.principal.file that does not exist") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal.file", "/tmp/does-not-exist")
    +
    +    intercept[FileNotFoundException] {
    +      val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    --- End diff --
    
    `val credBuilder =` is unnecessary.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167061911
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -106,6 +99,40 @@ trait MesosSchedulerUtils extends Logging {
         }
       }
     
    +  def buildCredentials(
    +      conf: SparkConf,
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    +      .orElse(
    +        conf.getOption("spark.mesos.principal.file")
    +          .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))
    +          .map { principalFile =>
    +              Files.toString(new File(principalFile), Charset.forName("UTF-8"))
    +          }
    +      ).foreach { principal =>
    +         fwInfoBuilder.setPrincipal(principal)
    +         credBuilder.setPrincipal(principal)
    +    }
    +    conf.getOption("spark.mesos.secret")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_SECRET")))
    +      .orElse(
    +        conf.getOption("spark.mesos.secret.file")
    +         .orElse(Option(conf.getenv("SPARK_MESOS_SECRET_FILE")))
    +         .map { secretFile =>
    +           Files.toString(new File(secretFile), Charset.forName("UTF-8"))
    +         }
    +      ).foreach { secret =>
    +         credBuilder.setSecret(secret)
    +    }
    --- End diff --
    
    nit: add indent


---



[GitHub] spark issue #20167: Allow providing Mesos principal & secret via files (SPAR...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    is putting secrets in plain text files a good practice?


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r161860003
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -80,10 +80,27 @@ trait MesosSchedulerUtils extends Logging {
         }
         fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
           conf.get(DRIVER_HOST_ADDRESS)))
    +    conf.getOption("spark.mesos.principal.file")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE"))
    +      .foreach { principalFile =>
    +        val file = io.Source.fromFile(principalFile)
    +        val principal = file.getLines.next()
    +        file.close
    +        fwInfoBuilder.setPrincipal(principal)
    +        credBuilder.setPrincipal(principal)
    +      }
         conf.getOption("spark.mesos.principal").foreach { principal =>
           fwInfoBuilder.setPrincipal(principal)
           credBuilder.setPrincipal(principal)
         }
    +    conf.getOption("spark.mesos.secret.file")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_SECRET_FILE"))
    --- End diff --
    
    You can't read `/proc/<pid>/environ` from other users' processes.
    
    ```
    $ cat /proc/1/environ
    cat: /proc/1/environ: Permission denied
    ```


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    **[Test build #87104 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87104/testReport)** for PR 20167 at commit [`8903ba6`](https://github.com/apache/spark/commit/8903ba641b4713799e6739ce71c9baf81183127f).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/20167


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    **[Test build #87104 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87104/testReport)** for PR 20167 at commit [`8903ba6`](https://github.com/apache/spark/commit/8903ba641b4713799e6739ce71c9baf81183127f).


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Merged build finished. Test FAILed.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    **[Test build #87103 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87103/testReport)** for PR 20167 at commit [`38d340c`](https://github.com/apache/spark/commit/38d340ce93158d475f1eb7443aa25f65890dc828).


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167062542
  
    --- Diff: resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtilsSuite.scala ---
    @@ -17,16 +17,21 @@
     
     package org.apache.spark.scheduler.cluster.mesos
     
    +import java.io.{File, FileNotFoundException}
    +
     import scala.collection.JavaConverters._
    +import scala.collection.immutable.Map
    --- End diff --
    
    Why do you need this explicit import?


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Merged build finished. Test PASSed.


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167061286
  
    --- Diff: docs/running-on-mesos.md ---
    @@ -82,6 +82,16 @@ a Spark driver program configured to connect to Mesos.
     Alternatively, you can also install Spark in the same location in all the Mesos slaves, and configure
     `spark.mesos.executor.home` (defaults to SPARK_HOME) to point to that location.
     
    +## Authenticating to Mesos
    +
    +When Mesos Framework authentication is enabled it is necessary to provide a principal and secret by which to authenticate Spark to Mesos.  Each Spark job will register with Mesos as a separate framework.
    +
    +Depending on your deployment environment you may wish to create a single set of framework credentials that are shared across all users or create framework credentials for each user.  Creating and managing framework credentials should be done following the Mesos [Authentication documentation](http://mesos.apache.org/documentation/latest/authentication/).
    +
    +Framework credentials may be specified in a variety of ways depending on your deployment environment and security requirements.  The most simple way is to specify the `spark.mesos.principal` and `spark.mesos.secret` values directly in your Spark configuration.  Alternatively you may specify these values indirectly by instead specifying `spark.mesos.principal.file` and `spark.mesos.secret.file`, these settings point to files containing the principal and secret.  These files must be plaintext files in UTF-8 encoding.  Combined with appropriate file ownership and mode/ACLs this provides a more secure way to specify these credentials.
    +
    +Additionally if you prefer to use environment variables you can specify all of the above via environment variables instead, the environment variable names are simply the configuration settings uppercased with `.` replaced with `_` e.g. `SPARK_MESOS_PRINCIPAL`.  Please note that if you specify multiple ways to obtain the credentials the direct specifications are used in preference to the indirect specifications.
    --- End diff --
    
     I think it'd be clearer to specify the precedence order instead of saying "direct vs. indirect". Both spark.mesos.principal and SPARK_MESOS_PRINCIPAL read as "direct" to me, and this paragraph doesn't explain which one is chosen.
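
     For what it's worth, the `orElse` chain in the quoted diff implies the following lookup order for the principal (the secret case mirrors it); spelling this order out in the docs would remove the ambiguity. Sketch only, using the same Guava `Files` call as the diff:

     ```
     conf.getOption("spark.mesos.principal")                           // 1. spark.mesos.principal
       .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))           // 2. SPARK_MESOS_PRINCIPAL
       .orElse(conf.getOption("spark.mesos.principal.file")            // 3. spark.mesos.principal.file
         .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))    // 4. SPARK_MESOS_PRINCIPAL_FILE
         .map(f => Files.toString(new File(f), StandardCharsets.UTF_8)))
     ```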


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Can one of the admins verify this patch?


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Can one of the admins verify this patch?


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by rvesse <gi...@git.apache.org>.
Github user rvesse commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    @vanzin OK, I should now have all of those addressed and the documentation clarified appropriately


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r167061871
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -106,6 +99,40 @@ trait MesosSchedulerUtils extends Logging {
         }
       }
     
    +  def buildCredentials(
    +      conf: SparkConf,
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL")))
    +      .orElse(
    +        conf.getOption("spark.mesos.principal.file")
    +          .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))
    +          .map { principalFile =>
    +              Files.toString(new File(principalFile), Charset.forName("UTF-8"))
    +          }
    +      ).foreach { principal =>
    +         fwInfoBuilder.setPrincipal(principal)
    +         credBuilder.setPrincipal(principal)
    +    }
    +    conf.getOption("spark.mesos.secret")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_SECRET")))
    +      .orElse(
    +        conf.getOption("spark.mesos.secret.file")
    +         .orElse(Option(conf.getenv("SPARK_MESOS_SECRET_FILE")))
    +         .map { secretFile =>
    +           Files.toString(new File(secretFile), Charset.forName("UTF-8"))
    --- End diff --
    
    Same re: `StandardCharsets`.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    **[Test build #86946 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/86946/testReport)** for PR 20167 at commit [`adc59ea`](https://github.com/apache/spark/commit/adc59ea26c67feff90a259da084b740dd615223f).


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/86946/
    Test FAILed.


---



[GitHub] spark pull request #20167: Allow providing Mesos principal & secret via file...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r161075101
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -80,10 +80,27 @@ trait MesosSchedulerUtils extends Logging {
         }
         fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
           conf.get(DRIVER_HOST_ADDRESS)))
    +    conf.getOption("spark.mesos.principal.file")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE"))
    +      .foreach { principalFile =>
    +        val file = io.Source.fromFile(principalFile)
    +        val principal = file.getLines.next()
    +        file.close
    +        fwInfoBuilder.setPrincipal(principal)
    +        credBuilder.setPrincipal(principal)
    +      }
         conf.getOption("spark.mesos.principal").foreach { principal =>
           fwInfoBuilder.setPrincipal(principal)
           credBuilder.setPrincipal(principal)
         }
    +    conf.getOption("spark.mesos.secret.file")
    +      .orElse(Option(conf.getenv("SPARK_MESOS_SECRET_FILE"))
    --- End diff --
    
    Why not provide the secret directly in the environment, instead of indirectly through a file?
    
    That is more secure than files in certain situations, so it would be nice to at least have that option.
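
     Concretely that would just be one more `orElse` ahead of the file-based lookups, along these lines (a sketch; the variable name is only a suggestion):

     ```
     val secret = conf.getOption("spark.mesos.secret")
       .orElse(Option(conf.getenv("SPARK_MESOS_SECRET")))   // direct value from the environment
       // ... then fall back to the file-based lookups as before
     ```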


---



[GitHub] spark issue #20167: Allow providing Mesos principal & secret via files (SPAR...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Can one of the admins verify this patch?


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r162465062
  
    --- Diff: resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtilsSuite.scala ---
    @@ -237,4 +242,88 @@ class MesosSchedulerUtilsSuite extends SparkFunSuite with Matchers with MockitoS
         val portsToUse = getRangesFromResources(resourcesToBeUsed).map{r => r._1}
         portsToUse.isEmpty shouldBe true
       }
    +  
    +  test("Principal specified via spark.mesos.principal") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal", "test-principal")
    +    
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +  
    +  test("Principal specified via spark.mesos.principal.file") {
    +    val pFile = File.createTempFile("MesosSchedulerUtilsSuite", ".txt");
    +    pFile.deleteOnExit()
    +    Files.write("test-principal".getBytes("UTF-8"), pFile);
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal.file", pFile.getAbsolutePath())
    +    
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +  
    +  test("Principal specified via SPARK_MESOS_PRINCIPAL_FILE") {
    +    val pFile = File.createTempFile("MesosSchedulerUtilsSuite", ".txt");
    +    pFile.deleteOnExit()
    +    Files.write("test-principal".getBytes("UTF-8"), pFile);
    +    val conf = new SparkConfWithEnv(Map("SPARK_MESOS_PRINCIPAL_FILE" -> pFile.getAbsolutePath()))
    +    
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +  }
    +  
    +  test("Secret specified via spark.mesos.secret") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal", "test-principal")
    +    conf.set("spark.mesos.secret", "my-secret")
    +    
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +    credBuilder.hasSecret shouldBe true
    +    credBuilder.getSecret shouldBe "my-secret"
    +  }
    +  
    +  test("Principal specified via spark.mesos.secret.file") {
    +    val sFile = File.createTempFile("MesosSchedulerUtilsSuite", ".txt");
    +    sFile.deleteOnExit()
    +    Files.write("my-secret".getBytes("UTF-8"), sFile);
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.principal", "test-principal")
    +    conf.set("spark.mesos.secret.file", sFile.getAbsolutePath())
    +    
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +    credBuilder.hasSecret shouldBe true
    +    credBuilder.getSecret shouldBe "my-secret"
    +  }
    +  
    +  test("Principal specified via SPARK_MESOS_SECRET_FILE") {
    +    val sFile = File.createTempFile("MesosSchedulerUtilsSuite", ".txt");
    +    sFile.deleteOnExit()
    +    Files.write("my-secret".getBytes("UTF-8"), sFile);
    +    
    +    val sFilePath = sFile.getAbsolutePath()
    +    val env = Map("SPARK_MESOS_SECRET_FILE" -> sFilePath)
    +    val conf = new SparkConfWithEnv(env)
    +    conf.set("spark.mesos.principal", "test-principal")
    +    conf.set("spark.mesos.secret.file", sFile.getAbsolutePath())
    +    
    +    val credBuilder = utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    +    credBuilder.hasPrincipal shouldBe true
    +    credBuilder.getPrincipal shouldBe "test-principal"
    +    credBuilder.hasSecret shouldBe true
    +    credBuilder.getSecret shouldBe "my-secret"
    +  }
    +  
    +  test("Secret specified with no principal") {
    +    val conf = new SparkConf()
    +    conf.set("spark.mesos.secret", "my-secret")
    +    
    +    an [SparkException] should be thrownBy utils.buildCredentials(conf, FrameworkInfo.newBuilder())
    --- End diff --
    
    We normally use a different approach:
    
    ```
    intercept[SparkException] {
      // code
    }
    ```
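     
     Applied to the check above, that would be something like:
     
     ```
     intercept[SparkException] {
       utils.buildCredentials(conf, FrameworkInfo.newBuilder())
     }
     ```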


---



[GitHub] spark pull request #20167: [SPARK-16501] [MESOS] Allow providing Mesos princ...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r162463766
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -71,40 +74,62 @@ trait MesosSchedulerUtils extends Logging {
           failoverTimeout: Option[Double] = None,
           frameworkId: Option[String] = None): SchedulerDriver = {
         val fwInfoBuilder = FrameworkInfo.newBuilder().setUser(sparkUser).setName(appName)
    -    val credBuilder = Credential.newBuilder()
    +    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    +      conf.get(DRIVER_HOST_ADDRESS)))
         webuiUrl.foreach { url => fwInfoBuilder.setWebuiUrl(url) }
         checkpoint.foreach { checkpoint => fwInfoBuilder.setCheckpoint(checkpoint) }
         failoverTimeout.foreach { timeout => fwInfoBuilder.setFailoverTimeout(timeout) }
         frameworkId.foreach { id =>
           fwInfoBuilder.setId(FrameworkID.newBuilder().setValue(id).build())
         }
    -    fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
    -      conf.get(DRIVER_HOST_ADDRESS)))
    -    conf.getOption("spark.mesos.principal").foreach { principal =>
    -      fwInfoBuilder.setPrincipal(principal)
    -      credBuilder.setPrincipal(principal)
    -    }
    -    conf.getOption("spark.mesos.secret").foreach { secret =>
    -      credBuilder.setSecret(secret)
    -    }
    -    if (credBuilder.hasSecret && !fwInfoBuilder.hasPrincipal) {
    -      throw new SparkException(
    -        "spark.mesos.principal must be configured when spark.mesos.secret is set")
    -    }
    +
         conf.getOption("spark.mesos.role").foreach { role =>
           fwInfoBuilder.setRole(role)
         }
         val maxGpus = conf.getInt("spark.mesos.gpus.max", 0)
         if (maxGpus > 0) {
           fwInfoBuilder.addCapabilities(Capability.newBuilder().setType(Capability.Type.GPU_RESOURCES))
         }
    +    val credBuilder = buildCredentials(conf, fwInfoBuilder)
         if (credBuilder.hasPrincipal) {
           new MesosSchedulerDriver(
             scheduler, fwInfoBuilder.build(), masterUrl, credBuilder.build())
         } else {
           new MesosSchedulerDriver(scheduler, fwInfoBuilder.build(), masterUrl)
         }
       }
    +  
    +  def buildCredentials(
    +      conf: SparkConf, 
    +      fwInfoBuilder: Protos.FrameworkInfo.Builder): Protos.Credential.Builder = {
    +    val credBuilder = Credential.newBuilder()
    +    conf.getOption("spark.mesos.principal")
    +      .orElse(
    +        conf.getOption("spark.mesos.principal.file")
    +          .orElse(Option(conf.getenv("SPARK_MESOS_PRINCIPAL_FILE")))
    +          .map { principalFile =>
    +            Files.toString(new File(principalFile), Charset.forName("UTF-8"))
    +          }
    +      ).foreach { principal =>
    +        fwInfoBuilder.setPrincipal(principal)
    +        credBuilder.setPrincipal(principal)
    +    }
    --- End diff --
    
    nit: indent extra level


---



[GitHub] spark pull request #20167: Allow providing Mesos principal & secret via file...

Posted by ArtRand <gi...@git.apache.org>.
Github user ArtRand commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20167#discussion_r161385080
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
    @@ -80,10 +80,27 @@ trait MesosSchedulerUtils extends Logging {
         }
         fwInfoBuilder.setHostname(Option(conf.getenv("SPARK_PUBLIC_DNS")).getOrElse(
           conf.get(DRIVER_HOST_ADDRESS)))
    +    conf.getOption("spark.mesos.principal.file")
    --- End diff --
    
    I agree. I don't think it's necessary to put the principal in a file, just the secret. 


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Merged build finished. Test FAILed.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    **[Test build #86946 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/86946/testReport)** for PR 20167 at commit [`adc59ea`](https://github.com/apache/spark/commit/adc59ea26c67feff90a259da084b740dd615223f).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark issue #20167: [SPARK-16501] [MESOS] Allow providing Mesos principal & ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20167
  
    Merged build finished. Test PASSed.


---
