Posted to reviews@spark.apache.org by pwendell <gi...@git.apache.org> on 2014/04/26 04:22:39 UTC

[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

GitHub user pwendell opened a pull request:

    https://github.com/apache/spark/pull/563

    SPARK-1606: Infer user application arguments instead of requiring --arg.

    This modifies spark-submit to do something more like the Hadoop `jar`
    command. Now we have the following syntax:
    
    ./bin/spark-submit [options] user.jar [user options]
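
    For example, a run under the new syntax might look like this (the
    class, jar, and application arguments are illustrative, not from the
    patch):

        ./bin/spark-submit --master local --class com.example.MyApp my-app.jar input.txt output/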

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/pwendell/spark spark-submit

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/563.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #563
    
----
commit bc48139a0ffbe458649b825af036f86b8bf181de
Author: Patrick Wendell <pw...@gmail.com>
Date:   2014-04-26T02:04:53Z

    SPARK-1606: Infer user application arguments instead of requiring --arg.
    
    This modifies spark-submit to do something more like the Hadoop `jar`
    command. Now we have the following syntax:
    
    ./bin/spark-submit [options] user.jar [user options]

----



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by mateiz <gi...@git.apache.org>.
Github user mateiz commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41454989
  
    Seems like a good change to me in terms of usability.



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by sryza <gi...@git.apache.org>.
Github user sryza commented on a diff in the pull request:

    https://github.com/apache/spark/pull/563#discussion_r12023968
  
    --- Diff: docs/cluster-overview.md ---
    @@ -73,30 +73,34 @@ the bin directory. This script takes care of setting up the classpath with Spark
     dependencies, and can support different cluster managers and deploy modes that Spark supports.
     It's usage is
     
    -    ./bin/spark-submit <app jar> --class path.to.your.Class [other options..]
    +    ./bin/spark-submit --class path.to.your.Class [options] <app jar> [app options]
     
    -To enumerate all options available to `spark-submit` run it with the `--help` flag.
    -Here are a few examples of common options:
    +When calling Spark submit `[app options]` will be passed along to your application's 
    --- End diff --
    
    Nits:
    Missing comma.  Also, is "Spark submit" vs. "spark-submit" intentional?



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41456485
  
    
    Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14507/



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41454854
  
    //cc @sryza @mengxr @mateiz



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41486584
  
    Okay then, I'll go ahead and merge this, and once SPARK-1636 and SPARK-1665 are finished we'll have a full solution for the MLlib examples.



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41476323
  
    All automated tests passed.
    Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14512/



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41454843
  
     Merged build triggered. 



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/563#discussion_r12022413
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
    @@ -156,119 +156,123 @@ private[spark] class SparkSubmitArguments(args: Array[String]) {
         """.stripMargin
       }
     
    -  private def parseOpts(opts: List[String]): Unit = opts match {
    -    case ("--name") :: value :: tail =>
    -      name = value
    -      parseOpts(tail)
    -
    -    case ("--master") :: value :: tail =>
    -      master = value
    -      parseOpts(tail)
    -
    -    case ("--class") :: value :: tail =>
    -      mainClass = value
    -      parseOpts(tail)
    -
    -    case ("--deploy-mode") :: value :: tail =>
    -      if (value != "client" && value != "cluster") {
    -        SparkSubmit.printErrorAndExit("--deploy-mode must be either \"client\" or \"cluster\"")
    -      }
    -      deployMode = value
    -      parseOpts(tail)
    -
    -    case ("--num-executors") :: value :: tail =>
    -      numExecutors = value
    -      parseOpts(tail)
    -
    -    case ("--total-executor-cores") :: value :: tail =>
    -      totalExecutorCores = value
    -      parseOpts(tail)
    -
    -    case ("--executor-cores") :: value :: tail =>
    -      executorCores = value
    -      parseOpts(tail)
    -
    -    case ("--executor-memory") :: value :: tail =>
    -      executorMemory = value
    -      parseOpts(tail)
    -
    -    case ("--driver-memory") :: value :: tail =>
    -      driverMemory = value
    -      parseOpts(tail)
    -
    -    case ("--driver-cores") :: value :: tail =>
    -      driverCores = value
    -      parseOpts(tail)
    -
    -    case ("--driver-class-path") :: value :: tail =>
    -      driverExtraClassPath = value
    -      parseOpts(tail)
    -
    -    case ("--driver-java-options") :: value :: tail =>
    -      driverExtraJavaOptions = value
    -      parseOpts(tail)
    -
    -    case ("--driver-library-path") :: value :: tail =>
    -      driverExtraLibraryPath = value
    -      parseOpts(tail)
    -
    -    case ("--properties-file") :: value :: tail =>
    -      propertiesFile = value
    -      parseOpts(tail)
    -
    -    case ("--supervise") :: tail =>
    -      supervise = true
    -      parseOpts(tail)
    -
    -    case ("--queue") :: value :: tail =>
    -      queue = value
    -      parseOpts(tail)
    -
    -    case ("--files") :: value :: tail =>
    -      files = value
    -      parseOpts(tail)
    -
    -    case ("--archives") :: value :: tail =>
    -      archives = value
    -      parseOpts(tail)
    -
    -    case ("--arg") :: value :: tail =>
    -      childArgs += value
    -      parseOpts(tail)
    -
    -    case ("--jars") :: value :: tail =>
    -      jars = value
    -      parseOpts(tail)
    -
    -    case ("--help" | "-h") :: tail =>
    -      printUsageAndExit(0)
    -
    -    case ("--verbose" | "-v") :: tail =>
    -      verbose = true
    -      parseOpts(tail)
    -
    -    case value :: tail =>
    -      if (value.startsWith("-")) {
    -        val errMessage = s"Unrecognized option '$value'."
    -        val suggestion: Option[String] = value match {
    -          case v if v.startsWith("--") && v.contains("=") =>
    -            val parts = v.split("=")
    -            Some(s"Perhaps you want '${parts(0)} ${parts(1)}'?")
    -          case _ =>
    -            None
    +  /** Fill in values by parsing user options. */
    +  private def parseOpts(opts: Seq[String]): Unit = {
    +    // Delineates parsing of Spark options from parsing of user options.
    +    var inSparkOpts = true
    +    parse(opts)
    +
    +    def parse(opts: Seq[String]): Unit = {
    +      opts match {
    +      case ("--name") :: value :: tail =>
    +        name = value
    +        parse(tail)
    +
    +      case ("--master") :: value :: tail =>
    +        master = value
    +        parse(tail)
    +
    +      case ("--class") :: value :: tail =>
    +        mainClass = value
    +        parse(tail)
    +
    +      case ("--deploy-mode") :: value :: tail =>
    +        if (value != "client" && value != "cluster") {
    +          SparkSubmit.printErrorAndExit("--deploy-mode must be either \"client\" or \"cluster\"")
    +        }
    +        deployMode = value
    +        parse(tail)
    +
    +      case ("--num-executors") :: value :: tail =>
    +        numExecutors = value
    +        parse(tail)
    +
    +      case ("--total-executor-cores") :: value :: tail =>
    +        totalExecutorCores = value
    +        parse(tail)
    +
    +      case ("--executor-cores") :: value :: tail =>
    +        executorCores = value
    +        parse(tail)
    +
    +      case ("--executor-memory") :: value :: tail =>
    +        executorMemory = value
    +        parse(tail)
    +
    +      case ("--driver-memory") :: value :: tail =>
    +        driverMemory = value
    +        parse(tail)
    +
    +      case ("--driver-cores") :: value :: tail =>
    +        driverCores = value
    +        parse(tail)
    +
    +      case ("--driver-class-path") :: value :: tail =>
    +        driverExtraClassPath = value
    +        parse(tail)
    +
    +      case ("--driver-java-options") :: value :: tail =>
    +        driverExtraJavaOptions = value
    +        parse(tail)
    +
    +      case ("--driver-library-path") :: value :: tail =>
    +        driverExtraLibraryPath = value
    +        parse(tail)
    +
    +      case ("--properties-file") :: value :: tail =>
    +        propertiesFile = value
    +        parse(tail)
    +
    +      case ("--supervise") :: tail =>
    +        supervise = true
    +        parse(tail)
    +
    +      case ("--queue") :: value :: tail =>
    +        queue = value
    +        parse(tail)
    +
    +      case ("--files") :: value :: tail =>
    +        files = value
    +        parse(tail)
    +
    +      case ("--archives") :: value :: tail =>
    +        archives = value
    +        parse(tail)
    +
    +      case ("--jars") :: value :: tail =>
    +        jars = value
    +        parse(tail)
    +
    +      case ("--help" | "-h") :: tail =>
    +        printUsageAndExit(0)
    +
    +      case ("--verbose" | "-v") :: tail =>
    +        verbose = true
    +        parse(tail)
    +
    +      case value :: tail =>
    +        if (inSparkOpts) {
    --- End diff --
    
    The only substantive change is here - the surrounding diff is mostly re-indentation.
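
    In other words, once the first bare (non-"-") token is seen, it is
    taken as the user's application jar and everything after it is passed
    through untouched. A tiny self-contained model of that behavior (the
    names and structure here are mine, not the patch's):

    ~~~scala
    object SubmitArgsSketch {
      /** Split args into (Spark options, user jar, user args) - a toy model. */
      def split(args: List[String]): (Map[String, String], Option[String], List[String]) = {
        var sparkOpts = Map.empty[String, String]
        var primaryResource: Option[String] = None  // hypothetical name for the user jar
        var childArgs = List.empty[String]
        var inSparkOpts = true

        def parse(opts: List[String]): Unit = opts match {
          case flag :: value :: tail if inSparkOpts && flag.startsWith("--") =>
            sparkOpts += (flag -> value)            // e.g. --master local
            parse(tail)
          case value :: tail =>
            if (inSparkOpts) {
              if (value.startsWith("-")) {
                // An unknown flag before the jar is still an error.
                sys.error(s"Unrecognized option '$value'.")
              } else {
                // The first bare token ends Spark-option parsing: it's the jar.
                primaryResource = Some(value)
                inSparkOpts = false
              }
            } else {
              // Everything after the jar belongs to the user application,
              // even if it starts with "-".
              childArgs :+= value
            }
            parse(tail)
          case Nil =>
        }

        parse(args)
        (sparkOpts, primaryResource, childArgs)
      }
    }
    ~~~

    e.g. `split(List("--master", "local", "user.jar", "-v", "out"))` yields
    `(Map("--master" -> "local"), Some("user.jar"), List("-v", "out"))`.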



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/563#discussion_r12022406
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
    @@ -156,119 +156,123 @@ private[spark] class SparkSubmitArguments(args: Array[String]) {
         """.stripMargin
       }
     
    -  private def parseOpts(opts: List[String]): Unit = opts match {
    --- End diff --
    
    This is a minor change but causes the diff to look large because of indentation.
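
    Stripped of the re-indented cases, the structural change is just this
    skeleton:

    ~~~scala
    // Before: one top-level recursive match.
    private def parseOpts(opts: List[String]): Unit = opts match {
      // ... cases ...
    }

    // After: the match moves into a nested function so the cases can share
    // the inSparkOpts flag, pushing every case one indent level deeper.
    private def parseOpts(opts: Seq[String]): Unit = {
      var inSparkOpts = true
      parse(opts)

      def parse(opts: Seq[String]): Unit = {
        opts match {
          // ... same cases ...
        }
      }
    }
    ~~~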



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41458964
  
    Merged build finished. All automated tests passed.



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/563



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41457420
  
     Merged build triggered. 



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by mengxr <gi...@git.apache.org>.
Github user mengxr commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41482557
  
    @pwendell A user may submit a job that is defined in Spark itself, e.g., `org.apache.spark.mllib.classification.SVMWithSGD`. In such a case, a jar is not a required argument, but the job class is.
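
    i.e. the desired invocation has no jar to occupy the positional slot:

        ./bin/spark-submit --class org.apache.spark.mllib.classification.SVMWithSGD [options] [app options]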



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41475106
  
    Thanks, I've addressed the feedback.



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by mateiz <gi...@git.apache.org>.
Github user mateiz commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41485626
  
    Yeah, I think we discussed moving the MLlib driver programs to examples.



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41484532
  
    @mengxr in the past we typically haven't exposed runnable programs from within Spark to users, except for two cases:
    
    1. The spark shell. (`./bin/spark-shell`)
    2. The examples. (`./bin/run-example`)
    
    I think these classes should probably be moved to examples, right? @mateiz what do you think?
    
    I recently refactored (1) to use the spark-submit script internally by passing a special name for the jar (`spark-internal`). I'd also like to have users submit the examples jar to spark-submit when running examples, as discussed in https://issues.apache.org/jira/browse/SPARK-1565.
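
    With that mechanism, a class shipped with Spark could in principle be
    launched the same way, e.g. (illustrative, not a committed interface):

        ./bin/spark-submit --class org.apache.spark.mllib.classification.SVMWithSGD [options] spark-internal [app options]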



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41456484
  
    Merged build finished. 



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41475185
  
     Merged build triggered. 



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41476321
  
    Merged build finished. All automated tests passed.



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/563#discussion_r12023282
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
    @@ -156,119 +156,123 @@ private[spark] class SparkSubmitArguments(args: Array[String]) {
         """.stripMargin
       }
     
    -  private def parseOpts(opts: List[String]): Unit = opts match {
    -    case ("--name") :: value :: tail =>
    -      name = value
    -      parseOpts(tail)
    -
    -    case ("--master") :: value :: tail =>
    -      master = value
    -      parseOpts(tail)
    -
    -    case ("--class") :: value :: tail =>
    -      mainClass = value
    -      parseOpts(tail)
    -
    -    case ("--deploy-mode") :: value :: tail =>
    -      if (value != "client" && value != "cluster") {
    -        SparkSubmit.printErrorAndExit("--deploy-mode must be either \"client\" or \"cluster\"")
    -      }
    -      deployMode = value
    -      parseOpts(tail)
    -
    -    case ("--num-executors") :: value :: tail =>
    -      numExecutors = value
    -      parseOpts(tail)
    -
    -    case ("--total-executor-cores") :: value :: tail =>
    -      totalExecutorCores = value
    -      parseOpts(tail)
    -
    -    case ("--executor-cores") :: value :: tail =>
    -      executorCores = value
    -      parseOpts(tail)
    -
    -    case ("--executor-memory") :: value :: tail =>
    -      executorMemory = value
    -      parseOpts(tail)
    -
    -    case ("--driver-memory") :: value :: tail =>
    -      driverMemory = value
    -      parseOpts(tail)
    -
    -    case ("--driver-cores") :: value :: tail =>
    -      driverCores = value
    -      parseOpts(tail)
    -
    -    case ("--driver-class-path") :: value :: tail =>
    -      driverExtraClassPath = value
    -      parseOpts(tail)
    -
    -    case ("--driver-java-options") :: value :: tail =>
    -      driverExtraJavaOptions = value
    -      parseOpts(tail)
    -
    -    case ("--driver-library-path") :: value :: tail =>
    -      driverExtraLibraryPath = value
    -      parseOpts(tail)
    -
    -    case ("--properties-file") :: value :: tail =>
    -      propertiesFile = value
    -      parseOpts(tail)
    -
    -    case ("--supervise") :: tail =>
    -      supervise = true
    -      parseOpts(tail)
    -
    -    case ("--queue") :: value :: tail =>
    -      queue = value
    -      parseOpts(tail)
    -
    -    case ("--files") :: value :: tail =>
    -      files = value
    -      parseOpts(tail)
    -
    -    case ("--archives") :: value :: tail =>
    -      archives = value
    -      parseOpts(tail)
    -
    -    case ("--arg") :: value :: tail =>
    -      childArgs += value
    -      parseOpts(tail)
    -
    -    case ("--jars") :: value :: tail =>
    -      jars = value
    -      parseOpts(tail)
    -
    -    case ("--help" | "-h") :: tail =>
    -      printUsageAndExit(0)
    -
    -    case ("--verbose" | "-v") :: tail =>
    -      verbose = true
    -      parseOpts(tail)
    -
    -    case value :: tail =>
    -      if (value.startsWith("-")) {
    -        val errMessage = s"Unrecognized option '$value'."
    -        val suggestion: Option[String] = value match {
    -          case v if v.startsWith("--") && v.contains("=") =>
    -            val parts = v.split("=")
    -            Some(s"Perhaps you want '${parts(0)} ${parts(1)}'?")
    -          case _ =>
    -            None
    +  /** Fill in values by parsing user options. */
    +  private def parseOpts(opts: Seq[String]): Unit = {
    +    // Delineates parsing of Spark options from parsing of user options.
    +    var inSparkOpts = true
    +    parse(opts)
    +
    +    def parse(opts: Seq[String]): Unit = {
    +      opts match {
    --- End diff --
    
    The indent is off here - maybe you want to put `opts match` on the previous line?
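
    i.e. something like:

    ~~~scala
    def parse(opts: Seq[String]): Unit = opts match {
      case ("--name") :: value :: tail =>
        name = value
        parse(tail)
      // ... remaining cases unchanged ...
    }
    ~~~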



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by mengxr <gi...@git.apache.org>.
Github user mengxr commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41479197
  
    It looks good! Would submitting Python jobs look the same?
    
    ~~~
    spark-submit [options] main.py [app options]
    ~~~



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41475187
  
    Merged build started. 



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41457429
  
    Merged build started. 



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41458965
  
    All automated tests passed.
    Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14508/



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by sryza <gi...@git.apache.org>.
Github user sryza commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41464953
  
    I am very much in favor of this.



[GitHub] spark pull request: SPARK-1606: Infer user application arguments i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/563#issuecomment-41454845
  
    Merged build started. 

