Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2015/07/10 20:45:04 UTC

[jira] [Updated] (SPARK-8985) Create a test harness to improve Spark's combinatorial test coverage of non-default configurations

     [ https://issues.apache.org/jira/browse/SPARK-8985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen updated SPARK-8985:
------------------------------
    Summary: Create a test harness to improve Spark's combinatorial test coverage of non-default configurations  (was: Create a test harness to improve Spark's combinatorial test coverage of non-default configuration)

> Create a test harness to improve Spark's combinatorial test coverage of non-default configurations
> --------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-8985
>                 URL: https://issues.apache.org/jira/browse/SPARK-8985
>             Project: Spark
>          Issue Type: Bug
>          Components: Tests
>            Reporter: Josh Rosen
>
> A large number of Spark bugs could be caught by running a trivial set of end-to-end tests under non-default SparkConf configurations.
> This ticket exists to assemble a list of such bugs and the configurations that would have caught them.  I think we should build a separate Jenkins harness that runs end-to-end tests across a huge configuration matrix in order to detect these issues; a minimal sketch of one harness iteration follows this paragraph.  If the test configuration matrix grows too large to test daily, then we can explore combinatorial testing approaches (see the second sketch, after the bug list below) to test fewer configurations while still achieving a high level of combinatorial coverage.
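> As a starting point, here is a minimal sketch of what one iteration of such a harness could look like.  The object and method names ({{ConfigMatrixSmokeTest}}, {{runSmokeTest}}) and the matrix entries are illustrative only, not a proposal for the final design:
> {code:scala}
> // Minimal sketch (illustrative names, not a final design): run one trivial
> // end-to-end job per set of non-default settings and report failures.
> import org.apache.spark.{SparkConf, SparkContext}
>
> object ConfigMatrixSmokeTest {
>   // Hypothetical slice of the matrix; the real list would be assembled
>   // from bugs like the ones catalogued below.
>   val configMatrix: Seq[Map[String, String]] = Seq(
>     Map("spark.python.worker.reuse" -> "false"),
>     Map("spark.serializer" -> "org.apache.spark.serializer.KryoSerializer"),
>     Map("spark.shuffle.compress" -> "false")
>   )
>
>   def runSmokeTest(overrides: Map[String, String]): Boolean = {
>     val conf = new SparkConf().setMaster("local[2]").setAppName("config-smoke")
>     overrides.foreach { case (k, v) => conf.set(k, v) }
>     val sc = new SparkContext(conf)
>     try {
>       // Any trivial end-to-end job will do; a wrong result or a crash
>       // flags the configuration for investigation.
>       sc.parallelize(1 to 100, 4).map(_ * 2).reduce(_ + _) == 10100
>     } finally {
>       sc.stop()
>     }
>   }
>
>   def main(args: Array[String]): Unit = {
>     val failing = configMatrix.filterNot(runSmokeTest)
>     failing.foreach(c => println(s"FAILED under non-default config: $c"))
>     if (failing.nonEmpty) sys.exit(1)
>   }
> }
> {code}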
> **Bugs, listed by the test configuration that would have caught them:**
> * spark.python.worker.reuse=false:
>    ** SPARK-8976
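> To make the scaling concern concrete, the exhaustive matrix is the Cartesian product of every option's value list, so it grows multiplicatively with each option we add.  The options and values in this second sketch are illustrative only; a pairwise covering array (built by hand or with a tool such as PICT) could hit every pair of option values in far fewer runs:
> {code:scala}
> // Illustrative option/value lists only; the exhaustive matrix is their
> // Cartesian product, so it doubles with every boolean option added.
> object ConfigMatrixSize {
>   val options: List[(String, Seq[String])] = List(
>     "spark.python.worker.reuse"  -> Seq("true", "false"),
>     "spark.shuffle.compress"     -> Seq("true", "false"),
>     "spark.serializer"           -> Seq(
>       "org.apache.spark.serializer.JavaSerializer",
>       "org.apache.spark.serializer.KryoSerializer"),
>     "spark.io.compression.codec" -> Seq("lz4", "lzf", "snappy")
>   )
>
>   // Full Cartesian product of all option/value choices.
>   def cartesian(opts: List[(String, Seq[String])]): Seq[Map[String, String]] =
>     opts match {
>       case Nil => Seq(Map.empty)
>       case (key, values) :: rest =>
>         for (v <- values; m <- cartesian(rest)) yield m + (key -> v)
>     }
>
>   def main(args: Array[String]): Unit = {
>     // 2 * 2 * 2 * 3 = 24 configurations here; a pairwise covering array
>     // would need far fewer runs while still hitting every value pair.
>     println(s"Exhaustive matrix size: ${cartesian(options).size}")
>   }
> }
> {code}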



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org