Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2016/01/24 01:05:39 UTC
[jira] [Resolved] (SPARK-12432) Make parts of the Spark testing API public to assist developers making their own tests.
[ https://issues.apache.org/jira/browse/SPARK-12432?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Josh Rosen resolved SPARK-12432.
--------------------------------
Resolution: Won't Fix
Closing as "Won't Fix", per the resolution of SPARK-12433.
> Make parts of the Spark testing API public to assist developers making their own tests.
> ---------------------------------------------------------------------------------------
>
> Key: SPARK-12432
> URL: https://issues.apache.org/jira/browse/SPARK-12432
> Project: Spark
> Issue Type: Improvement
> Components: Tests
> Reporter: holdenk
> Priority: Trivial
>
> Apache Spark has a wide variety of internal tests, but many of the APIs used for testing are private. To allow external developers to easily test their Spark programs, we should expose a minimal set of testing APIs. The spark-testing-base project provides a set of base classes, derived from Spark's existing testing code, that could serve as the foundation for a public test API for Spark.
> Since the tests for Streaming (and other components) depend on accessing Spark internals, maintaining these tests inside of Spark is a better fit than as an external component. Testing is also a core component of any project using Spark that moves beyond the proof-of-concept phase.
> The proposed API (which is very much based on Spark's current internal API) is at https://docs.google.com/document/d/1FBg1qWbzjW0fKt6YxzNEn7vJ7l2y1ZfeM0CSP3JJIBI/edit?usp=sharing .
> This issue can be broken up into three sub issues to expose testing utils for core, streaming, and sql.
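> The base-class pattern that spark-testing-base provides (and that this ticket proposes making public) can be sketched roughly as follows. This is an illustrative sketch only: the trait name `SharedSparkContext` is modeled on spark-testing-base's convention, and ScalaTest's `BeforeAndAfterAll` lifecycle is assumed; it is not Spark's actual internal API.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, Suite}

// Hypothetical sketch of a shared-context testing trait, modeled on
// spark-testing-base. A suite mixing this in gets a local SparkContext
// created once before all tests and stopped after the last one.
trait SharedSparkContext extends BeforeAndAfterAll { self: Suite =>
  @transient private var _sc: SparkContext = _

  // Tests access the shared context through this accessor.
  def sc: SparkContext = _sc

  override def beforeAll(): Unit = {
    super.beforeAll()
    _sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName(suiteName))
  }

  override def afterAll(): Unit = {
    try {
      if (_sc != null) _sc.stop()
    } finally {
      super.afterAll()
    }
  }
}
```

> A user's test suite would then mix the trait in, e.g. `class MyJobSpec extends FunSuite with SharedSparkContext`, and use `sc` directly in its test bodies without managing context lifecycle itself.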
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)