Posted to issues@spark.apache.org by "Andrew Ash (JIRA)" <ji...@apache.org> on 2014/11/14 12:07:34 UTC

[jira] [Commented] (SPARK-1358) Continuous integrated test should be involved in Spark ecosystem

    [ https://issues.apache.org/jira/browse/SPARK-1358?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14212125#comment-14212125 ] 

Andrew Ash commented on SPARK-1358:
-----------------------------------

I've heard these sorts of extended tests called end-to-end tests.  A sample one for Spark could be to stand up an HDFS cluster, load some data into it, stand up a Spark cluster alongside it, read the data back out of HDFS, and compute some kind of aggregate.
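
A minimal sketch of what that could look like, using Hadoop's MiniDFSCluster as an in-process stand-in for a real HDFS cluster and a local-mode SparkContext (the data, paths, class name, and expected sum are illustrative; a real end-to-end run would target actual clusters):

{code:scala}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hdfs.MiniDFSCluster
import org.apache.spark.{SparkConf, SparkContext}

object EndToEndSketch {
  def main(args: Array[String]): Unit = {
    // Stand up a miniature HDFS cluster (an in-process stand-in for real hardware).
    val dfs = new MiniDFSCluster.Builder(new Configuration()).numDataNodes(2).build()
    try {
      // Load some data into it: the integers 1..100, one per line.
      val fs = dfs.getFileSystem
      val out = fs.create(new Path("/e2e/input.txt"))
      out.writeBytes((1 to 100).mkString("\n"))
      out.close()

      // Stand up Spark alongside it and read the data back out of HDFS.
      val sc = new SparkContext(
        new SparkConf().setMaster("local[2]").setAppName("e2e-sketch"))
      try {
        val sum = sc.textFile(fs.getUri.toString + "/e2e/input.txt")
          .map(_.trim.toLong)
          .reduce(_ + _)
        // Compute an aggregate and verify the whole pipeline end to end.
        assert(sum == 5050L, s"expected 5050, got $sum")
      } finally sc.stop()
    } finally dfs.shutdown()
  }
}
{code}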

They tend to require significant hardware to run, though, because they are much more resource-intensive than unit tests and can be long-running.  [~pwendell] is that kind of extended hardware available?  I know we currently run some CI on the AMPLab Jenkins, but I'm uncertain how much additional capacity it could support.

cc [~shaneknapp]

> Continuous integrated test should be involved in Spark ecosystem 
> -----------------------------------------------------------------
>
>                 Key: SPARK-1358
>                 URL: https://issues.apache.org/jira/browse/SPARK-1358
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: xiajunluan
>
> Currently, Spark only contains unit tests and performance tests, but I think that is not enough for customers to evaluate the status of their cluster and the Spark version they will use. It is necessary to build continuous integration tests for Spark development. These could include:
> 1. complex application test cases for Spark / Spark Streaming / GraphX / ...
> 2. stress test cases
> 3. fault-tolerance test cases
> 4. ...
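
To make items 2 and 3 above concrete: one cheap fault-tolerance case is to inject a task failure and assert the job still completes with the correct answer.  A minimal local-mode sketch (the failure-injection trick, the {{local[2,3]}} master string that permits task retries, and the names here are illustrative assumptions, not an existing Spark test):

{code:scala}
import java.util.concurrent.atomic.AtomicBoolean
import org.apache.spark.{SparkConf, SparkContext}

object FaultToleranceSketch {
  // Shared flag so exactly one task attempt fails (works in local mode: single JVM).
  private val failedOnce = new AtomicBoolean(false)

  def main(args: Array[String]): Unit = {
    // local[2,3] = 2 worker threads, up to 3 failures per task before aborting the job.
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2,3]").setAppName("ft-sketch"))
    try {
      val result = sc.parallelize(1 to 1000, 4).mapPartitionsWithIndex { (i, iter) =>
        // Simulate a lost task: partition 0 dies on its first attempt only.
        if (i == 0 && failedOnce.compareAndSet(false, true))
          throw new RuntimeException("simulated task failure")
        iter
      }.reduce(_ + _)
      // The retry should let the job succeed with the correct aggregate.
      assert(result == 500500, s"expected 500500, got $result")
    } finally sc.stop()
  }
}
{code}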



