Posted to issues@systemml.apache.org by "Nakul Jindal (JIRA)" <ji...@apache.org> on 2017/03/31 23:20:41 UTC

[jira] [Comment Edited] (SYSTEMML-1451) Automate performance testing and reporting

    [ https://issues.apache.org/jira/browse/SYSTEMML-1451?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15951795#comment-15951795 ] 

Nakul Jindal edited comment on SYSTEMML-1451 at 3/31/17 11:20 PM:
------------------------------------------------------------------

+1 [~mwdusenb@us.ibm.com]
We should at least do this for the algorithms that we use for performance testing before the potential GSoC student begins the project. The rest of the algorithms can be taken up by the student, or by us, later.


was (Author: nakul02):
+1 [~mwdusenb@us.ibm.com]
We should at least do this for the algorithms that we use for performance testing before the potential GSoC student begins the project.

> Automate performance testing and reporting
> ------------------------------------------
>
>                 Key: SYSTEMML-1451
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-1451
>             Project: SystemML
>          Issue Type: Improvement
>          Components: Infrastructure, Test
>            Reporter: Nakul Jindal
>              Labels: gsoc2017, mentor, performance, reporting, testing
>
> As part of a release (and in general), performance tests are run for SystemML.
> Currently, running and reporting on these performance tests is a largely manual process. Helper scripts exist, but most steps are done by hand.
> The aim of this GSoC 2017 project is to automate performance testing and its reporting.
> This entails the following tasks:
> 1. Automate running of the performance tests, including generation of test data
> 2. Detect and report any errors
> 3. Record performance benchmarking information
> 4. Automatically compare this performance against previous versions to check for performance regressions
> 5. Automatically compare to Spark MLlib, R?, Julia?
> 6. Prepare a report covering failed jobs, performance data, and comparisons against other comparable projects/algorithms (plotted or in plain text, as CSV, PDF, or another common format)
> 7. Create scripts to automatically run this process on a cloud provider: spin up machines, run the tests, save the reports, and spin down the machines.
> 8. Create a web application to do this interactively without dropping down into a shell.
> As part of this project, the student will need to know scripting (in Bash, Python, etc.). It may also involve changing error reporting and performance reporting code in SystemML.
> Rating - Medium (for the amount of work)
> Mentor - [~nakul02] (Other co-mentors will join in)
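
To make task 4 concrete, here is a minimal, hypothetical sketch of the regression check: compare the current run's timings against a stored baseline and flag any algorithm whose runtime grew by more than a chosen threshold. The function name, threshold, and algorithm timings below are all illustrative assumptions, not part of the SystemML test harness.

```python
# Hypothetical sketch of task 4: flag performance regressions by comparing
# current per-algorithm timings (seconds) against a stored baseline.
# All names and numbers here are illustrative, not from SystemML itself.

def find_regressions(baseline, current, threshold=0.10):
    """Return {algorithm: fractional slowdown} for runs that got more than
    `threshold` slower than their baseline time."""
    regressions = {}
    for algo, base_time in baseline.items():
        cur_time = current.get(algo)
        if cur_time is None:
            # Algorithm missing from the current run; a real report
            # would list these separately as failed/skipped jobs.
            continue
        slowdown = (cur_time - base_time) / base_time
        if slowdown > threshold:
            regressions[algo] = round(slowdown, 3)
    return regressions

baseline = {"LinearRegCG": 12.0, "MultiLogReg": 30.0, "Kmeans": 8.0}
current  = {"LinearRegCG": 12.4, "MultiLogReg": 36.0, "Kmeans": 7.9}
print(find_regressions(baseline, current))  # {'MultiLogReg': 0.2}
```

In practice the baseline would be loaded from the recorded results of a previous release (task 3), and the output would feed into the generated report (task 6).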



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)