Posted to dev@lucene.apache.org by "Mark Miller (JIRA)" <ji...@apache.org> on 2017/03/08 04:54:46 UTC

[jira] [Commented] (SOLR-10032) Create report to assess Solr test quality at a commit point.

    [ https://issues.apache.org/jira/browse/SOLR-10032?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15900701#comment-15900701 ] 

Mark Miller commented on SOLR-10032:
------------------------------------

I've got a few issues to look into, but when I'm done I'll run a new report and work on removing any tests failing over 5% of the time.
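
To make that 5% cutoff concrete, here is a minimal sketch in Python; the one-line-per-run "<TestClassName> <PASS|FAIL>" log format (and the script itself) is an assumption, not the actual output of the beasting scripts or the attached report spreadsheets:

#!/usr/bin/env python3
# Minimal sketch (not the actual beasting tooling): tally per-test failure
# rates from a plain log with one "<TestClassName> <PASS|FAIL>" line per run,
# then flag anything failing in more than 5% of its runs.
import sys
from collections import Counter

runs, fails = Counter(), Counter()

with open(sys.argv[1]) as log:
    for line in log:
        parts = line.split()
        if len(parts) != 2:
            continue  # skip anything that isn't a "<test> <result>" pair
        test, result = parts
        runs[test] += 1
        if result == "FAIL":
            fails[test] += 1

# Sort by flake rate, worst first, and print only the > 5% offenders.
for test in sorted(runs, key=lambda t: fails[t] / runs[t], reverse=True):
    rate = fails[test] / runs[test]
    if rate > 0.05:
        print(f"{test}: {fails[test]}/{runs[test]} failed ({rate:.1%})")

With 30 iterations per test, a single failure is already about 3.3%, so two failures (6.7%) would put a test over that line.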

> Create report to assess Solr test quality at a commit point.
> ------------------------------------------------------------
>
>                 Key: SOLR-10032
>                 URL: https://issues.apache.org/jira/browse/SOLR-10032
>             Project: Solr
>          Issue Type: Task
>      Security Level: Public(Default Security Level. Issues are Public) 
>          Components: Tests
>            Reporter: Mark Miller
>            Assignee: Mark Miller
>         Attachments: Lucene-Solr Master Test Beast Results 01-24-2017-9899cbd031dc3fc37a384b1f9e2b379e90a9a3a6 Level Medium- Running 30 iterations, 12 at a time .pdf, Lucene-Solr Master Test Beasults 02-01-2017-bbc455de195c83d9f807980b510fa46018f33b1b Level Medium- Running 30 iterations, 10 at a time.pdf, Lucene-Solr Master Test Beasults 02-08-2017-6696eafaae18948c2891ce758c7a2ec09873dab8 Level Medium+- Running 30 iterations, 10 at a time, 8 cores.pdf, Lucene-Solr Master Test Beasults 02-14-2017- Level Medium+-a1f114f70f3800292c25be08213edf39b3e37f6a Running 30 iterations, 10 at a time, 8 cores.pdf, Lucene-Solr Master Test Beasults 02%2F17%2F2017-19c8ec2bf1882bed1bb34d0b55198d03f2018838 Level Hard Running 100 iterations, 12 at a time, 8 cores.pdf
>
>
> We have many Jenkins instances blasting tests: some official, some Policeman, and I and others have or have had our own. The email trail proves the power of the Jenkins cluster at finding test failures.
> However, I still have a very hard time answering some basic questions:
> Which tests are flakey right now? Which test failures actually affect devs the most? Did I break it? Was that test already flakey? Is that test still flakey? What are our worst tests right now? Is that test getting better or worse?
> We really need a way to see exactly which tests are the problem, not because of OS or environmental issues, but because of more basic test quality issues: which tests are flakey, and how flakey are they at any point in time (a rough sketch of diffing that per-test rate between two report runs follows the links below).
> Reports:
> https://drive.google.com/drive/folders/0ByYyjsrbz7-qa2dOaU1UZDdRVzg?usp=sharing
> 01/24/2017 - https://docs.google.com/spreadsheets/d/1JySta2j2s7A_p16wA1UO-l6c4GsUHBIb4FONS2EzW9k/edit?usp=sharing
> 02/01/2017 - https://docs.google.com/spreadsheets/d/1FndoyHmihaOVL2o_Zns5alpNdAJlNsEwQVoJ4XDWj3c/edit?usp=sharing
> 02/08/2017 - https://docs.google.com/spreadsheets/d/1N6RxH4Edd7ldRIaVfin0si-uSLGyowQi8-7mcux27S0/edit?usp=sharing
> 02/14/2017 - https://docs.google.com/spreadsheets/d/1eZ9_ds_0XyqsKKp8xkmESrcMZRP85jTxSKkNwgtcUn0/edit?usp=sharing
> 02/17/2017 - https://docs.google.com/spreadsheets/d/1LEPvXbsoHtKfIcZCJZ3_P6OHp7S5g2HP2OJgU6B2sAg/edit?usp=sharing
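
For the "is that test getting better or worse?" question above, here is a hedged sketch of diffing two of those report runs; the CSV export and its "test" / "flake_rate" columns are assumed names, not the actual layout of the linked spreadsheets:

#!/usr/bin/env python3
# Minimal sketch (column names are assumptions, not the actual spreadsheet
# layout): compare per-test flake rates between two exported report CSVs to
# see which tests got better or worse between two commit points.
import csv
import sys

def load(path):
    """Map test name -> flake rate (0.0 to 1.0) from a CSV with
    'test' and 'flake_rate' columns (assumed names)."""
    with open(path, newline="") as f:
        return {row["test"]: float(row["flake_rate"]) for row in csv.DictReader(f)}

old, new = load(sys.argv[1]), load(sys.argv[2])

for test in sorted(set(old) & set(new)):
    delta = new[test] - old[test]
    if abs(delta) >= 0.01:  # ignore moves under one percentage point
        trend = "worse" if delta > 0 else "better"
        print(f"{test}: {old[test]:.1%} -> {new[test]:.1%} ({trend})")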


