Posted to dev@lucene.apache.org by "Mark Miller (JIRA)" <ji...@apache.org> on 2017/01/25 15:19:26 UTC
[jira] [Created] (SOLR-10032) Create report to assess Solr test quality at a commit point.
Mark Miller created SOLR-10032:
----------------------------------
Summary: Create report to assess Solr test quality at a commit point.
Key: SOLR-10032
URL: https://issues.apache.org/jira/browse/SOLR-10032
Project: Solr
Issue Type: Task
Security Level: Public (Default Security Level. Issues are Public)
Components: Tests
Reporter: Mark Miller
Assignee: Mark Miller
We have many Jenkins instances blasting tests: some official, some Policeman, and some that I and others have run (or used to run) ourselves. The email trail proves the power of the Jenkins cluster to find test failures.
However, I still have a very hard time with some basic questions:
What tests are flakey right now? Which test failures actually affect devs most? Did I break it? Was that test already flakey? Is that test still flakey? What are our worst tests right now? Is that test getting better or worse?
We really need a way to see exactly which tests are the problem, not because of OS or environmental issues, but because of more basic test quality issues: which tests are flakey, and how flakey are they, at any point in time?
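The issue does not prescribe an implementation, but the core metric it asks for, how flakey a test is at a point in time, could be sketched as a per-test failure rate computed over a window of recent Jenkins runs. The following is a hypothetical illustration (the function name, input shape, and data are assumptions, not part of any Solr tooling):

```python
# Hypothetical sketch: given per-run test results collected from Jenkins,
# compute a flakiness rate per test (fraction of runs in which it failed)
# and rank tests worst-first, so the flakiest tests top the report.
from collections import defaultdict

def flakiness_report(runs):
    """runs: iterable of (test_name, passed) tuples, one per executed run."""
    totals = defaultdict(int)
    failures = defaultdict(int)
    for name, passed in runs:
        totals[name] += 1
        if not passed:
            failures[name] += 1
    return sorted(
        ((name, failures[name] / totals[name]) for name in totals),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Assumed sample data: three runs of TestFoo (one failure), two of TestBar.
runs = [
    ("TestFoo", True), ("TestFoo", False), ("TestFoo", True),
    ("TestBar", True), ("TestBar", True),
]
report = flakiness_report(runs)
print(report)
```

Comparing such reports across commit points would answer "is that test getting better or worse?"; restricting the window to runs since a given commit would answer "did I break it, or was it already flakey?".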
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
For additional commands, e-mail: dev-help@lucene.apache.org