Posted to dev@geode.apache.org by Aaron Lindsey <aa...@apache.org> on 2020/01/01 00:52:20 UTC

Re: [DISCUSS] What should we do with @Ignore tests?

I’m in favor of deleting all except the ones that have JIRA tickets open for them, like Bruce said.

Also going forward I’d like to see us not be checking in @Ignored tests — just delete them instead. If we need to get it back we have revision history. Just my two cents.

Aaron
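Bruce's carve-out (keep only the @Ignore'd tests that have open JIRA tickets) implies a quick audit of where @Ignore appears and whether a ticket is referenced nearby. A minimal sketch of such an audit, assuming JUnit-style @Ignore annotations in Java sources and GEODE-prefixed JIRA ids; the function name and the one-line-of-context heuristic are illustrative, not an existing tool:

```python
import re
from pathlib import Path

# Illustrative patterns: Geode JIRA ids look like GEODE-NNNN.
TICKET = re.compile(r"GEODE-\d+")
IGNORE = re.compile(r"@Ignore\b")

def audit_ignored_tests(root):
    """Split @Ignore occurrences into (ticketed, unticketed) lists of (file, line_no).

    An occurrence counts as ticketed when a JIRA id appears on the same line
    or on the line directly above it (a crude heuristic for a linked ticket).
    """
    ticketed, unticketed = [], []
    for src in sorted(Path(root).rglob("*.java")):
        lines = src.read_text(encoding="utf-8").splitlines()
        for i, line in enumerate(lines):
            if not IGNORE.search(line):
                continue
            context = lines[max(0, i - 1):i + 1]
            bucket = ticketed if any(TICKET.search(c) for c in context) else unticketed
            bucket.append((str(src), i + 1))
    return ticketed, unticketed
```

Running something like this over the test source roots would yield the short "ticketed" list worth keeping and the "unticketed" list that is a candidate for deletion.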

> On Dec 31, 2019, at 2:53 PM, Bruce Schuchardt <bs...@pivotal.io> wrote:
> 
> I agree with deleting @Ignored tests except for the few that have JIRA tickets open for them. There are fewer than half a dozen of these and we should consider keeping them since we have a way of tracking them.
> 
> On 12/31/19 2:07 PM, Alexander Murmann wrote:
>> Here are a few things that are true for me or I believe are true in general:
>> 
>>    - Our test suite is more flaky than we'd like it to be
>>    - I don't believe that adding more Unit tests that follow existing
>>    patterns buys us that much. I'd rather see something similar to what some
>>    folks are doing with Membership right now where we isolate the code and
>>    test it more systematically
>>    - We have other testing gaps: We have benchmarks 👏🎉, but we are still
>>    lacking coverage in that area; our community is still lacking HA tests. I'd
>>    rather fill those than bring back old DUnit tests that are chosen somewhat
>>    at random.
>>    - I'd rather be deliberate about what tests we introduce than wholesale
>>    bring back a set of tests, since any of these re-introduced tests has the
>>    potential to be flaky. Let's make sure our tests carry their weight.
>>    - If we delete these tests, we can always go back to a SHA from today
>>    and bring them back at a later date
>>    - These tests have been ignored for a very long time and we've shipped
>>    without them and nobody has missed them enough to bring them back.
>> 
>> Given all the above, my vote is for less noise in our code, which means
>> deleting all ignored tests. If we want to keep them, I'd love to hear a
>> plan of action on how we bring them back. Having a bunch of dead code helps
>> nobody.
>> 
>> On Tue, Dec 31, 2019 at 1:50 PM Mark Hanson <mh...@pivotal.io> wrote:
>> 
>>> Hi All,
>>> 
>>> As part of what I am doing to fix flaky tests, I periodically come across
>>> tests that are @Ignore’d. I am curious what we would like to do with them
>>> generally speaking. We could fix them, which would seem obvious, but we are
>>> struggling to fix flaky tests as it is.  We could delete them, but those
>>> tests were written for a reason (I hope). Or we could leave them. That
>>> pollutes searches, etc., as inactive code requiring upkeep at the least.
>>> 
>>> I don’t have an easy answer. Some have suggested deleting them. I tend to
>>> lean that direction, but I thought I would consult the community for a
>>> broader perspective.
>>> 
>>> Thanks,
>>> Mark


Re: [DISCUSS] What should we do with @Ignore tests?

Posted by Mark Hanson <mh...@pivotal.io>.
Hi Naba,

While what you are suggesting sounds reasonable, I think it is a more painful process than leaving them in. I am encountering maybe two of them at a time when addressing a flaky test. If we go for big bulk removals, the research burden makes the work less likely to happen. Just a thought.

Thanks,
Mark

Sent from my iPhone

> On Dec 31, 2019, at 6:31 PM, Nabarun Nag <nn...@apache.org> wrote:
> 
> +1 to Dan's suggestions.
> 
> - Remove in batches.
> - Send review requests for those PRs to relevant committers (authors of
> those tests etc.)
> - A brief explanation of why these tests are being deleted and why there
> is no loss of test coverage (e.g., the behavior is covered by other tests,
> or some other reason).
> 
> Regards
> Nabarun Nag
> 
>> On Tue, Dec 31, 2019 at 5:32 PM Dan Smith <ds...@pivotal.io> wrote:
>> 
>> Some of these tests have been ignored for a long time. However, looking at
>> the history, I see we have ignored some tests as recently as in the last
>> month, for what seem like some questionable reasons.
>> 
>> I'm concerned that this could be a two-step process to losing test coverage
>> - someone who thinks the test is still valuable but intends to fix it later
>> ignores it, and then someone else deletes it.
>> 
>> So what I would suggest is that if we are going to delete them, let's do it
>> in batches and get folks who have context on the code being tested to
>> review the changes. Make sense?
>> 
>> Also +1 to not ignoring any more tests - it would be nice to get down to 0
>> Ignored tests and enforce that!
>> 
>> -Dan
>> 
>> On Tue, Dec 31, 2019 at 4:52 PM Aaron Lindsey <aa...@apache.org>
>> wrote:
>> 
>>> I’m in favor of deleting all except the ones that have JIRA tickets open
>>> for them, like Bruce said.
>>> 
>>> Also going forward I’d like to see us not be checking in @Ignored tests —
>>> just delete them instead. If we need to get it back we have revision
>>> history. Just my two cents.
>>> 
>>> Aaron
>>> 
>>>> On Dec 31, 2019, at 2:53 PM, Bruce Schuchardt <bs...@pivotal.io> wrote:
>>>> 
>>>> I agree with deleting @Ignored tests except for the few that have JIRA
>>>> tickets open for them. There are fewer than half a dozen of these and we
>>>> should consider keeping them since we have a way of tracking them.
>>>> 
>>>> On 12/31/19 2:07 PM, Alexander Murmann wrote:
>>>>> Here are a few things that are true for me or I believe are true in
>>>>> general:
>>>>> 
>>>>>   - Our test suite is more flaky than we'd like it to be
>>>>>   - I don't believe that adding more Unit tests that follow existing
>>>>>   patterns buys us that much. I'd rather see something similar to what
>>>>>   some folks are doing with Membership right now where we isolate the
>>>>>   code and test it more systematically
>>>>>   - We have other testing gaps: We have benchmarks 👏🎉, but we are
>>>>>   still lacking coverage in that area; our community is still lacking
>>>>>   HA tests. I'd rather fill those than bring back old DUnit tests that
>>>>>   are chosen somewhat at random.
>>>>>   - I'd rather be deliberate about what tests we introduce than
>>>>>   wholesale bring back a set of tests, since any of these re-introduced
>>>>>   tests has the potential to be flaky. Let's make sure our tests carry
>>>>>   their weight.
>>>>>   - If we delete these tests, we can always go back to a SHA from today
>>>>>   and bring them back at a later date
>>>>>   - These tests have been ignored for a very long time and we've
>>>>>   shipped without them and nobody has missed them enough to bring them
>>>>>   back.
>>>>> 
>>>>> Given all the above, my vote is for less noise in our code, which means
>>>>> deleting all ignored tests. If we want to keep them, I'd love to hear a
>>>>> plan of action on how we bring them back. Having a bunch of dead code
>>>>> helps nobody.
>>>>> 
>>>>> On Tue, Dec 31, 2019 at 1:50 PM Mark Hanson <mh...@pivotal.io> wrote:
>>>>> 
>>>>>> Hi All,
>>>>>> 
>>>>>> As part of what I am doing to fix flaky tests, I periodically come
>>>>>> across tests that are @Ignore’d. I am curious what we would like to do
>>>>>> with them generally speaking. We could fix them, which would seem
>>>>>> obvious, but we are struggling to fix flaky tests as it is. We could
>>>>>> delete them, but those tests were written for a reason (I hope). Or we
>>>>>> could leave them. That pollutes searches, etc., as inactive code
>>>>>> requiring upkeep at the least.
>>>>>> 
>>>>>> I don’t have an easy answer. Some have suggested deleting them. I tend
>>>>>> to lean that direction, but I thought I would consult the community
>>>>>> for a broader perspective.
>>>>>> 
>>>>>> Thanks,
>>>>>> Mark
>>> 
>>> 
>> 

Re: [DISCUSS] What should we do with @Ignore tests?

Posted by Nabarun Nag <nn...@apache.org>.
+1 to Dan's suggestions.

- Remove in batches.
- Send review requests for those PRs to relevant committers (authors of
those tests etc.)
- A brief explanation of why these tests are being deleted and why there is
no loss of test coverage (e.g., the behavior is covered by other tests, or
some other reason).

Regards
Nabarun Nag

On Tue, Dec 31, 2019 at 5:32 PM Dan Smith <ds...@pivotal.io> wrote:

> Some of these tests have been ignored for a long time. However, looking at
> the history, I see we have ignored some tests as recently as in the last
> month, for what seem like some questionable reasons.
>
> I'm concerned that this could be a two-step process to losing test coverage
> - someone who thinks the test is still valuable but intends to fix it later
> ignores it, and then someone else deletes it.
>
> So what I would suggest is that if we are going to delete them, let's do it
> in batches and get folks who have context on the code being tested to
> review the changes. Make sense?
>
> Also +1 to not ignoring any more tests - it would be nice to get down to 0
> Ignored tests and enforce that!
>
> -Dan
>
> On Tue, Dec 31, 2019 at 4:52 PM Aaron Lindsey <aa...@apache.org>
> wrote:
>
> > I’m in favor of deleting all except the ones that have JIRA tickets open
> > for them, like Bruce said.
> >
> > Also going forward I’d like to see us not be checking in @Ignored tests —
> > just delete them instead. If we need to get it back we have revision
> > history. Just my two cents.
> >
> > Aaron
> >
> > > On Dec 31, 2019, at 2:53 PM, Bruce Schuchardt <bs...@pivotal.io> wrote:
> > >
> > > I agree with deleting @Ignored tests except for the few that have JIRA
> > > tickets open for them. There are fewer than half a dozen of these and we
> > > should consider keeping them since we have a way of tracking them.
> > >
> > > On 12/31/19 2:07 PM, Alexander Murmann wrote:
> > >> Here are a few things that are true for me or I believe are true in
> > >> general:
> > >>
> > >>    - Our test suite is more flaky than we'd like it to be
> > >>    - I don't believe that adding more Unit tests that follow existing
> > >>    patterns buys us that much. I'd rather see something similar to what
> > >>    some folks are doing with Membership right now where we isolate the
> > >>    code and test it more systematically
> > >>    - We have other testing gaps: We have benchmarks 👏🎉, but we are
> > >>    still lacking coverage in that area; our community is still lacking
> > >>    HA tests. I'd rather fill those than bring back old DUnit tests that
> > >>    are chosen somewhat at random.
> > >>    - I'd rather be deliberate about what tests we introduce than
> > >>    wholesale bring back a set of tests, since any of these
> > >>    re-introduced tests has the potential to be flaky. Let's make sure
> > >>    our tests carry their weight.
> > >>    - If we delete these tests, we can always go back to a SHA from
> > >>    today and bring them back at a later date
> > >>    - These tests have been ignored for a very long time and we've
> > >>    shipped without them and nobody has missed them enough to bring
> > >>    them back.
> > >>
> > >> Given all the above, my vote is for less noise in our code, which means
> > >> deleting all ignored tests. If we want to keep them, I'd love to hear a
> > >> plan of action on how we bring them back. Having a bunch of dead code
> > >> helps nobody.
> > >>
> > >> On Tue, Dec 31, 2019 at 1:50 PM Mark Hanson <mh...@pivotal.io> wrote:
> > >>
> > >>> Hi All,
> > >>>
> > >>> As part of what I am doing to fix flaky tests, I periodically come
> > >>> across tests that are @Ignore’d. I am curious what we would like to do
> > >>> with them generally speaking. We could fix them, which would seem
> > >>> obvious, but we are struggling to fix flaky tests as it is. We could
> > >>> delete them, but those tests were written for a reason (I hope). Or we
> > >>> could leave them. That pollutes searches, etc., as inactive code
> > >>> requiring upkeep at the least.
> > >>>
> > >>> I don’t have an easy answer. Some have suggested deleting them. I tend
> > >>> to lean that direction, but I thought I would consult the community
> > >>> for a broader perspective.
> > >>>
> > >>> Thanks,
> > >>> Mark
> >
> >
>

Re: [DISCUSS] What should we do with @Ignore tests?

Posted by Dan Smith <ds...@pivotal.io>.
Some of these tests have been ignored for a long time. However, looking at
the history, I see we have ignored some tests as recently as in the last
month, for what seem like some questionable reasons.

I'm concerned that this could be a two-step process to losing test coverage
- someone who thinks the test is still valuable but intends to fix it later
ignores it, and then someone else deletes it.

So what I would suggest is that if we are going to delete them, let's do it
in batches and get folks who have context on the code being tested to
review the changes. Make sense?

Also +1 to not ignoring any more tests - it would be nice to get down to 0
Ignored tests and enforce that!

-Dan
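Getting to zero ignored tests and keeping it there could be done with a simple source scan wired into CI. A hypothetical sketch (not an existing Geode check); the .java-only scan and the exit-code convention are assumptions:

```python
import re
import sys
from pathlib import Path

def find_ignores(root):
    """List (file, line_no, text) for every @Ignore in .java files under root."""
    hits = []
    for src in sorted(Path(root).rglob("*.java")):
        for n, line in enumerate(src.read_text(encoding="utf-8").splitlines(), 1):
            if re.search(r"@Ignore\b", line):
                hits.append((str(src), n, line.strip()))
    return hits

def main(root="."):
    """Print offenders and return a non-zero exit code if any remain."""
    offenders = find_ignores(root)
    for path, n, text in offenders:
        print(f"{path}:{n}: {text}")
    return 1 if offenders else 0

# Wired into CI, e.g.: sys.exit(main(sys.argv[1] if len(sys.argv) > 1 else "."))
```

A CI job would run this over the test source roots and fail the build on a non-zero return, which is one way to "enforce that".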

On Tue, Dec 31, 2019 at 4:52 PM Aaron Lindsey <aa...@apache.org>
wrote:

> I’m in favor of deleting all except the ones that have JIRA tickets open
> for them, like Bruce said.
>
> Also going forward I’d like to see us not be checking in @Ignored tests —
> just delete them instead. If we need to get it back we have revision
> history. Just my two cents.
>
> Aaron
>
> > On Dec 31, 2019, at 2:53 PM, Bruce Schuchardt <bs...@pivotal.io> wrote:
> >
> > I agree with deleting @Ignored tests except for the few that have JIRA
> > tickets open for them. There are fewer than half a dozen of these and we
> > should consider keeping them since we have a way of tracking them.
> >
> > On 12/31/19 2:07 PM, Alexander Murmann wrote:
> >> Here are a few things that are true for me or I believe are true in
> >> general:
> >>
> >>    - Our test suite is more flaky than we'd like it to be
> >>    - I don't believe that adding more Unit tests that follow existing
> >>    patterns buys us that much. I'd rather see something similar to what
> >>    some folks are doing with Membership right now where we isolate the
> >>    code and test it more systematically
> >>    - We have other testing gaps: We have benchmarks 👏🎉, but we are
> >>    still lacking coverage in that area; our community is still lacking
> >>    HA tests. I'd rather fill those than bring back old DUnit tests that
> >>    are chosen somewhat at random.
> >>    - I'd rather be deliberate about what tests we introduce than
> >>    wholesale bring back a set of tests, since any of these re-introduced
> >>    tests has the potential to be flaky. Let's make sure our tests carry
> >>    their weight.
> >>    - If we delete these tests, we can always go back to a SHA from today
> >>    and bring them back at a later date
> >>    - These tests have been ignored for a very long time and we've
> >>    shipped without them and nobody has missed them enough to bring them
> >>    back.
> >>
> >> Given all the above, my vote is for less noise in our code, which means
> >> deleting all ignored tests. If we want to keep them, I'd love to hear a
> >> plan of action on how we bring them back. Having a bunch of dead code
> >> helps nobody.
> >>
> >> On Tue, Dec 31, 2019 at 1:50 PM Mark Hanson <mh...@pivotal.io> wrote:
> >>
> >>> Hi All,
> >>>
> >>> As part of what I am doing to fix flaky tests, I periodically come
> >>> across tests that are @Ignore’d. I am curious what we would like to do
> >>> with them generally speaking. We could fix them, which would seem
> >>> obvious, but we are struggling to fix flaky tests as it is. We could
> >>> delete them, but those tests were written for a reason (I hope). Or we
> >>> could leave them. That pollutes searches, etc., as inactive code
> >>> requiring upkeep at the least.
> >>>
> >>> I don’t have an easy answer. Some have suggested deleting them. I tend
> >>> to lean that direction, but I thought I would consult the community
> >>> for a broader perspective.
> >>>
> >>> Thanks,
> >>> Mark
>
>