Posted to user@ant.apache.org by Charles Wells <ch...@us.ibm.com> on 2007/02/02 22:16:16 UTC

Enhancing JUnitReport's output

I use Ant to invoke several different test suites, where each test suite
contains many test cases, and each test case contains many test methods.
It's a common complaint here that the HTML report generated by the
<junitreport> task lists all the test method names but not which test case
each method belongs to.  This is a problem because it is common for the
same test method name to exist in more than one test case; we can't easily
tell which test method is the one reporting a failure.  We don't want to
have to incorporate the test case's name in the naming of every test
method; this seems excessive.  Besides, even if we keep all names unique,
it can still be annoying to figure out which test case a particular method
belongs to.  What are the preferred ways that others deal with this
problem?  Is there something simple that I'm overlooking?  The only way I
have found so far is to write a custom formatter that writes both the test
case name and the test method name into the <testcase> tag's "name"
attribute in the XML, rather than just the test method name.  Is this a
reasonable approach?  (I bet this has been asked before, but my searching
turned up nothing, so my apologies if this is a dup.)
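
To give an idea, registering the custom formatter in the build file looks
roughly like this (the class name, classpath reference, and property names
below are stand-ins; the formatter class implements
org.apache.tools.ant.taskdefs.optional.junit.JUnitResultFormatter):

  <junit printsummary="yes">
    <classpath refid="test.classpath"/>
    <!-- stand-in class; it writes "TestCaseName.testMethod" into the
         name attribute instead of just the method name -->
    <formatter classname="com.example.QualifiedNameXMLFormatter"
               extension=".xml"/>
    <test name="AllTests" todir="${report.dir}"/>
  </junit>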

The other issue I have with the format of the HTML is that it groups the
tests by Java package rather than by a name of my choosing.  This is a
problem when I am running the same suite of tests (which would always
exist in the same package) several times, but varying the purpose of the
test run each time.  For example, I may have a suite AllTests which I
first run against DB2, next against Oracle, then against SQL Server, and a
fourth time on Unix.  This produces four XML files, one for each run of
AllTests.  The <junit> task lets me name these XML files so that they have
useful names that indicate the purpose of the test run, e.g.
"TEST-Oracle-AllTests.xml".  Then I use the <junitreport> task to create a
single HTML noframes report from those four XML files.  The report has a
"Packages" section that combines all the test runs into one line.  Next,
it has a Package section in which "AllTests" is repeated several times,
with a summary of each.  What is the best way to get a name of my choosing
in place of "AllTests", such as "Oracle Test Run" or "DB2 Test Run"?  It's
not obvious to me how best to proceed, although I'm guessing it may
require modifying the stylesheet (I don't know XSL... yet).  Does anyone
have a good solution for this?
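
In case it helps, here is a simplified sketch of what one of the runs
looks like in my build file (property, path, and package names changed):

  <junit printsummary="yes">
    <classpath refid="test.classpath"/>
    <formatter type="xml"/>
    <!-- tell the suite which backend this run targets -->
    <sysproperty key="db.target" value="oracle"/>
    <!-- outfile sets the base name of the result file; the formatter
         supplies the .xml extension -->
    <test name="com.example.AllTests" todir="${report.dir}"
          outfile="TEST-Oracle-AllTests"/>
  </junit>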

Thanks!




Re: Enhancing JUnitReport's output

Posted by Steve Loughran <st...@apache.org>.
Charles Wells wrote:
> I use Ant to invoke several different test suites, where each test suite
> contains many test cases, and each test case contains many test methods.
> It's a common complaint here that the HTML report generated by the
> <junitreport> task lists all the test method names but not which test case
> each method belongs to.  This is a problem because it is common for the
> same test method name to exist in more than one test case; we can't easily
> tell which test method is the one reporting a failure.  We don't want to
> have to incorporate the test case's name in the naming of every test
> method; this seems excessive.  Besides, even if we keep all names unique,
> it can still be annoying to figure out which test case a particular method
> belongs to.  What are the preferred ways that others deal with this
> problem?  Is there something simple that I'm overlooking?  The only way I
> have found so far is to write a custom formatter that writes both the test
> case name and the test method name into the <testcase> tag's "name"
> attribute in the XML, rather than just the test method name.  Is this a
> reasonable approach?  (I bet this has been asked before, but my searching
> turned up nothing, so my apologies if this is a dup.)
> 
> The other issue I have with the format of the HTML is that it groups the
> tests by Java package rather than by a name of my choosing.  This is a
> problem when I am running the same suite of tests (which would always
> exist in the same package) several times, but varying the purpose of the
> test run each time.  For example, I may have a suite AllTests which I
> first run against DB2, next against Oracle, then against SQL Server, and a
> fourth time on Unix.  This produces four XML files, one for each run of
> AllTests.  The <junit> task lets me name these XML files so that they have
> useful names that indicate the purpose of the test run, e.g.
> "TEST-Oracle-AllTests.xml".  Then I use the <junitreport> task to create a
> single HTML noframes report from those four XML files.  The report has a
> "Packages" section that combines all the test runs into one line.  Next,
> it has a Package section in which "AllTests" is repeated several times,
> with a summary of each.  What is the best way to get a name of my choosing
> in place of "AllTests", such as "Oracle Test Run" or "DB2 Test Run"?  It's
> not obvious to me how best to proceed, although I'm guessing it may
> require modifying the stylesheet (I don't know XSL... yet).  Does anyone
> have a good solution for this?
> 

JUnit's reports are pretty limited, in that they aggregate runs on a
single host but have no idea of same-test-different-config runs, such as
running the same test suite against four different remote SOAP endpoints,
or with two different database options.

Well, short term, you get to edit the XSL files.  There is something on
SourceForge called gridunit you may want to look at; it's the best
aggregator of results out there, though it uses serialized Java classes
to represent results (a fork of some classes I wrote).
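
As a rough sketch, pointing <junitreport> at your own copy of the
stylesheet looks something like the following; the noframes format picks
up a file called junit-noframes.xsl from whatever directory styledir
names (paths here are placeholders):

  <junitreport todir="${report.dir}">
    <fileset dir="${report.dir}" includes="TEST-*.xml"/>
    <!-- styledir holds a locally modified junit-noframes.xsl -->
    <report format="noframes" styledir="${basedir}/xsl"
            todir="${report.dir}/html"/>
  </junitreport>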

If you look at my recent talk on system testing:
http://smartfrog.org/presentations/distributed_testing_with_smartfrog_slides.pdf
http://smartfrog.org/autolinks/googleLTAC06.htm

you can see some of my future thoughts; I'm doing some work that I'd like
to put into Ant, amongst other places.

Here are some of my requirements (a rough sketch of one such result record
follows the list):
  - ability to stream out results (instead of buffering until completion)
  - retain log info at different levels/sources
  - give tests tag names (interop, unlikely, 'steve's', priority-2, etc.)
  - link tests to defects (i.e. URLs)
  - log skipped tests explicitly (to remind you)
  - allow multiple runs on a single host with different options
  - allow runs on different hosts
  - have some aggregate view of the results that lets you look at how well
    a test went across ten different machines
  - cross-platform chained stack traces (where possible)
  - maybe big binary attachments (videos, vmware images)
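
Purely as a hypothetical sketch (every element and attribute name here is
invented on the spot), a single record in such a format might look like:

  <testresult suite="com.example.AllTests" name="testConnectionTimeout"
              outcome="skipped" host="build-linux-01" config="oracle">
    <tag>interop</tag>
    <tag>priority-2</tag>
    <!-- link the test to a defect by URL -->
    <defect href="http://issues.example.org/browse/PROJ-123"/>
    <!-- log output retained with its level and source -->
    <log level="debug" source="stderr">connection pool exhausted</log>
  </testresult>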

In the talk I showed my first pass, a marked-up XHTML file that could be
read without waiting for an XSL transform, streamed out, etc.  Since then
I've been making my reporting stuff (in SmartFrog) more generic than just
JUnit, and doing sfunit, which is a bit like AntUnit but for system
configs, so I haven't done much on reporting.

One good suggestion from the Google conference was to use Atom as a
result feed, which is interesting.  You could subscribe to test runs on
different systems, use the feed categories to filter results, and so on.
However, Firefox 2's willingness to impose its own style sheet on test
results is actually pretty inconvenient here.  The other option would be
to (somehow) use RDF for the results, and then use a facet browser as a
way of analyzing the results.
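
To make the Atom idea concrete, an entry for one run might look something
like this (all names, IDs and URLs below are invented for illustration):

  <entry xmlns="http://www.w3.org/2005/Atom">
    <title>AllTests against Oracle: 38 passed, 2 failed</title>
    <id>urn:uuid:0f8e6f4e-7d2a-4b53-9c4e-1d2b3a4c5d6e</id>
    <updated>2007-02-02T22:16:16Z</updated>
    <!-- categories let a subscriber filter by configuration or outcome -->
    <category term="oracle"/>
    <category term="failed"/>
    <link rel="alternate"
          href="http://build.example.org/reports/TEST-Oracle-AllTests.html"/>
    <summary>2 failures out of 40 tests on host build-linux-01</summary>
  </entry>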

Anyway, all this is research, but that is my day job after all.  If you
want to get involved, then we could perhaps collaborate. My vision is an 
extensible format that can be used by lots of test tools, though the 
broad spread of outcomes (there's a lot more than just pass and fail out 
there) makes it a bit ambitious.


-steve

