Posted to dev@harmony.apache.org by "Mikhail Loenko (JIRA)" <ji...@apache.org> on 2006/01/17 05:50:44 UTC

[jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

    [ http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910 ] 

Mikhail Loenko commented on HARMONY-31:
---------------------------------------

This is not what I meant.

I was going to create a Logger class at this point like this:

public class Logger {
    public static boolean printAllowed = false;

    public static void log(String message) {
        if (printAllowed) System.out.print(message);
    }

    public static void logln(String message) {
        if (printAllowed) System.out.println(message);
    }

    public static void logError(String message) {
        if (printAllowed) System.err.print(message);
    }

    public static void loglnError(String message) {
        if (printAllowed) System.err.println(message);
    }
}

And replace log() with Logger.log() everywhere in the tests.
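For illustration, a minimal sketch of what a test's output calls would look like after that replacement; Logger is the class proposed above (error methods omitted for brevity), while LoggerDemo and its messages are hypothetical:

```java
// Sketch of the proposed change: tests call the static Logger instead of
// System.out directly, so output is suppressed unless explicitly enabled.
// Logger is the class proposed above (error methods omitted for brevity);
// LoggerDemo and its messages are hypothetical.
class Logger {
    public static boolean printAllowed = false;

    public static void log(String message) {
        if (printAllowed) System.out.print(message);
    }

    public static void logln(String message) {
        if (printAllowed) System.out.println(message);
    }
}

public class LoggerDemo {
    public static void main(String[] args) {
        Logger.logln("== test Foo passed OK ==");  // suppressed: printAllowed is false
        Logger.printAllowed = true;                // e.g. flipped by a test-runner flag
        Logger.logln("== test Bar passed OK ==");  // printed
    }
}
```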

All the remaining functionality in the PerformanceTest is obsolete.


> Move performance timing of unit tests into a decorator class.
> ------------------------------------------------------------
>
>          Key: HARMONY-31
>          URL: http://issues.apache.org/jira/browse/HARMONY-31
>      Project: Harmony
>         Type: Improvement
>     Reporter: George Harley
>     Assignee: Geir Magnusson Jr
>     Priority: Minor
>  Attachments: PerfDecorator.java
>
> There has been some low-level discussion on the dev mailing list recently about the inclusion of performance-related logging code near the top of a unit test class inheritance hierarchy (see com.openintel.drl.security.test.PerformanceTest in the HARMONY-16 contribution). This particular issue suggests an alternative way of adding in timing code but without making it the responsibility of the unit tests themselves and without the need to introduce a class in the inheritance hierarchy. 
> The basic approach is to exploit the junit.extensions.TestDecorator type in the JUnit API to add in timing behaviour before and after each test method runs. This will be demonstrated with some simple sample code. 
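The decorator idea can be sketched against the JUnit 3.x API like this. This is an illustrative sketch, not the attached PerfDecorator.java; the class name and output format are made up:

```java
import junit.extensions.TestDecorator;
import junit.framework.Test;
import junit.framework.TestResult;

// Illustrative timing decorator: wraps any JUnit 3.x Test and reports the
// wall-clock time of each run, without touching the test class hierarchy.
public class TimingDecorator extends TestDecorator {

    public TimingDecorator(Test test) {
        super(test);
    }

    public void run(TestResult result) {
        long start = System.currentTimeMillis();
        basicRun(result);  // delegate to the wrapped test
        long elapsed = System.currentTimeMillis() - start;
        System.out.println(getTest() + " took " + elapsed + " ms");
    }
}
```

A suite would then wrap each test, e.g. `new TimingDecorator(new TestSuite(FooTest.class))`, and anyone not interested in timings simply runs the undecorated suite.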

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators:
   http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see:
   http://www.atlassian.com/software/jira


Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Geir Magnusson Jr <ge...@pobox.com>.
or something useful like log4j?

:)

geir


Tim Ellison wrote:
> Why not use java.util.logging?
> 
> Regards,
> Tim
> 
> Mikhail Loenko (JIRA) wrote:
>>     [ http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910 ] 
>>
>> Mikhail Loenko commented on HARMONY-31:
>> ---------------------------------------
>>
>> This is not what I meant.
>>
>> I was going to create a Logger class at this point like this:
>>
>> public class Logger {
>>     public static boolean printAllowed = false;
>>
>>     public static void log(String message) {
>>         if (printAllowed) System.out.print(message);
>>     }
>>
>>     public static void logln(String message) {
>>         if (printAllowed) System.out.println(message);
>>     }
>>
>>     public static void logError(String message) {
>>         if (printAllowed) System.err.print(message);
>>     }
>>
>>     public static void loglnError(String message) {
>>         if (printAllowed) System.err.println(message);
>>     }
>> }
>>
>> And replace log() with Logger.log() everywhere in the tests.
>>
>> All the remaining functionality in the PerformanceTest is obsolete.
>>
>>
>>> Move peformance timing of unit tests into a decorator class.
>>> ------------------------------------------------------------
>>>
>>>          Key: HARMONY-31
>>>          URL: http://issues.apache.org/jira/browse/HARMONY-31
>>>      Project: Harmony
>>>         Type: Improvement
>>>     Reporter: George Harley
>>>     Assignee: Geir Magnusson Jr
>>>     Priority: Minor
>>>  Attachments: PerfDecorator.java
>>>
>>> There has been some low-level discussion on the dev mailing list recently about the inclusion of performance-related logging code near the top of a unit test class inheritance hierarchy (see com.openintel.drl.security.test.PerformanceTest in the HARMONY-16 contribution). This particular issue suggests an alternative way of adding in timing code but without making it the responsibility of the unit tests themselves and without the need to introduce a class in the inheritance hierarchy. 
>>> The basic approach is to exploit the junit.extensions.TestDecorator type in the JUnit API to add in timing behaviour before and after each test method runs. This will be demonstrated with some simple sample code. 
> 

Re: [classlib] Unit and performance testing (was Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.)

Posted by Mikhail Loenko <ml...@gmail.com>.
Formally, option #2 from my mail that was

> > 2. Remove PerformanceTest. Introduce a simple Logger that does not print by
> > default.

does not mix any performance *infrastructure* with junit testing.

I think we do not have to find the final solution right now; we may see
various ideas as people develop and contribute their tests and as we
investigate more options.
Since the #1 task for now is the integration of security2, any reasonably
good short-term solution would be acceptable.

I have not seen any other solution that is well studied and does not cut
existing functionality.

Thanks,
Mikhail


On 1/20/06, Geir Magnusson Jr <ge...@pobox.com> wrote:
> [I got sick of the thread subject - it blended into every other JIRA
> thread... ]
>
> There is a 4th option - not mix performance infrastructure with unit
> testing.
>
> I'm all for getting "PerformanceTest" out of the class hierarchy, and
> not having unit tests yammer out to console if we can avoid it. (I do
> testing in console, and don't really care about the output, but it will
> skew the performance numbers as console i/o is relatively expensive...)
>
> That said, I do believe in the importance of having performance numbers
> to help detect regressions.
>
> George outlined how to use standard JUnit mechanisms to do this.  IMO,
> they are good because they are the canonical way using JUnit, but they
> also are a bit invasive too.
>
> Some other options :
>
> 1) This problem seems to be one of the three use cases in the universe
> for using aspects (the other two being logging and caching, of
> course...)  So that's one area we might investigate - we would add an
> interceptor for each test/suite/whatever to do the perf that we need to
> be done.   We might be able to use it to turn debug logging on and off
> as well in a cheap and uninvasive way.
>
> 2) TestNG - I do want to give this a hard look, as it's annotations
> based, and see if there's something in there (or coming in there) for
> this.  TestNG will also run JUnit tests as is, so playing with it is
> going to be easy.
>
> geir
>
>
> Mikhail Loenko wrote:
> > To summarize, we have 3 options:
> >
> > 1. Keep PerformanceTest as a super class. Set printAllowed to false by default.
> > 2. Remove PerformnceTest. Introduce a simple Logger that does not print by
> > default.
> > 3. Move performance functionality to Decorator.
> >
> > #1 is the most unliked. #3 as I wrote before does not work.
> >
> > So I can submit a script that goes through the tests replacing
> > "extends PerformanceTest" with "extends TestCase"
> > "import PerformanceTest" with "import Logger"
> > and putting "Logger." before
> > logln() and other log functions
> >
> > Thanks,
> > Mikhail
> >
> >
> > On 1/19/06, Geir Magnusson Jr <ge...@pobox.com> wrote:
> >>
> >> Mikhail Loenko wrote:
> >>> On 1/19/06, Geir Magnusson Jr <ge...@pobox.com> wrote:
> >>>> Mikhail Loenko wrote:
> >>>>> The problem is unstable execution time of java programs:
> >>>>>
> >>>>> If you consequently run the same java program on the same computer
> >>>>> in the same conditions, execution time may vary by 20% or even more
> >>>> Why?  Given that computers are pretty determinstic, I'd argue that you
> >>>> don't have the same conditions from run to run.
> >>> Did you make experiments or it's your theoretical conclusion :) ?
> >> Have done experiments.  I never claim that it's the same conditions
> >> every run.  That's the issue, I think.
> >>
> >> geir
> >>
> >>> Try to create an application that runs 20 seconds and run it several times.
> >>>
> >>> Frankly, I do not exactly know why. But I know a lot of reasons that could
> >>> affect this dispersion. For example, there is a number of serving
> >>> threads and GC that impact on execution time.
> >>> Thanks,
> >>> Mikhail
> >>>
> >>>
> >>>> geir
> >>>>
> >>>
> >
> >
>

[classlib] Unit and performance testing (was Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.)

Posted by Geir Magnusson Jr <ge...@pobox.com>.
[I got sick of the thread subject - it blended into every other JIRA 
thread... ]

There is a 4th option - not mix performance infrastructure with unit 
testing.

I'm all for getting "PerformanceTest" out of the class hierarchy, and 
not having unit tests yammer out to console if we can avoid it. (I do 
testing in console, and don't really care about the output, but it will 
skew the performance numbers as console i/o is relatively expensive...)

That said, I do believe in the importance of having performance numbers 
to help detect regressions.

George outlined how to use standard JUnit mechanisms to do this.  IMO, 
they are good because they are the canonical way using JUnit, but they 
also are a bit invasive too.

Some other options :

1) This problem seems to be one of the three use cases in the universe 
for using aspects (the other two being logging and caching, of 
course...)  So that's one area we might investigate - we would add an 
interceptor for each test/suite/whatever to do the perf that we need to 
be done.   We might be able to use it to turn debug logging on and off 
as well in a cheap and uninvasive way.

2) TestNG - I do want to give this a hard look, as it's annotations 
based, and see if there's something in there (or coming in there) for 
this.  TestNG will also run JUnit tests as is, so playing with it is 
going to be easy.

geir


Mikhail Loenko wrote:
> To summarize, we have 3 options:
> 
> 1. Keep PerformanceTest as a super class. Set printAllowed to false by default.
> 2. Remove PerformanceTest. Introduce a simple Logger that does not print by
> default.
> 3. Move performance functionality to Decorator.
> 
> #1 is the least liked. #3, as I wrote before, does not work.
> 
> So I can submit a script that goes through the tests replacing
> "extends PerformanceTest" with "extends TestCase"
> "import PerformanceTest" with "import Logger"
> and putting "Logger." before
> logln() and other log functions
> 
> Thanks,
> Mikhail
> 
> 
> On 1/19/06, Geir Magnusson Jr <ge...@pobox.com> wrote:
>>
>> Mikhail Loenko wrote:
>>> On 1/19/06, Geir Magnusson Jr <ge...@pobox.com> wrote:
>>>> Mikhail Loenko wrote:
>>>>> The problem is unstable execution time of java programs:
>>>>>
>>>>> If you consequently run the same java program on the same computer
>>>>> in the same conditions, execution time may vary by 20% or even more
>>>> Why?  Given that computers are pretty determinstic, I'd argue that you
>>>> don't have the same conditions from run to run.
>>> Did you make experiments or it's your theoretical conclusion :) ?
>> Have done experiments.  I never claim that it's the same conditions
>> every run.  That's the issue, I think.
>>
>> geir
>>
>>> Try to create an application that runs 20 seconds and run it several times.
>>>
>>> Frankly, I do not exactly know why. But I know a lot of reasons that could
>>> affect this dispersion. For example, there is a number of serving
>>> threads and GC that impact on execution time.
>>> Thanks,
>>> Mikhail
>>>
>>>
>>>> geir
>>>>
>>>
> 
> 

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Mikhail Loenko <ml...@gmail.com>.
To summarize, we have 3 options:

1. Keep PerformanceTest as a super class. Set printAllowed to false by default.
2. Remove PerformanceTest. Introduce a simple Logger that does not print by
default.
3. Move performance functionality to Decorator.

#1 is the least liked. #3, as I wrote before, does not work.

So I can submit a script that goes through the tests, replacing
"extends PerformanceTest" with "extends TestCase" and
"import PerformanceTest" with "import Logger",
and putting "Logger." before
logln() and the other log functions.
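Such a script could be sketched as a plain string rewrite, for example (naive substitutions for illustration only; the Logger import path is hypothetical, and a real script would also walk the test directories and handle word boundaries more carefully):

```java
// Sketch of the source rewrite the script would perform. Operates on a
// string for illustration; a real script would read and rewrite each test
// file. The substitutions are deliberately naive.
public class TestRewriter {

    public static String rewrite(String source) {
        return source
            .replace("extends PerformanceTest", "extends TestCase")
            .replace("import com.openintel.drl.security.test.PerformanceTest;",
                     "import Logger;")  // hypothetical Logger location
            // Prefix bare log calls with "Logger." unless already qualified.
            .replaceAll("(?<![\\w.])(logln|log|logError|loglnError)\\(", "Logger.$1(");
    }

    public static void main(String[] args) {
        String before =
            "public class FooTest extends PerformanceTest {\n" +
            "    public void testFoo() { logln(\"ok\"); }\n" +
            "}\n";
        System.out.println(rewrite(before));
    }
}
```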

Thanks,
Mikhail


On 1/19/06, Geir Magnusson Jr <ge...@pobox.com> wrote:
>
>
> Mikhail Loenko wrote:
> > On 1/19/06, Geir Magnusson Jr <ge...@pobox.com> wrote:
> >>
> >> Mikhail Loenko wrote:
> >>> The problem is unstable execution time of java programs:
> >>>
> >>> If you consequently run the same java program on the same computer
> >>> in the same conditions, execution time may vary by 20% or even more
> >> Why?  Given that computers are pretty determinstic, I'd argue that you
> >> don't have the same conditions from run to run.
> >
> > Did you make experiments or it's your theoretical conclusion :) ?
>
> Have done experiments.  I never claim that it's the same conditions
> every run.  That's the issue, I think.
>
> geir
>
> > Try to create an application that runs 20 seconds and run it several times.
> >
> > Frankly, I do not exactly know why. But I know a lot of reasons that could
> > affect this dispersion. For example, there is a number of serving
> > threads and GC that impact on execution time.
>
> >
> > Thanks,
> > Mikhail
> >
> >
> >> geir
> >>
> >
> >
>

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Geir Magnusson Jr <ge...@pobox.com>.

Mikhail Loenko wrote:
> On 1/19/06, Geir Magnusson Jr <ge...@pobox.com> wrote:
>>
>> Mikhail Loenko wrote:
>>> The problem is unstable execution time of java programs:
>>>
>>> If you consequently run the same java program on the same computer
>>> in the same conditions, execution time may vary by 20% or even more
>> Why?  Given that computers are pretty determinstic, I'd argue that you
>> don't have the same conditions from run to run.
> 
> Did you make experiments or it's your theoretical conclusion :) ?

I have done experiments.  I never claimed that the conditions are the same 
every run.  That's the issue, I think.

geir

> Try to create an application that runs 20 seconds and run it several times.
> 
> Frankly, I do not know exactly why. But I know a lot of factors that could
> affect this dispersion. For example, there are a number of service
> threads, and GC, that impact execution time.

> 
> Thanks,
> Mikhail
> 
> 
>> geir
>>
> 
> 

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Mikhail Loenko <ml...@gmail.com>.
On 1/19/06, Geir Magnusson Jr <ge...@pobox.com> wrote:
>
>
> Mikhail Loenko wrote:
> > The problem is unstable execution time of java programs:
> >
> > If you consequently run the same java program on the same computer
> > in the same conditions, execution time may vary by 20% or even more
>
> Why?  Given that computers are pretty deterministic, I'd argue that you
> don't have the same conditions from run to run.

Did you run experiments, or is this your theoretical conclusion :) ?
Try to create an application that runs for 20 seconds and run it several times.

Frankly, I do not know exactly why. But I know a lot of factors that could
affect this dispersion. For example, there are a number of service
threads, and GC, that impact execution time.

Thanks,
Mikhail


>
> geir
>

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Geir Magnusson Jr <ge...@pobox.com>.

Mikhail Loenko wrote:
> The problem is unstable execution time of java programs:
> 
> If you consecutively run the same java program on the same computer
> in the same conditions, execution time may vary by 20% or even more

Why?  Given that computers are pretty deterministic, I'd argue that you 
don't have the same conditions from run to run.

geir

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Mikhail Loenko <ml...@gmail.com>.
The problem is the unstable execution time of java programs:

If you consecutively run the same java program on the same computer
in the same conditions, execution time may vary by 20% or even more.

If you have Config1 such that a loop invoking some security function
consecutively runs in
   10sec, 11sec, 9sec
and Config2 that executes the same loop in
   20sec, 19sec, 23sec
then you may conclude that Config1 is faster.

If you have to call via reflection, which might take 100x more than execution
of the test method itself, then you'll see times for Config1:
   1010sec, 912sec, 1183sec
and for Config2:
   1020sec, 835sec, 1190sec

Which one is faster? And if you want to compare performance of just a single
method, and your configs imply usage of different reflection
implementations, then
those "performance results" would be completely useless - you will always compare
performance of reflection rather than performance of the functionality of interest.
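The run-to-run noise is easy to observe with plain timing code; here is a minimal stdlib-only sketch (the loop body and counts are arbitrary choices, not from the thread):

```java
// Times the same fixed workload several times in one process and prints
// each measurement; the spread between runs illustrates the dispersion
// described above. Workload size and run count are arbitrary.
public class TimingNoise {

    // Busy work whose result is used, so the JIT cannot remove the loop.
    static long work() {
        long sum = 0;
        for (int i = 0; i < 5_000_000; i++) {
            sum += i % 7;
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int run = 1; run <= 5; run++) {
            long start = System.nanoTime();
            long result = work();
            long micros = (System.nanoTime() - start) / 1_000;
            System.out.println("run " + run + ": " + micros + " us (result " + result + ")");
        }
    }
}
```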


On 1/19/06, George Harley1 <GH...@uk.ibm.com> wrote:
> Hi Mikhail,
>
>
> >  When we run the tests (those ones that we call unit tests :) in
> performance mode
> > we start a timer, then run the same testcase in a loop millions of
> > times (and we do NOT call them via reflection) then we stop a timer
>
> Here is a flavour of what works for a lot of people...
>
> Use the JUnit type RepeatedTest (see [1]) to wrap any concrete JUnit Test
> (e.g. a TestSuite, a TestCase etc) so that the test (or tests if it is a
> TestSuite) get run as many million times as you want. Then, take the very
> same RepeatedTest object and wrap that inside a JUnitPerf TimedTest (see
> [2]), also passing in a maximum elapsed time that the RepeatedTest has to
> run in. If the RepeatedTest (which is running the original test case or
> suite millions of times for you) doesn't complete in the specified maximum
> time then the test is a failure.
>
>
> > As I understand your point, you are going to measure a single call of
> > a test method and that call is over reflection, or multiple calls but
> > each of them over reflection.
>
> No, that was not my point. My point is that, at the unit test level, it is
> possible to carry out performance timings using *established* JUnit
> techniques that are *dynamic* in nature and do *not* require adding in a
> superclass into the inheritance hierarchy meaning that every child class
> "is a" performance test spouting out lots of "== test Foo passed OK =="
> messages. What happens if you want to carry out some stress testing with
> the unit tests ? Does it mean the addition of a new superclass at the top
> of the hierarchy so that every unit test class "is a" stress test as well

As you may have noticed, I suggested removing the super class and adding a simple,
predictable Logger class whose execution time would be minor compared
to the test case's time.

If other modes require other functionality, that can be considered as well.

Thanks,
Mikhail




> ? And so on for interop tests ... etc .. etc ... ?
>
>
> > Just wanted to say that reflection part of the call might take 99% of
> method
> > execution and results would not be reliable. Please correct me if I
> wrong.
>
> You could adjust your target timings to take into account the effects of
> the test framework. What would be the problem in doing that ? If you are
> looking for the introduction of performance problems then isn't it the
> relative timings (e.g. "before versus after the introduction of new code"
> or "test on JRE1 versus identical test on JRE2") that matter ?
>
>
> Best regards,
> George
>
> [1]
> http://www.junit.org/junit/javadoc/3.8.1/junit/extensions/RepeatedTest.html
> [2] http://clarkware.com/software/JUnitPerf.html#howtouse
> ________________________________________
> George C. Harley
>
>
>
>
>
> Mikhail Loenko <ml...@gmail.com>
> 18/01/2006 17:06
> Please respond to
> harmony-dev@incubator.apache.org
>
>
> To
> harmony-dev@incubator.apache.org
> cc
>
> Subject
> Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests
> into a decorator class.
>
>
>
>
>
>
> Hi George
>
> When we run the tests (those ones that we call unit tests :) in
> performance mode
> we start a timer, then run the same testcase in a loop millions of
> times (and we do
> NOT call them via reflection) then we stop a timer
>
> As I understand your point, you are going to measure a single call of
> a test method and that call is over reflection, or multiple calls but
> each of them over reflection.
> Just wanted to say that the reflection part of the call might take 99% of
> the method
> execution time, and the results would not be reliable. Please correct me if I am wrong.
>
> Thanks,
> Mikhail
>
> On 1/18/06, George Harley1 <GH...@uk.ibm.com> wrote:
> > Hi Mikhail,
> >
> > > The messages are important to analyze failures also.
> >
> > What is JUnit's own failure reporting mechanism not providing to you ?
> >
> >
> > > And the possibility to test perfromance is also important
> >
> > Yes it is. But - to return to the original point of the HARMONY-31 JIRA
> > issue - this can be done without the need to either bring in a
> superclass
> > in the test hierarchy and/or scatter logging calls around the test code.
> > Performance testing using JUnit should be done in a transparent manner
> > using a decorator. As an example of what I mean please take a look at
> the
> > JUnitPerf site at http://clarkware.com/software/JUnitPerf.html .
> JUnitPerf
> > is an extension to JUnit that helps with the creation of performance
> tests
> > that match existing unit tests. It does this using a decorator design.
> > This provides for a separation of concerns that benefits developers and
> > performance engineers.
> >
> > The decorator approach means that we effectively *extend* JUnit with
> > simple wrapper classes in which we can make full use of the JUnit API to
> > give us the additional behaviour needed when running the tests (maybe
> this
> > could be used to give whatever extra failure analysis data you say is
> > lacking ?). And if somebody doesn't want the extension behaviour ? They
> > just run the suite without the custom wrappers.
> >
> >
> > > When we have a powerful performance suite we may revisit this.
> >
> > Don't you mean when we have a powerful unit test suite ?
> >
> > Take care,
> > George
> > ________________________________________
> > George C. Harley
> >
> >
> >
> >
> >
> > Mikhail Loenko <ml...@gmail.com>
> > 18/01/2006 12:57
> > Please respond to
> > harmony-dev@incubator.apache.org
> >
> >
> > To
> > harmony-dev@incubator.apache.org
> > cc
> >
> > Subject
> > Re: [jira] Commented: (HARMONY-31) Move peformance timing of unit tests
> > into a decorator class.
> >
> >
> >
> >
> >
> >
> > Well, those messages were important for developers when
> > they were writing code and tests. Then tests came to repository 'as is'.
> >
> > The messages are important to analyze failures also.
> > And the possibility to test perfromance is also important
> >
> > For me any option that does not break functionality in favor of beauty
> > looks
> > good. I suggest swithcing off logging by default and ether keep
> > PerformanceTest
> > super class as is or replace it with a simple Logger class.
> >
> > When we have a powerful performance suite we may revisit this.
> >
> > Thanks,
> > Mikhail.
> >
> > On 1/18/06, Tim Ellison <t....@gmail.com> wrote:
> > > Absolutely right -- writing meaningful performance tests is hard.
> > > Implementing your own Logger would not solve the problem though<g>.
> > >
> > > Best to avoid the 'This test worked OK' log messages altogether, and
> > > stick to assertions.
> > >
> > > Regards,
> > > Tim
> > >
> > > Mikhail Loenko wrote:
> > > > It might be a problem...
> > > >
> > > > When we use java.util.logging we do not just compare performance of
> > security
> > > > API functions, the result is also depends on difference in
> performance
> > of
> > > > java.util.logging in standard classes vs. Harmony classes. So if we
> > use
> > > > non-trivial functionality from there then our results will be
> spoiled
> > a little.
> > > >
> > > > Will investigate more...
> > > >
> > > > Thanks,
> > > > Mikhail.
> > > >
> > > > On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> > > >> neither is the Logger class -- so my point is if you are going to
> > write
> > > >> some logging code why not do it in java.util.logging?  You may
> choose
> > to
> > > >> only do simple stubs for now until somebody steps up to do a real
> > impl.
> > > >>
> > > >> Regards,
> > > >> Tim
> > > >>
> > > >> Mikhail Loenko wrote:
> > > >>> It's not yet implemented.
> > > >>>
> > > >>> thanks,
> > > >>> Mikhail
> > > >>>
> > > >>> On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> > > >>>> Why not use java.util.logging?
> > > >>>>
> > > >>>> Regards,
> > > >>>> Tim
> > > >>>>
> > > >>>> Mikhail Loenko (JIRA) wrote:
> > > >>>>>     [
> >
> http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910
>
> > ]
> > > >>>>>
> > > >>>>> Mikhail Loenko commented on HARMONY-31:
> > > >>>>> ---------------------------------------
> > > >>>>>
> > > >>>>> This is not what I meant.
> > > >>>>>
> > > >>>>> I was going to create a Logger class at this point like this:
> > > >>>>>
> > > >>>>> public class Logger {
> > > >>>>>         public static boolean printAllowed = false;
> > > >>>>>       public static void log(String message) {
> > > >>>>>               if (printAllowed) System.out.print(message);
> > > >>>>>       }
> > > >>>>>       public static void logln(String message) {
> > > >>>>>               if (printAllowed) System.out.println(message);
> > > >>>>>       }
> > > >>>>>       public static void logError(String message) {
> > > >>>>>               if (printAllowed) System.err.print(message);
> > > >>>>>       }
> > > >>>>>       public static void loglnError(String message) {
> > > >>>>>               if (printAllowed) System.err.println(message);
> > > >>>>>       }
> > > >>>>> }
> > > >>>>>
> > > >>>>> And replace log() with Logger.log() everywhere in the tests.
> > > >>>>>
> > > >>>>> All the remaining functionality in the PerformanceTest is
> > obsolete.
> > > >>>>>
> > > >>>>>
> > > >>>>>> Move peformance timing of unit tests into a decorator class.
> > > >>>>>> ------------------------------------------------------------
> > > >>>>>>
> > > >>>>>>          Key: HARMONY-31
> > > >>>>>>          URL: http://issues.apache.org/jira/browse/HARMONY-31
> > > >>>>>>      Project: Harmony
> > > >>>>>>         Type: Improvement
> > > >>>>>>     Reporter: George Harley
> > > >>>>>>     Assignee: Geir Magnusson Jr
> > > >>>>>>     Priority: Minor
> > > >>>>>>  Attachments: PerfDecorator.java
> > > >>>>>>
> > > >>>>>> There has been some low-level discussion on the dev mailing
> list
> > recently about the inclusion of performance-related logging code near
> the
> > top of a unit test class inheritance hierarchy (see
> > com.openintel.drl.security.test.PerformanceTest in the HARMONY-16
> > contribution). This particular issue suggests an alternative way of
> adding
> > in timing code but without making it the responsibility of the unit
> tests
> > themselves and without the need to introduce a class in the inheritance
> > hierarchy.
> > > >>>>>> The basic approach is to exploit the
> > junit.extensions.TestDecorator type in the JUnit API to add in timing
> > behaviour before and after each test method runs. This will be
> > demonstrated with some simple sample code.
> > > >>>> --
> > > >>>>
> > > >>>> Tim Ellison (t.p.ellison@gmail.com)
> > > >>>> IBM Java technology centre, UK.
> > > >>>>
> > > >> --
> > > >>
> > > >> Tim Ellison (t.p.ellison@gmail.com)
> > > >> IBM Java technology centre, UK.
> > > >>
> > > >
> > >
> > > --
> > >
> > > Tim Ellison (t.p.ellison@gmail.com)
> > > IBM Java technology centre, UK.
> > >
> >
> >
> >
>
>
>

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by George Harley1 <GH...@uk.ibm.com>.
Hi Mikhail, 


>  When we run the tests (those ones that we call unit tests :) in 
> performance mode
> we start a timer, then run the same testcase in a loop millions of
> times (and we do NOT call them via reflection) then we stop a timer

Here is a flavour of what works for a lot of people...
 
Use the JUnit type RepeatedTest (see [1]) to wrap any concrete JUnit Test 
(e.g. a TestSuite, a TestCase etc) so that the test (or tests if it is a 
TestSuite) get run as many millions of times as you want. Then, take the very 
same RepeatedTest object and wrap that inside a JUnitPerf TimedTest (see 
[2]), also passing in a maximum elapsed time that the RepeatedTest has to 
run in. If the RepeatedTest (which is running the original test case or 
suite millions of times for you) doesn't complete in the specified maximum 
time then the test is a failure. 
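Sketched in code, that composition looks like this; it assumes JUnit 3.x plus the JUnitPerf library, and NoopTest, the iteration count, and the time limit are placeholder values:

```java
import junit.extensions.RepeatedTest;
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;
import com.clarkware.junitperf.TimedTest;

// Sketch of the RepeatedTest + TimedTest composition described above.
// NoopTest stands in for a real unit test; the repeat count and the
// maximum elapsed time are placeholder numbers.
public class PerfSuite {

    public static class NoopTest extends TestCase {
        public NoopTest(String name) { super(name); }
        public void testNothing() { assertTrue(true); }
    }

    public static Test suite() {
        Test once = new TestSuite(NoopTest.class);
        // Run the whole suite a million times...
        Test repeated = new RepeatedTest(once, 1000000);
        // ...and fail if the total elapsed time exceeds 60 seconds.
        return new TimedTest(repeated, 60000);
    }
}
```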


> As I understand your point, you are going to measure a single call of
> a test method and that call is over reflection, or multiple calls but
> each of them over reflection.

No, that was not my point. My point is that, at the unit test level, it is 
possible to carry out performance timings using *established* JUnit 
techniques that are *dynamic* in nature and do *not* require adding in a 
superclass into the inheritance hierarchy meaning that every child class 
"is a" performance test spouting out lots of "== test Foo passed OK ==" 
messages. What happens if you want to carry out some stress testing with 
the unit tests ? Does it mean the addition of a new superclass at the top 
of the hierarchy so that every unit test class "is a" stress test as well 
? And so on for interop tests ... etc .. etc ... ?


> Just wanted to say that the reflection part of the call might take 99% of
> the method
> execution time, and the results would not be reliable. Please correct me if I
> am wrong.

You could adjust your target timings to take into account the effects of 
the test framework. What would be the problem in doing that ? If you are 
looking for the introduction of performance problems then isn't it the 
relative timings (e.g. "before versus after the introduction of new code" 
or "test on JRE1 versus identical test on JRE2") that matter ? 


Best regards, 
George

[1] 
http://www.junit.org/junit/javadoc/3.8.1/junit/extensions/RepeatedTest.html
[2] http://clarkware.com/software/JUnitPerf.html#howtouse
________________________________________
George C. Harley





Mikhail Loenko <ml...@gmail.com> 
18/01/2006 17:06
Please respond to
harmony-dev@incubator.apache.org


To
harmony-dev@incubator.apache.org
cc

Subject
Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests 
into a decorator class.






Hi George

When we run the tests (those ones that we call unit tests :) in 
performance mode,
we start a timer, then run the same testcase in a loop millions of
times (and we do
NOT call them via reflection), then we stop a timer.

As I understand your point, you are going to measure a single call of
a test method and that call is over reflection, or multiple calls but
each of them over reflection.
Just wanted to say that the reflection part of the call might take 99% of 
the method
execution time, and the results would not be reliable. Please correct me if I am wrong.

Thanks,
Mikhail

On 1/18/06, George Harley1 <GH...@uk.ibm.com> wrote:
> Hi Mikhail,
>
> > The messages are important to analyze failures also.
>
> What is JUnit's own failure reporting mechanism not providing to you ?
>
>
> > And the possibility to test perfromance is also important
>
> Yes it is. But - to return to the original point of the HARMONY-31 JIRA
> issue - this can be done without the need to either bring in a superclass
> in the test hierarchy and/or scatter logging calls around the test code.
> Performance testing using JUnit should be done in a transparent manner
> using a decorator. As an example of what I mean please take a look at the
> JUnitPerf site at http://clarkware.com/software/JUnitPerf.html . JUnitPerf
> is an extension to JUnit that helps with the creation of performance tests
> that match existing unit tests. It does this using a decorator design.
> This provides for a separation of concerns that benefits developers and
> performance engineers.
>
> The decorator approach means that we effectively *extend* JUnit with
> simple wrapper classes in which we can make full use of the JUnit API to
> give us the additional behaviour needed when running the tests (maybe this
> could be used to give whatever extra failure analysis data you say is
> lacking ?). And if somebody doesn't want the extension behaviour ? They
> just run the suite without the custom wrappers.
>
>
> > When we have a powerful performance suite we may revisit this.
>
> Don't you mean when we have a powerful unit test suite ?
>
> Take care,
> George
> ________________________________________
> George C. Harley
>
>
>
>
>
> Mikhail Loenko <ml...@gmail.com>
> 18/01/2006 12:57
> Please respond to
> harmony-dev@incubator.apache.org
>
>
> To
> harmony-dev@incubator.apache.org
> cc
>
> Subject
> Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests
> into a decorator class.
>
>
>
>
>
>
> Well, those messages were important for developers when
> they were writing code and tests. Then the tests came to the repository 'as is'.
>
> The messages are important to analyze failures also.
> And the possibility to test performance is also important
>
> For me any option that does not break functionality in favor of beauty
> looks good. I suggest switching off logging by default and either keep the
> PerformanceTest super class as is or replace it with a simple Logger class.
>
> When we have a powerful performance suite we may revisit this.
>
> Thanks,
> Mikhail.
>
> On 1/18/06, Tim Ellison <t....@gmail.com> wrote:
> > Absolutely right -- writing meaningful performance tests is hard.
> > Implementing your own Logger would not solve the problem though<g>.
> >
> > Best to avoid the 'This test worked OK' log messages altogether, and
> > stick to assertions.
> >
> > Regards,
> > Tim
> >
> > Mikhail Loenko wrote:
> > > It might be a problem...
> > >
> > > When we use java.util.logging we do not just compare performance of
> > > security API functions; the result also depends on the difference in
> > > performance of java.util.logging in standard classes vs. Harmony
> > > classes. So if we use non-trivial functionality from there then our
> > > results will be spoiled a little.
> > >
> > > Will investigate more...
> > >
> > > Thanks,
> > > Mikhail.
> > >
> > > On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> > >> neither is the Logger class -- so my point is if you are going to
> > >> write some logging code why not do it in java.util.logging?  You may
> > >> choose to only do simple stubs for now until somebody steps up to do
> > >> a real impl.
> > >>
> > >> Regards,
> > >> Tim
> > >>
> > >> Mikhail Loenko wrote:
> > >>> It's not yet implemented.
> > >>>
> > >>> thanks,
> > >>> Mikhail
> > >>>
> > >>> On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> > >>>> Why not use java.util.logging?
> > >>>>
> > >>>> Regards,
> > >>>> Tim
> > >>>>
> > >>>> Mikhail Loenko (JIRA) wrote:
> > >>>>>     [
> 
http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910

> ]
> > >>>>>
> > >>>>> Mikhail Loenko commented on HARMONY-31:
> > >>>>> ---------------------------------------
> > >>>>>
> > >>>>> This is not what I meant.
> > >>>>>
> > >>>>> I was going to create a Logger class at this point like this:
> > >>>>>
> > >>>>> public class Logger {
> > >>>>>         public static boolean printAllowed = false;
> > >>>>>       public static void log(String message) {
> > >>>>>               if (printAllowed) System.out.print(message);
> > >>>>>       }
> > >>>>>       public static void logln(String message) {
> > >>>>>               if (printAllowed) System.out.println(message);
> > >>>>>       }
> > >>>>>       public static void logError(String message) {
> > >>>>>               if (printAllowed) System.err.print(message);
> > >>>>>       }
> > >>>>>       public static void loglnError(String message) {
> > >>>>>               if (printAllowed) System.err.println(message);
> > >>>>>       }
> > >>>>> }
> > >>>>>
> > >>>>> And replace log() with Logger.log() everywhere in the tests.
> > >>>>>
> > >>>>> All the remaining functionality in the PerformanceTest is
> obsolete.
> > >>>>>
> > >>>>>
> > >>>>>> Move performance timing of unit tests into a decorator class.
> > >>>>>> ------------------------------------------------------------
> > >>>>>>
> > >>>>>>          Key: HARMONY-31
> > >>>>>>          URL: http://issues.apache.org/jira/browse/HARMONY-31
> > >>>>>>      Project: Harmony
> > >>>>>>         Type: Improvement
> > >>>>>>     Reporter: George Harley
> > >>>>>>     Assignee: Geir Magnusson Jr
> > >>>>>>     Priority: Minor
> > >>>>>>  Attachments: PerfDecorator.java
> > >>>>>>
> > >>>>>> There has been some low-level discussion on the dev mailing 
list
> recently about the inclusion of performance-related logging code near 
the
> top of a unit test class inheritance hierarchy (see
> com.openintel.drl.security.test.PerformanceTest in the HARMONY-16
> contribution). This particular issue suggests an alternative way of 
adding
> in timing code but without making it the responsibility of the unit 
tests
> themselves and without the need to introduce a class in the inheritance
> hierarchy.
> > >>>>>> The basic approach is to exploit the
> junit.extensions.TestDecorator type in the JUnit API to add in timing
> behaviour before and after each test method runs. This will be
> demonstrated with some simple sample code.
> > >>>> --
> > >>>>
> > >>>> Tim Ellison (t.p.ellison@gmail.com)
> > >>>> IBM Java technology centre, UK.
> > >>>>
> > >> --
> > >>
> > >> Tim Ellison (t.p.ellison@gmail.com)
> > >> IBM Java technology centre, UK.
> > >>
> > >
> >
> > --
> >
> > Tim Ellison (t.p.ellison@gmail.com)
> > IBM Java technology centre, UK.
> >
>
>
>



Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Mikhail Loenko <ml...@gmail.com>.
Hi George

When we run the tests (the ones that we call unit tests :) in performance
mode, we start a timer, then run the same test case in a loop millions of
times (and we do NOT call them via reflection), then we stop the timer.
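A rough illustrative sketch of that style of measurement (names are invented here, this is not the actual Harmony harness):

```java
// Illustrative sketch of loop-based timing: the test case is invoked
// directly, so the timer brackets the code under test, not reflection.
public class TimingLoop {
    public static long timeMillis(Runnable testCase, int iterations) {
        long start = System.currentTimeMillis();
        for (int i = 0; i < iterations; i++) {
            testCase.run(); // direct call, no reflection overhead
        }
        return System.currentTimeMillis() - start;
    }
}
```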

As I understand your point, you are going to measure a single call of a
test method, with that call made via reflection, or multiple calls, each
of them via reflection.
Just wanted to say that the reflection part of the call might take 99% of
the method execution, and the results would not be reliable. Please correct
me if I am wrong.

Thanks,
Mikhail

On 1/18/06, George Harley1 <GH...@uk.ibm.com> wrote:
> Hi Mikhail,
>
> > The messages are important to analyze failures also.
>
> What is JUnit's own failure reporting mechanism not providing to you ?
>
>
> > And the possibility to test performance is also important
>
> Yes it is. But - to return to the original point of the HARMONY-31 JIRA
> issue - this can be done without the need to either bring in a superclass
> in the test hierarchy and/or scatter logging calls around the test code.
> Performance testing using JUnit should be done in a transparent manner
> using a decorator. As an example of what I mean please take a look at the
> JUnitPerf site at http://clarkware.com/software/JUnitPerf.html . JUnitPerf
> is an extension to JUnit that helps with the creation of performance tests
> that match existing unit tests. It does this using a decorator design.
> This provides for a separation of concerns that benefits developers and
> performance engineers.
>
> The decorator approach means that we effectively *extend* JUnit with
> simple wrapper classes in which we can make full use of the JUnit API to
> give us the additional behaviour needed when running the tests (maybe this
> could be used to give whatever extra failure analysis data you say is
> lacking ?). And if somebody doesn't want the extension behaviour ? They
> just run the suite without the custom wrappers.
>
>
> > When we have a powerful performance suite we may revisit this.
>
> Don't you mean when we have a powerful unit test suite ?
>
> Take care,
> George
> ________________________________________
> George C. Harley
>
>
>
>
>
> Mikhail Loenko <ml...@gmail.com>
> 18/01/2006 12:57
> Please respond to
> harmony-dev@incubator.apache.org
>
>
> To
> harmony-dev@incubator.apache.org
> cc
>
> Subject
> Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests
> into a decorator class.
>
>
>
>
>
>
> Well, those messages were important for developers when
> they were writing code and tests. Then the tests came to the repository 'as is'.
>
> The messages are important to analyze failures also.
> And the possibility to test performance is also important
>
> For me any option that does not break functionality in favor of beauty
> looks good. I suggest switching off logging by default and either keep the
> PerformanceTest super class as is or replace it with a simple Logger class.
>
> When we have a powerful performance suite we may revisit this.
>
> Thanks,
> Mikhail.
>
> On 1/18/06, Tim Ellison <t....@gmail.com> wrote:
> > Absolutely right -- writing meaningful performance tests is hard.
> > Implementing your own Logger would not solve the problem though<g>.
> >
> > Best to avoid the 'This test worked OK' log messages altogether, and
> > stick to assertions.
> >
> > Regards,
> > Tim
> >
> > Mikhail Loenko wrote:
> > > It might be a problem...
> > >
> > > When we use java.util.logging we do not just compare performance of
> > > security API functions; the result also depends on the difference in
> > > performance of java.util.logging in standard classes vs. Harmony
> > > classes. So if we use non-trivial functionality from there then our
> > > results will be spoiled a little.
> > >
> > > Will investigate more...
> > >
> > > Thanks,
> > > Mikhail.
> > >
> > > On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> > >> neither is the Logger class -- so my point is if you are going to
> > >> write some logging code why not do it in java.util.logging?  You may
> > >> choose to only do simple stubs for now until somebody steps up to do
> > >> a real impl.
> > >>
> > >> Regards,
> > >> Tim
> > >>
> > >> Mikhail Loenko wrote:
> > >>> It's not yet implemented.
> > >>>
> > >>> thanks,
> > >>> Mikhail
> > >>>
> > >>> On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> > >>>> Why not use java.util.logging?
> > >>>>
> > >>>> Regards,
> > >>>> Tim
> > >>>>
> > >>>> Mikhail Loenko (JIRA) wrote:
> > >>>>>     [
> http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910
> ]
> > >>>>>
> > >>>>> Mikhail Loenko commented on HARMONY-31:
> > >>>>> ---------------------------------------
> > >>>>>
> > >>>>> This is not what I meant.
> > >>>>>
> > >>>>> I was going to create a Logger class at this point like this:
> > >>>>>
> > >>>>> public class Logger {
> > >>>>>         public static boolean printAllowed = false;
> > >>>>>       public static void log(String message) {
> > >>>>>               if (printAllowed) System.out.print(message);
> > >>>>>       }
> > >>>>>       public static void logln(String message) {
> > >>>>>               if (printAllowed) System.out.println(message);
> > >>>>>       }
> > >>>>>       public static void logError(String message) {
> > >>>>>               if (printAllowed) System.err.print(message);
> > >>>>>       }
> > >>>>>       public static void loglnError(String message) {
> > >>>>>               if (printAllowed) System.err.println(message);
> > >>>>>       }
> > >>>>> }
> > >>>>>
> > >>>>> And replace log() with Logger.log() everywhere in the tests.
> > >>>>>
> > >>>>> All the remaining functionality in the PerformanceTest is
> obsolete.
> > >>>>>
> > >>>>>
> > >>>>>> Move performance timing of unit tests into a decorator class.
> > >>>>>> ------------------------------------------------------------
> > >>>>>>
> > >>>>>>          Key: HARMONY-31
> > >>>>>>          URL: http://issues.apache.org/jira/browse/HARMONY-31
> > >>>>>>      Project: Harmony
> > >>>>>>         Type: Improvement
> > >>>>>>     Reporter: George Harley
> > >>>>>>     Assignee: Geir Magnusson Jr
> > >>>>>>     Priority: Minor
> > >>>>>>  Attachments: PerfDecorator.java
> > >>>>>>
> > >>>>>> There has been some low-level discussion on the dev mailing list
> recently about the inclusion of performance-related logging code near the
> top of a unit test class inheritance hierarchy (see
> com.openintel.drl.security.test.PerformanceTest in the HARMONY-16
> contribution). This particular issue suggests an alternative way of adding
> in timing code but without making it the responsibility of the unit tests
> themselves and without the need to introduce a class in the inheritance
> hierarchy.
> > >>>>>> The basic approach is to exploit the
> junit.extensions.TestDecorator type in the JUnit API to add in timing
> behaviour before and after each test method runs. This will be
> demonstrated with some simple sample code.
> > >>>> --
> > >>>>
> > >>>> Tim Ellison (t.p.ellison@gmail.com)
> > >>>> IBM Java technology centre, UK.
> > >>>>
> > >> --
> > >>
> > >> Tim Ellison (t.p.ellison@gmail.com)
> > >> IBM Java technology centre, UK.
> > >>
> > >
> >
> > --
> >
> > Tim Ellison (t.p.ellison@gmail.com)
> > IBM Java technology centre, UK.
> >
>
>
>

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by George Harley1 <GH...@uk.ibm.com>.
Hi Mikhail, 

> The messages are important to analyze failures also.

What is JUnit's own failure reporting mechanism not providing to you ? 


> And the possibility to test performance is also important

Yes it is. But - to return to the original point of the HARMONY-31 JIRA 
issue - this can be done without the need to either bring in a superclass 
in the test hierarchy and/or scatter logging calls around the test code. 
Performance testing using JUnit should be done in a transparent manner 
using a decorator. As an example of what I mean please take a look at the 
JUnitPerf site at http://clarkware.com/software/JUnitPerf.html . JUnitPerf 
is an extension to JUnit that helps with the creation of performance tests 
that match existing unit tests. It does this using a decorator design. 
This provides for a separation of concerns that benefits developers and 
performance engineers. 

The decorator approach means that we effectively *extend* JUnit with 
simple wrapper classes in which we can make full use of the JUnit API to 
give us the additional behaviour needed when running the tests (maybe this 
could be used to give whatever extra failure analysis data you say is 
lacking ?). And if somebody doesn't want the extension behaviour ? They 
just run the suite without the custom wrappers. 
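A minimal sketch of such a wrapper using JUnit 3.8's junit.extensions.TestDecorator (a guess at the general shape of the attached PerfDecorator.java, not its actual contents; assumes the JUnit jar):

```java
import junit.extensions.TestDecorator;
import junit.framework.Test;
import junit.framework.TestResult;

// Sketch: time each wrapped test run without the tests knowing about it.
public class TimingDecorator extends TestDecorator {
    public TimingDecorator(Test test) {
        super(test);
    }

    public void run(TestResult result) {
        long start = System.currentTimeMillis();
        basicRun(result); // delegate to the wrapped test
        System.out.println(getTest() + " ran in "
                + (System.currentTimeMillis() - start) + " ms");
    }
}
```

Anyone who does not want the timings simply runs the undecorated suite.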


> When we have a powerful performance suite we may revisit this.

Don't you mean when we have a powerful unit test suite ? 

Take care, 
George
________________________________________
George C. Harley





Mikhail Loenko <ml...@gmail.com> 
18/01/2006 12:57
Please respond to
harmony-dev@incubator.apache.org


To
harmony-dev@incubator.apache.org
cc

Subject
Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests 
into a decorator class.






Well, those messages were important for developers when
they were writing code and tests. Then the tests came to the repository 'as is'.

The messages are important to analyze failures also.
And the possibility to test performance is also important

For me any option that does not break functionality in favor of beauty
looks good. I suggest switching off logging by default and either keep the
PerformanceTest super class as is or replace it with a simple Logger class.

When we have a powerful performance suite we may revisit this.

Thanks,
Mikhail.

On 1/18/06, Tim Ellison <t....@gmail.com> wrote:
> Absolutely right -- writing meaningful performance tests is hard.
> Implementing your own Logger would not solve the problem though<g>.
>
> Best to avoid the 'This test worked OK' log messages altogether, and
> stick to assertions.
>
> Regards,
> Tim
>
> Mikhail Loenko wrote:
> > It might be a problem...
> >
> > When we use java.util.logging we do not just compare performance of
> > security API functions; the result also depends on the difference in
> > performance of java.util.logging in standard classes vs. Harmony
> > classes. So if we use non-trivial functionality from there then our
> > results will be spoiled a little.
> >
> > Will investigate more...
> >
> > Thanks,
> > Mikhail.
> >
> > On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> >> neither is the Logger class -- so my point is if you are going to write
> >> some logging code why not do it in java.util.logging?  You may choose to
> >> only do simple stubs for now until somebody steps up to do a real impl.
> >>
> >> Regards,
> >> Tim
> >>
> >> Mikhail Loenko wrote:
> >>> It's not yet implemented.
> >>>
> >>> thanks,
> >>> Mikhail
> >>>
> >>> On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> >>>> Why not use java.util.logging?
> >>>>
> >>>> Regards,
> >>>> Tim
> >>>>
> >>>> Mikhail Loenko (JIRA) wrote:
> >>>>>     [ 
http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910 
]
> >>>>>
> >>>>> Mikhail Loenko commented on HARMONY-31:
> >>>>> ---------------------------------------
> >>>>>
> >>>>> This is not what I meant.
> >>>>>
> >>>>> I was going to create a Logger class at this point like this:
> >>>>>
> >>>>> public class Logger {
> >>>>>         public static boolean printAllowed = false;
> >>>>>       public static void log(String message) {
> >>>>>               if (printAllowed) System.out.print(message);
> >>>>>       }
> >>>>>       public static void logln(String message) {
> >>>>>               if (printAllowed) System.out.println(message);
> >>>>>       }
> >>>>>       public static void logError(String message) {
> >>>>>               if (printAllowed) System.err.print(message);
> >>>>>       }
> >>>>>       public static void loglnError(String message) {
> >>>>>               if (printAllowed) System.err.println(message);
> >>>>>       }
> >>>>> }
> >>>>>
> >>>>> And replace log() with Logger.log() everywhere in the tests.
> >>>>>
> >>>>> All the remaining functionality in the PerformanceTest is 
obsolete.
> >>>>>
> >>>>>
> >>>>>> Move performance timing of unit tests into a decorator class.
> >>>>>> ------------------------------------------------------------
> >>>>>>
> >>>>>>          Key: HARMONY-31
> >>>>>>          URL: http://issues.apache.org/jira/browse/HARMONY-31
> >>>>>>      Project: Harmony
> >>>>>>         Type: Improvement
> >>>>>>     Reporter: George Harley
> >>>>>>     Assignee: Geir Magnusson Jr
> >>>>>>     Priority: Minor
> >>>>>>  Attachments: PerfDecorator.java
> >>>>>>
> >>>>>> There has been some low-level discussion on the dev mailing list 
recently about the inclusion of performance-related logging code near the 
top of a unit test class inheritance hierarchy (see 
com.openintel.drl.security.test.PerformanceTest in the HARMONY-16 
contribution). This particular issue suggests an alternative way of adding 
in timing code but without making it the responsibility of the unit tests 
themselves and without the need to introduce a class in the inheritance 
hierarchy.
> >>>>>> The basic approach is to exploit the 
junit.extensions.TestDecorator type in the JUnit API to add in timing 
behaviour before and after each test method runs. This will be 
demonstrated with some simple sample code.
> >>>> --
> >>>>
> >>>> Tim Ellison (t.p.ellison@gmail.com)
> >>>> IBM Java technology centre, UK.
> >>>>
> >> --
> >>
> >> Tim Ellison (t.p.ellison@gmail.com)
> >> IBM Java technology centre, UK.
> >>
> >
>
> --
>
> Tim Ellison (t.p.ellison@gmail.com)
> IBM Java technology centre, UK.
>



Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Mikhail Loenko <ml...@gmail.com>.
Well, those messages were important for developers when
they were writing code and tests. Then the tests came to the repository 'as is'.

The messages are important to analyze failures also.
And the possibility to test performance is also important

For me any option that does not break functionality in favor of beauty
looks good. I suggest switching off logging by default and either keep the
PerformanceTest super class as is or replace it with a simple Logger class.

When we have a powerful performance suite we may revisit this.

Thanks,
Mikhail.

On 1/18/06, Tim Ellison <t....@gmail.com> wrote:
> Absolutely right -- writing meaningful performance tests is hard.
> Implementing your own Logger would not solve the problem though<g>.
>
> Best to avoid the 'This test worked OK' log messages altogether, and
> stick to assertions.
>
> Regards,
> Tim
>
> Mikhail Loenko wrote:
> > It might be a problem...
> >
> > When we use java.util.logging we do not just compare performance of security
> > API functions; the result also depends on the difference in performance of
> > java.util.logging in standard classes vs. Harmony classes. So if we use
> > non-trivial functionality from there then our results will be spoiled a little.
> >
> > Will investigate more...
> >
> > Thanks,
> > Mikhail.
> >
> > On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> >> neither is the Logger class -- so my point is if you are going to write
> >> some logging code why not do it in java.util.logging?  You may choose to
> >> only do simple stubs for now until somebody steps up to do a real impl.
> >>
> >> Regards,
> >> Tim
> >>
> >> Mikhail Loenko wrote:
> >>> It's not yet implemented.
> >>>
> >>> thanks,
> >>> Mikhail
> >>>
> >>> On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> >>>> Why not use java.util.logging?
> >>>>
> >>>> Regards,
> >>>> Tim
> >>>>
> >>>> Mikhail Loenko (JIRA) wrote:
> >>>>>     [ http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910 ]
> >>>>>
> >>>>> Mikhail Loenko commented on HARMONY-31:
> >>>>> ---------------------------------------
> >>>>>
> >>>>> This is not what I meant.
> >>>>>
> >>>>> I was going to create a Logger class at this point like this:
> >>>>>
> >>>>> public class Logger {
> >>>>>         public static boolean printAllowed = false;
> >>>>>       public static void log(String message) {
> >>>>>               if (printAllowed) System.out.print(message);
> >>>>>       }
> >>>>>       public static void logln(String message) {
> >>>>>               if (printAllowed) System.out.println(message);
> >>>>>       }
> >>>>>       public static void logError(String message) {
> >>>>>               if (printAllowed) System.err.print(message);
> >>>>>       }
> >>>>>       public static void loglnError(String message) {
> >>>>>               if (printAllowed) System.err.println(message);
> >>>>>       }
> >>>>> }
> >>>>>
> >>>>> And replace log() with Logger.log() everywhere in the tests.
> >>>>>
> >>>>> All the remaining functionality in the PerformanceTest is obsolete.
> >>>>>
> >>>>>
> >>>>>> Move performance timing of unit tests into a decorator class.
> >>>>>> ------------------------------------------------------------
> >>>>>>
> >>>>>>          Key: HARMONY-31
> >>>>>>          URL: http://issues.apache.org/jira/browse/HARMONY-31
> >>>>>>      Project: Harmony
> >>>>>>         Type: Improvement
> >>>>>>     Reporter: George Harley
> >>>>>>     Assignee: Geir Magnusson Jr
> >>>>>>     Priority: Minor
> >>>>>>  Attachments: PerfDecorator.java
> >>>>>>
> >>>>>> There has been some low-level discussion on the dev mailing list recently about the inclusion of performance-related logging code near the top of a unit test class inheritance hierarchy (see com.openintel.drl.security.test.PerformanceTest in the HARMONY-16 contribution). This particular issue suggests an alternative way of adding in timing code but without making it the responsibility of the unit tests themselves and without the need to introduce a class in the inheritance hierarchy.
> >>>>>> The basic approach is to exploit the junit.extensions.TestDecorator type in the JUnit API to add in timing behaviour before and after each test method runs. This will be demonstrated with some simple sample code.
> >>>> --
> >>>>
> >>>> Tim Ellison (t.p.ellison@gmail.com)
> >>>> IBM Java technology centre, UK.
> >>>>
> >> --
> >>
> >> Tim Ellison (t.p.ellison@gmail.com)
> >> IBM Java technology centre, UK.
> >>
> >
>
> --
>
> Tim Ellison (t.p.ellison@gmail.com)
> IBM Java technology centre, UK.
>

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Tim Ellison <t....@gmail.com>.
Absolutely right -- writing meaningful performance tests is hard.
Implementing your own Logger would not solve the problem though<g>.

Best to avoid the 'This test worked OK' log messages altogether, and
stick to assertions.

Regards,
Tim

Mikhail Loenko wrote:
> It might be a problem...
> 
> When we use java.util.logging we do not just compare performance of security
> API functions; the result also depends on the difference in performance of
> java.util.logging in standard classes vs. Harmony classes. So if we use
> non-trivial functionality from there then our results will be spoiled a little.
> 
> Will investigate more...
> 
> Thanks,
> Mikhail.
> 
> On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
>> neither is the Logger class -- so my point is if you are going to write
>> some logging code why not do it in java.util.logging?  You may choose to
>> only do simple stubs for now until somebody steps up to do a real impl.
>>
>> Regards,
>> Tim
>>
>> Mikhail Loenko wrote:
>>> It's not yet implemented.
>>>
>>> thanks,
>>> Mikhail
>>>
>>> On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
>>>> Why not use java.util.logging?
>>>>
>>>> Regards,
>>>> Tim
>>>>
>>>> Mikhail Loenko (JIRA) wrote:
>>>>>     [ http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910 ]
>>>>>
>>>>> Mikhail Loenko commented on HARMONY-31:
>>>>> ---------------------------------------
>>>>>
>>>>> This is not what I meant.
>>>>>
>>>>> I was going to create a Logger class at this point like this:
>>>>>
>>>>> public class Logger {
>>>>>         public static boolean printAllowed = false;
>>>>>       public static void log(String message) {
>>>>>               if (printAllowed) System.out.print(message);
>>>>>       }
>>>>>       public static void logln(String message) {
>>>>>               if (printAllowed) System.out.println(message);
>>>>>       }
>>>>>       public static void logError(String message) {
>>>>>               if (printAllowed) System.err.print(message);
>>>>>       }
>>>>>       public static void loglnError(String message) {
>>>>>               if (printAllowed) System.err.println(message);
>>>>>       }
>>>>> }
>>>>>
>>>>> And replace log() with Logger.log() everywhere in the tests.
>>>>>
>>>>> All the remaining functionality in the PerformanceTest is obsolete.
>>>>>
>>>>>
>>>>>> Move performance timing of unit tests into a decorator class.
>>>>>> ------------------------------------------------------------
>>>>>>
>>>>>>          Key: HARMONY-31
>>>>>>          URL: http://issues.apache.org/jira/browse/HARMONY-31
>>>>>>      Project: Harmony
>>>>>>         Type: Improvement
>>>>>>     Reporter: George Harley
>>>>>>     Assignee: Geir Magnusson Jr
>>>>>>     Priority: Minor
>>>>>>  Attachments: PerfDecorator.java
>>>>>>
>>>>>> There has been some low-level discussion on the dev mailing list recently about the inclusion of performance-related logging code near the top of a unit test class inheritance hierarchy (see com.openintel.drl.security.test.PerformanceTest in the HARMONY-16 contribution). This particular issue suggests an alternative way of adding in timing code but without making it the responsibility of the unit tests themselves and without the need to introduce a class in the inheritance hierarchy.
>>>>>> The basic approach is to exploit the junit.extensions.TestDecorator type in the JUnit API to add in timing behaviour before and after each test method runs. This will be demonstrated with some simple sample code.
>>>> --
>>>>
>>>> Tim Ellison (t.p.ellison@gmail.com)
>>>> IBM Java technology centre, UK.
>>>>
>> --
>>
>> Tim Ellison (t.p.ellison@gmail.com)
>> IBM Java technology centre, UK.
>>
> 

-- 

Tim Ellison (t.p.ellison@gmail.com)
IBM Java technology centre, UK.

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Mikhail Loenko <ml...@gmail.com>.
It might be a problem...

When we use java.util.logging we do not just compare performance of security
API functions; the result also depends on the difference in performance of
java.util.logging in standard classes vs. Harmony classes. So if we use
non-trivial functionality from there then our results will be spoiled a little.
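For reference, the stub-like end of that spectrum -- routing the test messages through java.util.logging, off by default -- might look like this (class and logger names are invented):

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Hypothetical replacement for the proposed Logger class: messages go
// through java.util.logging and are dropped unless the level is raised.
public class TestLog {
    static final Logger LOGGER = Logger.getLogger("harmony.tests");

    static {
        LOGGER.setLevel(Level.OFF); // silent by default, like printAllowed = false
    }

    public static void logln(String message) {
        LOGGER.info(message);   // no-op while the level is OFF
    }

    public static void loglnError(String message) {
        LOGGER.severe(message); // likewise dropped while OFF
    }
}
```

While the level stays at OFF only the level check runs on each call, so the impact on the timings should be small even when the logging implementation under test is Harmony's own.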

Will investigate more...

Thanks,
Mikhail.

On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> neither is the Logger class -- so my point is if you are going to write
> some logging code why not do it in java.util.logging?  You may choose to
> only do simple stubs for now until somebody steps up to do a real impl.
>
> Regards,
> Tim
>
> Mikhail Loenko wrote:
> > It's not yet implemented.
> >
> > thanks,
> > Mikhail
> >
> > On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> >> Why not use java.util.logging?
> >>
> >> Regards,
> >> Tim
> >>
> >> Mikhail Loenko (JIRA) wrote:
> >>>     [ http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910 ]
> >>>
> >>> Mikhail Loenko commented on HARMONY-31:
> >>> ---------------------------------------
> >>>
> >>> This is not what I meant.
> >>>
> >>> I was going to create a Logger class at this point like this:
> >>>
> >>> public class Logger {
> >>>         public static boolean printAllowed = false;
> >>>       public static void log(String message) {
> >>>               if (printAllowed) System.out.print(message);
> >>>       }
> >>>       public static void logln(String message) {
> >>>               if (printAllowed) System.out.println(message);
> >>>       }
> >>>       public static void logError(String message) {
> >>>               if (printAllowed) System.err.print(message);
> >>>       }
> >>>       public static void loglnError(String message) {
> >>>               if (printAllowed) System.err.println(message);
> >>>       }
> >>> }
> >>>
> >>> And replace log() with Logger.log() everywhere in the tests.
> >>>
> >>> All the remaining functionality in the PerformanceTest is obsolete.
> >>>
> >>>
> >>>> Move performance timing of unit tests into a decorator class.
> >>>> ------------------------------------------------------------
> >>>>
> >>>>          Key: HARMONY-31
> >>>>          URL: http://issues.apache.org/jira/browse/HARMONY-31
> >>>>      Project: Harmony
> >>>>         Type: Improvement
> >>>>     Reporter: George Harley
> >>>>     Assignee: Geir Magnusson Jr
> >>>>     Priority: Minor
> >>>>  Attachments: PerfDecorator.java
> >>>>
> >>>> There has been some low-level discussion on the dev mailing list recently about the inclusion of performance-related logging code near the top of a unit test class inheritance hierarchy (see com.openintel.drl.security.test.PerformanceTest in the HARMONY-16 contribution). This particular issue suggests an alternative way of adding in timing code but without making it the responsibility of the unit tests themselves and without the need to introduce a class in the inheritance hierarchy.
> >>>> The basic approach is to exploit the junit.extensions.TestDecorator type in the JUnit API to add in timing behaviour before and after each test method runs. This will be demonstrated with some simple sample code.
> >> --
> >>
> >> Tim Ellison (t.p.ellison@gmail.com)
> >> IBM Java technology centre, UK.
> >>
> >
>
> --
>
> Tim Ellison (t.p.ellison@gmail.com)
> IBM Java technology centre, UK.
>

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Mikhail Loenko <ml...@gmail.com>.
sounds reasonable...

Mikhail

On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> neither is the Logger class -- so my point is if you are going to write
> some logging code why not do it in java.util.logging?  You may choose to
> only do simple stubs for now until somebody steps up to do a real impl.
>
> Regards,
> Tim
>

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Tim Ellison <t....@gmail.com>.
neither is the Logger class -- so my point is if you are going to write
some logging code why not do it in java.util.logging?  You may choose to
only do simple stubs for now until somebody steps up to do a real impl.

Regards,
Tim

Mikhail Loenko wrote:
> It's not yet implemented.
> 
> thanks,
> Mikhail
> 

-- 

Tim Ellison (t.p.ellison@gmail.com)
IBM Java technology centre, UK.
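
[Editor's note: a minimal sketch of the "simple stubs" Tim suggests, keeping the facade Mikhail proposed but delegating to java.util.logging instead of printing directly. The class name TestLogger and the logger name "harmony.tests" are illustrative choices, not project conventions; the java.util.logging calls are real API.]

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Hypothetical facade: preserves the logln()-style call sites in the tests,
// but routes output through java.util.logging instead of System.out/err.
public class TestLogger {

    // "harmony.tests" is an illustrative logger name, not a project convention.
    private static final Logger LOG = Logger.getLogger("harmony.tests");

    // Replaces System.out.println(message)
    public static void logln(String message) {
        LOG.log(Level.INFO, message);
    }

    // Replaces System.err.println(message)
    public static void loglnError(String message) {
        LOG.log(Level.SEVERE, message);
    }
}
```

With this, enabling or silencing test output becomes a matter of java.util.logging configuration (levels and handlers) rather than a static printAllowed flag.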

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Mikhail Loenko <ml...@gmail.com>.
It's not yet implemented.

thanks,
Mikhail

On 1/17/06, Tim Ellison <t....@gmail.com> wrote:
> Why not use java.util.logging?
>
> Regards,
> Tim
>

Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests into a decorator class.

Posted by Tim Ellison <t....@gmail.com>.
Why not use java.util.logging?

Regards,
Tim

Mikhail Loenko (JIRA) wrote:
>     [ http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910 ] 
> 
> Mikhail Loenko commented on HARMONY-31:
> ---------------------------------------
> 
> This is not what I meant.
> 
> I was going to create a Logger class at this point like this:
> 
> public class Logger {
>         public static boolean printAllowed = false;
> 	public static void log(String message) {
> 		if (printAllowed) System.out.print(message);
> 	}
> 	public static void logln(String message) {
> 		if (printAllowed) System.out.println(message);
> 	}
> 	public static void logError(String message) {
> 		if (printAllowed) System.err.print(message);
> 	}
> 	public static void loglnError(String message) {
> 		if (printAllowed) System.err.println(message);
> 	}
> }
> 
> And replace log() with Logger.log() everywhere in the tests.
> 
> All the remaining functionality in the PerformanceTest is obsolete.
> 
> 
>> Move performance timing of unit tests into a decorator class.
>> ------------------------------------------------------------
>>
>>          Key: HARMONY-31
>>          URL: http://issues.apache.org/jira/browse/HARMONY-31
>>      Project: Harmony
>>         Type: Improvement
>>     Reporter: George Harley
>>     Assignee: Geir Magnusson Jr
>>     Priority: Minor
>>  Attachments: PerfDecorator.java
>>
>> There has been some low-level discussion on the dev mailing list recently about the inclusion of performance-related logging code near the top of a unit test class inheritance hierarchy (see com.openintel.drl.security.test.PerformanceTest in the HARMONY-16 contribution). This particular issue suggests an alternative way of adding in timing code but without making it the responsibility of the unit tests themselves and without the need to introduce a class in the inheritance hierarchy. 
>> The basic approach is to exploit the junit.extensions.TestDecorator type in the JUnit API to add in timing behaviour before and after each test method runs. This will be demonstrated with some simple sample code. 
> 

-- 

Tim Ellison (t.p.ellison@gmail.com)
IBM Java technology centre, UK.
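
[Editor's note: the decorator approach from the issue description can be sketched roughly as below. To keep the example self-contained it uses a tiny stand-in interface instead of JUnit types; with JUnit on the classpath, the real class would extend junit.extensions.TestDecorator and put the timing around basicRun(result) inside run(TestResult). All names here are illustrative.]

```java
// Self-contained sketch of a timing decorator. RunnableTest is a minimal
// stand-in for junit.framework.Test so the example compiles without JUnit.
interface RunnableTest {
    void run();
}

class TimingDecorator implements RunnableTest {
    private final RunnableTest wrapped;

    TimingDecorator(RunnableTest wrapped) {
        this.wrapped = wrapped;
    }

    // Time the wrapped test without the test itself containing timing code.
    public void run() {
        long start = System.nanoTime();
        wrapped.run();
        long elapsedMs = (System.nanoTime() - start) / 1000000L;
        System.out.println("test finished in " + elapsedMs + " ms");
    }
}
```

The test classes stay free of any timing logic; a suite wraps each test in the decorator only when timing is wanted, which is the point of the PerfDecorator.java attachment.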