Posted to wsif-dev@ws.apache.org by Aleksander Slominski <as...@cs.indiana.edu> on 2003/02/01 04:09:59 UTC

Re: cvs commit: xml-axis-wsif/java/test/performance JavaPerfTest.java Monitor.java

Nirmal Mukhi wrote:

> I've been thinking therefore that what we need to concentrate on is 
> improving the relative perf measured above, but we can make general 
> claims only on the percentage improvement, not on absolute numbers. 
> For example if WSIF adds 8% overhead to a SOAP call (this really 
> depends on the SOAP engine, network stuff, etc. etc.) and after some 
> changes to WSIF we cut this to 4% in the same setting, we can make two 
> claims:
> 1. The specific claim that WSIF overhead for a SOAP call w.r.t. a 
> particular SOAP environment and application is now 4%
> 2. The general claim that the efficiency of the WSIF SOAP provider has 
> been improved by 50% since the last measurement (applies roughly across 
> SOAP servers and applications).
>
> Just my random thoughts. Post your comments. 

this looks like a reasonable approach; however, we will need to record not 
only the improvement but also the environment in which the tests were 
executed (machine CPU/memory, network connection if relevant, OS version, 
JDK version).
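
for example, something roughly along these lines could compute the 
relative numbers Nirmal describes and capture the environment with each 
run (class and method names here are only an illustration, not the actual 
JavaPerfTest/Monitor code):

    // illustration only -- not the actual JavaPerfTest/Monitor code
    public class PerfReport {

        // overhead of the WSIF call relative to a direct SOAP call, in percent
        static double overheadPercent(double wsifMillis, double soapMillis) {
            return (wsifMillis - soapMillis) / soapMillis * 100.0;
        }

        // relative improvement between two overhead figures, in percent
        // (8% down to 4% gives 50%)
        static double improvementPercent(double oldOverhead, double newOverhead) {
            return (oldOverhead - newOverhead) / oldOverhead * 100.0;
        }

        // environment data that should be stored with every measurement
        static String environment() {
            Runtime rt = Runtime.getRuntime();
            return "os=" + System.getProperty("os.name")
                + " " + System.getProperty("os.version")
                + ", jdk=" + System.getProperty("java.version")
                + ", cpus=" + rt.availableProcessors()
                + ", maxHeapMB=" + (rt.maxMemory() / (1024 * 1024));
        }
    }

with the environment recorded alongside the numbers, overhead figures 
from two different machines can at least be compared with the right 
caveats.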

> The first thing I will work on actually is being able to run tests and 
> documenting that (otherwise I can't commit code!!). This comes next. 

we should gradually get to documenting how to run all tests on a new 
machine, including the setup of all required services (AXIS, Tomcat, J2EE, ...).

thanks,

alek

-- 
"Mr. Pauli, we in the audience are all agreed that your theory is crazy. 
What divides us is whether it is crazy enough to be true." Niels H. D. Bohr