Posted to user@jmeter.apache.org by m mat <pe...@yahoo.com> on 2005/11/28 05:03:01 UTC

Interpreting results

Performance experts,
   
  I am testing a web services server that relies on a SQL Server DB as the system of record. Because of the way this database is designed, most of the search web services drive the DB CPU utilization quite high momentarily (close to 50% or so) on a single (standalone) query operation.
   
  Under load - as I cannot predict which operation will run concurrently with which other operation - I see a huge variance in the execution time of these search operations (anywhere from the base time to about 10 times the base time). Averaging such numbers does not make much sense either. As a result, as I try to optimize the DB or other layers, I cannot confidently say whether performance is improving, because every time I run the test I get different numbers for each search operation depending on what else is running concurrently with it.
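A side note on the averaging problem: with latencies this skewed (base time up to 10x base time), the mean is dragged around by the slow outliers, while the median and high percentiles stay much more stable between runs. A minimal sketch with purely synthetic numbers (the base time and the skew shape are made-up stand-ins, not measurements from this system):

```python
# Illustration only: on a right-skewed latency sample, mean >> median,
# so percentiles are the more stable statistic to compare across runs.
import random
import statistics

random.seed(42)
BASE_MS = 200  # hypothetical base response time of one search operation

# Most responses land near the base time, a few stretch toward 10x base.
sample = [BASE_MS * (1 + 9 * random.random() ** 4) for _ in range(1000)]

def pct(data, q):
    """q-th percentile by nearest rank."""
    s = sorted(data)
    return s[min(len(s) - 1, int(q / 100 * len(s)))]

print(f"mean   = {statistics.mean(sample):7.1f} ms")
print(f"median = {statistics.median(sample):7.1f} ms")
print(f"p90    = {pct(sample, 90):7.1f} ms")
```

Comparing medians (or p90) of two runs tells you more about a tuning change than comparing means, because a handful of unlucky concurrent collisions no longer dominate the summary.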
   
  Of course I could run these search operations one at a time (i.e. not run anything else while a search is running - in a single thread), but then I would miss some other issues that I need to observe.
   
  How do you tackle such a situation?
   
  Matt


Re: Interpreting results

Posted by Peter Lin <wo...@gmail.com>.
Sounds like a tricky situation. A couple of things come to mind. The most
useful one is to take production logs and use those to drive the tests.

What I would do to isolate the database issues is to use the same test plan
and vary the SQL and the concurrent load.

In other words.

Test plan + old SQL
Test plan + optimized SQL

Run both combinations above with different concurrent loads. You'll have to
run each for a large sample to get a good estimate; a small sample will likely
be misleading. I consider a large sample something over 500K samples. Hope that
helps

peter
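The suggestion above amounts to a small experiment matrix: (old SQL, optimized SQL) crossed with several concurrency levels, each cell summarized with robust statistics over a large sample. A minimal sketch of how the comparison could be organized - the run_cell function and all its numbers are hypothetical stand-ins; in a real setup each cell would come from parsing the result file of one JMeter run instead:

```python
# Sketch of the experiment matrix: SQL variant x concurrency level,
# each cell summarized by median and p90 rather than the mean.
import random
import statistics

random.seed(1)

def run_cell(sql_variant, threads, n=2000):
    """Hypothetical stand-in for one load-test run, returning latencies in ms.
    In practice, replace this with samples read from the run's result file."""
    base = 150 if sql_variant == "optimized" else 200  # made-up base times
    # Contention grows with concurrency; the skew shape is synthetic.
    return [base * (1 + (threads / 10) * random.random() ** 3) for _ in range(n)]

for sql in ("old", "optimized"):
    for threads in (1, 10, 50):
        sample = run_cell(sql, threads)
        med = statistics.median(sample)
        p90 = statistics.quantiles(sample, n=10)[-1]  # 90th percentile
        print(f"{sql:9s} threads={threads:3d} median={med:7.1f} p90={p90:7.1f}")
```

Reading down one column (same thread count, old vs. optimized SQL) shows whether the change helped at that load level, and reading across a row shows how each variant degrades as concurrency grows - which is exactly the comparison that raw per-run averages obscure.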

