Posted to user@jmeter.apache.org by Michael McDonnell <bz...@gmail.com> on 2008/06/06 02:19:03 UTC

Distributed Testing: Not actually creating load?

We're running a distributed test (roughly 7 remote workstations) on a pretty
hefty box (8 cores, 32 gigs ram.... etc...)

However, something seems to be going wrong... perhaps it's because I'm
crossing Linux and Windows platforms to try to do the testing?

We're load testing a web application, so primarily, the only work we're
doing is http requests (there are a few "java requests" that are actually an
app I created to make webservice calls, but we'll get to that later).

However, when we view the transactions in the database, they are extremely
low. (frighteningly low).

Then we run the test from a single user workstation (same test, 300 users
doing work) and our results come back fantastically!

Now granted: I guess the big deal is this: when the app uses a csv in
distributed mode, does each slave utilize the same csv in the same order?
Or is there a sort of "break up" so that no two slaves are using the same
line in the csv?

I'm sorry for what may be dumb questions... but we're coming down to a tight
deadline, and the distributed testing is not giving us good results, whereas
the local testing is.

Thanks for all your help in advance.

Michael

Re: Distributed Testing: Not actually creating load?

Posted by sebb <se...@gmail.com>.
On 07/06/2008, Michael McDonnell <bz...@gmail.com> wrote:
> I haven't had the opportunity to test each of the nodes in single mode, just
>  due to the nature of my workplace environment (most of the computers are
>  locked; I was just given an IP and told that JMeter-server is running). I
>  know all the data is in sync because it's all synchronized using
>  subversion.
>
>  I ran some tests yesterday, comparing server response time (using JAMon) vs
>  JMeter's response time, and I found that my results start to diverge at 20
>  users (spread out amongst all 7 of the nodes).
>
>  Regardless, I'm almost positive (as I said before) that it's because of all
>  the optimistic locking exceptions being thrown all the time (because of the
>  situation with subversion, it's impossible right now to adjust the
>  datafiles for everyone).

If you include the hostname in the filename, then you can have
different files for each host.
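A small script can generate those per-host files from one master CSV. The
sketch below round-robins the rows so no two servers replay the same line;
the host names and the "-&lt;host&gt;" naming convention are my own assumptions,
not something from this thread:

```python
import csv
from itertools import cycle

def split_csv(master_path, hostnames):
    """Round-robin the rows of a master CSV into one file per host,
    so no two JMeter servers replay the same line.

    Assumes master_path ends in '.csv'; writes e.g. users-nodeA.csv."""
    with open(master_path, newline="") as f:
        rows = list(csv.reader(f))
    shards = {h: [] for h in hostnames}
    for host, row in zip(cycle(hostnames), rows):
        shards[host].append(row)
    for host, host_rows in shards.items():
        out_path = master_path.replace(".csv", "-%s.csv" % host)
        with open(out_path, "w", newline="") as f:
            csv.writer(f).writerows(host_rows)
```

Each test plan can then point at its own shard (or the shards can all be
renamed back to the same filename on each server, as suggested above).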

>  Since we've acquired a faster machine (dual core, 2 gigs ram...) and set
>  the heap on jmeter to 1024m, which can run it beautifully in single-node
>  mode, I unfortunately have to abandon the distributed test environment.
>  (Corporate...)
>
>  Thanks much for your time!
>
>  Michael
>
>  On Sat, Jun 7, 2008 at 5:35 AM, sebb <se...@gmail.com> wrote:
>
>  > You originally wrote that "single user work station ... results come
>  > back fantastically".
>  >
>  > Is that true for all the nodes used individually?
>  > Or do some nodes struggle?
>  >
>  > If all nodes work OK individually, try adding more nodes to the test,
>  > i.e. instead of one node with 300 users, use 2 nodes with 150 users
>  > each. Try these both completely separately - in non-GUI mode - and in
>  > client-server mode.
>  >
>  > If you can run all the nodes together in non-GUI mode and get the same
>  > performance as a single node with the same total of users, but not when
>  > you run in client-server mode, then there must be a bottleneck in the
>  > client.
>  >
>  > You also wrote:
>  >
>  > "We're load testing a web application, so primarily, the only work we're
>  > doing is http requests ...
>  >
>  > However, when we view the transactions in the database, they are
>  > extremely low. (frighteningly low)."
>  >
>  > This does not make sense. If the http requests directly relate to
>  > database transactions, then either some requests are not succeeding,
>  > or there is an error in the test data or the application. You'll need
>  > to trace the progress of the http request in the server to see why
>  > some requests are not generating database transactions.
>  >
>  > On 06/06/2008, Michael McDonnell <bz...@gmail.com> wrote:
>  > > Actually, that's about the exact configuration across all 7 of them.
>  > >
>  > > On Fri, Jun 6, 2008 at 9:58 AM, sebb <se...@gmail.com> wrote:
>  > >
>  > > > On 06/06/2008, Michael McDonnell <bz...@gmail.com> wrote:
>  > > > > One would think so, but my response times are terrible; I'm
>  > > > > wondering if it has anything to do with the fact that it's a
>  > > > > 1st-gen P4. (Ubuntu Hardy, 1.5 Gig PC3200 RAM)
>  > > >
>  > > > What node is that?
>  > > >
>  > > > You wrote:
>  > > >
>  > > > "Then we run the test from a single user work station (same test,
>  > > > 300 users doing work) and our results come back fantastically!"
>  > > >
>  > > > > On Fri, Jun 6, 2008 at 9:44 AM, sebb <se...@gmail.com> wrote:
>  > > > >
>  > > > > > Yes, 300 users should not be a problem as a single JMeter
>  > > > > > client should be able to handle that.
>  > > > > >
>  > > > > > On 06/06/2008, Michael McDonnell <bz...@gmail.com> wrote:
>  > > > > > > That makes sense. I'll give it a go. (We're pretty sure
>  > > > > > > there's no bottleneck passing things; they do it every 100
>  > > > > > > samples, and this is over a 100 MB/s net. I'm only trying to
>  > > > > > > run 300 users, so they should be able to perform well over
>  > > > > > > 10 MB/s.)
>  > > > > > >
>  > > > > > > On Fri, Jun 6, 2008 at 9:27 AM, sebb <se...@gmail.com> wrote:
>  > > > > > >
>  > > > > > > > In client-server mode, only the test plan is sent from the
>  > > > > > > > client to the server(s).
>  > > > > > > >
>  > > > > > > > Any additional files - e.g. CSV input files - need to be
>  > > > > > > > present on the server host in the location specified by the
>  > > > > > > > test plan.
>  > > > > > > >
>  > > > > > > > Sample data is returned to the client, and processed/stored
>  > > > > > > > by the client. This can become a bottleneck at the client -
>  > > > > > > > both for JMeter itself, and for the network connection -
>  > > > > > > > under high loads.
>  > > > > > > >
>  > > > > > > > Data files are best randomised before use.
>  > > > > > > > Likewise, if you want to run with different data on
>  > > > > > > > different hosts, then create different data files for each
>  > > > > > > > host (but you can use the same name).
>  > > > > > > >
>  > > > > > > > On 06/06/2008, Michael McDonnell <bz...@gmail.com> wrote:
>  > > > > > > > > How did you randomize the data from the CSVs? (if I may ask)
>  > > > > > > > >
>  > > > > > > > > Also, I'm dealing with a lot of optimistic locking issues,
>  > > > > > > > > which would only occur if each csv is doing the EXACT same
>  > > > > > > > > thing at the exact same time (which is completely likely).
>  > > > > > > > >
>  > > > > > > > > On Thu, Jun 5, 2008 at 9:54 PM, Ryan Dooley
>  > > > > > > > > <ryan.dooley@gmail.com> wrote:
>  > > > > > > > >
>  > > > > > > > > > I had a similar experience the first time. Turns out that
>  > > > > > > > > > the data I wanted to test with (HTTP POSTs) had to be put
>  > > > > > > > > > on each remote. I also had a process to randomize the data
>  > > > > > > > > > when transferred to the remotes. I finally got the load up
>  > > > > > > > > > high enough across 10 machines like yours.
>  > > > > > > > > >
>  > > > > > > > > > The test harness I had was pretty simple: post these
>  > > > > > > > > > things to this url.
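The advice above to randomise data files before use can be done with a small
script like the following. This is only a sketch: the assumption that the
first line is a header, and the idea of seeding with the host name so each
server gets a different ordering, are mine, not from the thread.

```python
import random

def shuffle_csv(path, seed=None):
    """Shuffle the data lines of a CSV in place, keeping the first
    (header) line in place. Passing a per-host seed (e.g. the hostname)
    gives each JMeter server a different, reproducible ordering."""
    with open(path) as f:
        header, *rows = f.read().splitlines()
    random.Random(seed).shuffle(rows)
    with open(path, "w") as f:
        f.write("\n".join([header] + rows) + "\n")
```

Run once per host before starting jmeter-server, so no two servers walk the
file in the same order at the same time.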

---------------------------------------------------------------------
To unsubscribe, e-mail: jmeter-user-unsubscribe@jakarta.apache.org
For additional commands, e-mail: jmeter-user-help@jakarta.apache.org


Re: Distributed Testing: Not actually creating load?

Posted by Michael McDonnell <bz...@gmail.com>.
I haven't had the opportunity to test each of the nodes in single mode, just
due to the nature of my workplace environment (most of the computers are
locked; I was just given an IP and told that JMeter-server is running). I
know all the data is in sync because it's all synchronized using
subversion.

I ran some tests yesterday, comparing server response time (using JAMon) vs
JMeter's response time, and I found that my results start to diverge at 20
users (spread out amongst all 7 of the nodes).

Regardless, I'm almost positive (as I said before) that it's because of all
the optimistic locking exceptions being thrown all the time (because of the
situation with subversion, it's impossible right now to adjust the datafiles
for everyone).

Since we've acquired a faster machine (dual core, 2 gigs ram...) and set the
heap on jmeter to 1024m, which can run it beautifully in single-node mode, I
unfortunately have to abandon the distributed test environment.
(Corporate...)

Thanks much for your time!

Michael

On Sat, Jun 7, 2008 at 5:35 AM, sebb <se...@gmail.com> wrote:

> You originally wrote that "single user work station ... results come
> back fantastically".
>
> Is that true for all the nodes used individually?
> Or do some nodes struggle?
>
> If all nodes work OK individually, try adding more nodes to the test,
> i.e. instead of one node with 300 users, use 2 nodes with 150 users
> each. Try these both completely separately - in non-GUI mode - and in
> client-server mode.
>
> If you can run all the nodes together in non-GUI mode and get the same
> performance as a single node with the same total of users, but not when
> you run in client-server mode, then there must be a bottleneck in the
> client.
>
> You also wrote:
>
> "We're load testing a web application, so primarily, the only work we're
> doing is http requests ...
>
> However, when we view the transactions in the database, they are extremely
> low. (frighteningly low)."
>
> This does not make sense. If the http requests directly relate to
> database transactions, then either some requests are not succeeding,
> or there is an error in the test data or the application. You'll need
> to trace the progress of the http request in the server to see why
> some requests are not generating database transactions.
>

Re: Distributed Testing: Not actually creating load?

Posted by sebb <se...@gmail.com>.
You originally wrote that "single user work station ... results come
back fantastically".

Is that true for all the nodes used individually?
Or do some nodes struggle?

If all nodes work OK individually, try adding more nodes to the test,
i.e. instead of one node with 300 users, use 2 nodes with 150 users
each. Try these both completely separately - in non-GUI mode - and in
client-server mode.

If you can run all the nodes together in non-GUI mode and get the same
performance as a single node with the same total of users, but not when you
run in client-server mode, then there must be a bottleneck in the client.

You also wrote:

"We're load testing a web application, so primarily, the only work we're
doing is http requests ...

However, when we view the transactions in the database, they are extremely
low. (frighteningly low)."

This does not make sense. If the http requests directly relate to
database transactions, then either some requests are not succeeding,
or there is an error in the test data or the application. You'll need
to trace the progress of the http request in the server to see why
some requests are not generating database transactions.

On 06/06/2008, Michael McDonnell <bz...@gmail.com> wrote:
> Actually, that's about the exact configuration across all 7 of them.
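The non-GUI vs client-server comparison suggested above can be driven from
the command line roughly as follows. This is a sketch: the plan filename and
server host names are placeholders, not details from this thread.

```shell
# Baseline: run the plan on a single node in non-GUI mode
jmeter -n -t plan.jmx -l single-node.jtl

# Same plan, driven from the client against selected remote
# jmeter-server instances (comma-separated host list)
jmeter -n -t plan.jmx -R nodeA,nodeB -l client-server.jtl

# Or use every server listed in remote_hosts in jmeter.properties
jmeter -n -t plan.jmx -r -l all-servers.jtl
```

If the per-node non-GUI runs add up but the `-R` run does not, the
bottleneck is on the client side, as described above.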


Re: Distributed Testing: Not actually creating load?

Posted by Michael McDonnell <bz...@gmail.com>.
Actually, that's about the exact configuration across all 7 of them.

On Fri, Jun 6, 2008 at 9:58 AM, sebb <se...@gmail.com> wrote:

> On 06/06/2008, Michael McDonnell <bz...@gmail.com> wrote:
> > One would think so, but my response times are terrible; I'm wondering if
> > it has anything to do with the fact that it's a 1st-gen P4.
> > (Ubuntu Hardy, 1.5 Gig PC3200 RAM)
> >
>
> What node is that?
>
> You wrote:
>
> "Then we run the test from a single user work station (same test, 300 users
> doing work) and our results come back fantastically!"
>

Re: Distributed Testing: Not actually creating load?

Posted by sebb <se...@gmail.com>.
On 06/06/2008, Michael McDonnell <bz...@gmail.com> wrote:
> One would think so, but my response times are terrible, I'm wondering if it
>  has anything to do with the fact that its a 1st gen p4.
>  (Ubuntu Hardy, 1.5 Gig PC3200 RAM)
>

What node is that?

You wrote:

"Then we run the test from a single user work station (same test, 300 users
doing work) and our results come back fantastically!"



Re: Distributed Testing: Not actually creating load?

Posted by Michael McDonnell <bz...@gmail.com>.
One would think so, but my response times are terrible; I'm wondering if it
has anything to do with the fact that it's a 1st-gen P4.
(Ubuntu Hardy, 1.5 GB of PC3200 RAM)

On Fri, Jun 6, 2008 at 9:44 AM, sebb <se...@gmail.com> wrote:

> Yes, 300 users should not be a problem as a single JMeter client
> should be able to handle that.
>

Re: Distributed Testing: Not actually creating load?

Posted by sebb <se...@gmail.com>.
Yes, 300 users should not be a problem as a single JMeter client
should be able to handle that.

On 06/06/2008, Michael McDonnell <bz...@gmail.com> wrote:
> That makes sense. I'll give it a go. (We're pretty sure there's no bottle
>  neck passing things, they do it every 100 samples, and this is over a 100
>  MB/s net. I'm only trying to run 300 users, so they should be able to
>  perform well over a 10 MB/s
>


Re: Distributed Testing: Not actually creating load?

Posted by Michael McDonnell <bz...@gmail.com>.
That makes sense. I'll give it a go. (We're pretty sure there's no bottleneck
passing things back; they send results every 100 samples, and this is over a 100
MB/s net. I'm only trying to run 300 users, so they should be able to
perform well even over a 10 MB/s connection.)
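As a rough sanity check on that bandwidth reasoning (the per-sample size and pacing below are made-up assumptions, not measured values):

```python
# Back-of-envelope estimate of result traffic flowing back to the client.
users = 300
bytes_per_sample = 2 * 1024       # assumed size of one sample result
samples_per_user_per_sec = 2.0    # assumed request rate per user

bits_per_sec = users * bytes_per_sample * samples_per_user_per_sec * 8
print("~%.2f Mbit/s" % (bits_per_sec / 1e6))  # ~9.83 Mbit/s under these assumptions
```

Under those (invented) numbers the result stream fits comfortably inside even a 10 Mbit/s link; the real figure depends on how much response data each sampler returns.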

On Fri, Jun 6, 2008 at 9:27 AM, sebb <se...@gmail.com> wrote:

> In client-server mode, only the test plan is sent from the client to
> the server(s).
>
> Any additional files - e.g. CSV input files - need to be present on
> the server host in the location specified by  the test plan.
>
> Sample data is returned to the client, and processed/stored by the client.
> This can become a bottleneck at the client - both for JMeter itself,
> and for the network connection - under high loads.
>
> Data files are best randomised before use.
> Likewise, if you want to run with different data on different hosts,
> then create different data files for each host (but you can use the
> same name).
>

Re: Distributed Testing: Not actually creating load?

Posted by sebb <se...@gmail.com>.
In client-server mode, only the test plan is sent from the client to
the server(s).

Any additional files - e.g. CSV input files - need to be present on
the server host in the location specified by the test plan.

Sample data is returned to the client, and processed/stored by the client.
This can become a bottleneck at the client - both for JMeter itself,
and for the network connection - under high loads.

Data files are best randomised before use.
Likewise, if you want to run with different data on different hosts,
then create different data files for each host (but you can use the
same name).
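A minimal sketch of that per-host randomisation (the file contents and names here are hypothetical; it assumes one data row per line):

```python
import random

def split_rows(rows, n_hosts, seed=None):
    """Shuffle the data rows, then deal them round-robin into one
    bucket per host, so no two load generators share a line."""
    rng = random.Random(seed)
    shuffled = list(rows)          # copy; the caller's list is untouched
    rng.shuffle(shuffled)
    return [shuffled[i::n_hosts] for i in range(n_hosts)]

# Example: split five hypothetical CSV lines across two hosts.
buckets = split_rows(["u1,pw1", "u2,pw2", "u3,pw3", "u4,pw4", "u5,pw5"], 2)
# Each bucket would then be written out on its own host, under the same
# file name the test plan references.
```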

On 06/06/2008, Michael McDonnell <bz...@gmail.com> wrote:
> How did you randomize the data from the CSVs? (if I may ask)
>
>  Also, I'm dealing with a lot of optimistic locking issues which would only
>  occur if each csv is doing the EXACT same thing at the exact same time
>  (which is completely likely)
>


Re: Distributed Testing: Not actually creating load?

Posted by Michael McDonnell <bz...@gmail.com>.
How did you randomize the data from the CSVs? (if I may ask)

Also, I'm dealing with a lot of optimistic locking issues, which would only
occur if each slave is doing the EXACT same thing at the exact same time
(which is entirely likely if they all read the same CSV in the same order).

On Thu, Jun 5, 2008 at 9:54 PM, Ryan Dooley <ry...@gmail.com> wrote:

> I had a similar experience the first time.  Turns out that the data I
> wanted
> to test with (HTTP POSTs) has to be put on each remote.  I also had a
> process to randomize the data when transferred to the remotes.  I finally
> got the load up high enough across 10 machines like yours.
>
> The test harness I had was pretty simple:  post these things to this url.
>

Re: Distributed Testing: Not actually creating load?

Posted by Ryan Dooley <ry...@gmail.com>.
I had a similar experience the first time.  Turns out that the data I wanted
to test with (HTTP POSTs) has to be put on each remote.  I also had a
process to randomize the data when transferred to the remotes.  I finally
got the load up high enough across 10 machines like yours.

The test harness I had was pretty simple: POST these things to this URL.
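A harness of that shape -- read payloads, POST each one to a single URL -- can be sketched with just the standard library (the URL, payload format, and content type below are placeholders, not Ryan's actual setup):

```python
import urllib.request

def post_payloads(url, payloads):
    """POST each payload to the given URL; collect the HTTP status codes."""
    statuses = []
    for body in payloads:
        req = urllib.request.Request(
            url,
            data=body.encode("utf-8"),  # POST is implied by a data body
            headers={"Content-Type": "application/x-www-form-urlencoded"},
        )
        with urllib.request.urlopen(req) as resp:
            statuses.append(resp.status)
    return statuses
```

In practice the payloads would come from the per-host data files discussed elsewhere in this thread, and the status codes give a quick pass/fail check per request.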

On Thu, Jun 5, 2008 at 5:19 PM, Michael McDonnell <bz...@gmail.com>
wrote:

> We're running a distributed test (roughly 7 remote workstations) on a
> pretty
> hefty box (8 cores, 32 gigs ram.... etc...)
>
> However, something seems to be going wrong... perhaps its because I'm
> crossing linux and windows platforms to try to do the testing?
>
> We're load testing a web application, so primarily, the only work we're
> doing is http requests (there are a few "java requests" that actually is an
> app I created to make webservice calls, but we'll get to that later)
>
> However, when we view the transactions in the database, they are extremely
> low. (frighteningly low).
>
> Then we run the test from a single user work station (same test, 300 users
> doing work) and our results come back fantastically!
>
> Now granted: I guess the big deal is this: when the app uses a csv in
> distributed mode, does each slave utilize the the same csv in the same
> order
> ? or is there a sort of "break up" so that no two slaves are using the same
> line in the csv?
>
> I'm sorry for what may be dumb questions... but we're coming down to a
> tight
> deadline, and the distributed testing is not giving us good results where
> as
> the local testing is.
>
> Thanks for all your help in advance.
>
> Michael
>