Posted to user@jmeter.apache.org by Shanmugam Ganeshkumar <sh...@gmail.com> on 2009/09/15 06:44:14 UTC

Fwd: [Benchmarking] Running the tests on the command line and summarizing results

List,
This was posted on one of the lists that does specialized benchmarking.

Can some JMeter expert help us get it right?

What we need is table output when we run it from the command prompt. How does
this get executed?

Basically, how is the Summary Report listener's output saved while running the
test from the command line?

http://jakarta.apache.org/jmeter/usermanual/component_reference.html#Summary_Report

Thanks for the help

Cheers
Ganesh





---------- Forwarded message ----------
From: Andrea Aime
Date: 2009/9/14
Subject: [Benchmarking] Running the tests on the command line and
summarizing results


Hi,
here are some quick instructions to get started doing benchmarking on the
test box. It's a tentative setup and I'm looking for someone
to finish the job ;-)

In order to run JMeter you first need a JMeter script and possibly a
.csv file that drives its requests. I've attached a sample with
bluemarble_ecw.jmx and bluemarble.csv.

The .jmx file has to be generated using the JMeter GUI (it's an
XML file; if you want to hurt yourself you can also write it by hand),
and the .csv file can be generated using Frank's bbox generator
(wms_request.py, attached).
Of course the .jmx file contains the URL that will need to be
hit; that is relatively easy to change even directly in the
.jmx file.

Once you have those you can run a test using:
jmeter -p jmeter.properties -n -t script.jmx -l script_results.jtl

The .jtl file is actually just a CSV file with details of all requests;
you can run the summarizer on it to get a table summary with average time,
throughput and so on, for example:

./summarizer.py states.jtl
Label   Avg     Min     Max     Throughput
1       19      13      41      45.0
10      85      12      376     80.9
20      179     13      851     84.7
40      435     14      2175    77.1
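In case summarizer.py isn't at hand, a rough sketch of that kind of summary is easy to write. The column layout assumed here (timeStamp, elapsed, label as the first three CSV columns) and the throughput formula (samples per second over the timestamp span) are assumptions of mine; adjust them to match your jmeter.properties save-service settings and the real script:

```python
import csv
from collections import defaultdict

def summarize(samples):
    """samples: iterable of (timestamp_ms, elapsed_ms, label) tuples.
    Returns {label: (avg_ms, min_ms, max_ms, throughput_per_s)}."""
    by_label = defaultdict(list)
    for ts, elapsed, label in samples:
        by_label[label].append((ts, elapsed))
    table = {}
    for label, rows in by_label.items():
        elapsed = [e for _, e in rows]
        stamps = [t for t, _ in rows]
        span_ms = (max(stamps) - min(stamps)) or 1  # single sample: avoid /0
        table[label] = (sum(elapsed) // len(elapsed), min(elapsed),
                        max(elapsed), len(rows) * 1000.0 / span_ms)
    return table

def read_jtl(path):
    """Yield (timestamp, elapsed, label) from a CSV .jtl, assuming
    those are the first three columns."""
    with open(path, newline="") as f:
        for rec in csv.reader(f):
            yield int(rec[0]), int(rec[1]), rec[2]
```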

Now, it would be cool if we had a little wrapper that runs
both, something where one can call:

runbench script.jmx

and get the table as an output in one shot.
Any taker? ;-)
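A minimal sketch of such a wrapper, assuming jmeter is on the PATH and summarizer.py sits in the current directory (both names taken from the commands shown earlier; the results-file naming convention is mine):

```python
import subprocess
import sys

def jtl_name(jmx_path):
    """script.jmx -> script_results.jtl, mirroring the command shown above."""
    return jmx_path.rsplit(".", 1)[0] + "_results.jtl"

def runbench(jmx_path, props="jmeter.properties"):
    """Run the plan in non-GUI mode, then print the summary table."""
    jtl = jtl_name(jmx_path)
    # Non-GUI run with the same flags as the earlier command line
    subprocess.check_call(
        ["jmeter", "-p", props, "-n", "-t", jmx_path, "-l", jtl])
    # Hand the results file to the summarizer from this post
    subprocess.check_call(["./summarizer.py", jtl])

if __name__ == "__main__" and len(sys.argv) > 1:
    runbench(sys.argv[1])
```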

Cheers
Andrea


#---------------------------------------------------------------------------
# Results file configuration
#---------------------------------------------------------------------------

# This section helps determine how result data will be saved.
# The commented out values are the defaults.

# legitimate values: xml, csv, db.  Only xml and csv are currently
# supported.
jmeter.save.saveservice.output_format=csv


# true when field should be saved; false otherwise

# assertion_results_failure_message only affects CSV output
#jmeter.save.saveservice.assertion_results_failure_message=false
#
#jmeter.save.saveservice.data_type=true
#jmeter.save.saveservice.label=true
#jmeter.save.saveservice.response_code=true
# response_data is not currently supported for CSV output
#jmeter.save.saveservice.response_data=false
# Save ResponseData for failed samples
#jmeter.save.saveservice.response_data.on_error=false
#jmeter.save.saveservice.response_message=true
#jmeter.save.saveservice.successful=true
#jmeter.save.saveservice.thread_name=true
#jmeter.save.saveservice.time=true
#jmeter.save.saveservice.subresults=true
#jmeter.save.saveservice.assertions=true
#jmeter.save.saveservice.latency=true
#jmeter.save.saveservice.samplerData=false
#jmeter.save.saveservice.responseHeaders=false
#jmeter.save.saveservice.requestHeaders=false
#jmeter.save.saveservice.encoding=false
#jmeter.save.saveservice.bytes=true
jmeter.save.saveservice.url=true
#jmeter.save.saveservice.filename=false
#jmeter.save.saveservice.hostname=false
#jmeter.save.saveservice.thread_counts=false
#jmeter.save.saveservice.sample_count=false

# Timestamp format
# legitimate values: none, ms, or a format suitable for SimpleDateFormat
#jmeter.save.saveservice.timestamp_format=ms
#jmeter.save.saveservice.timestamp_format=MM/dd/yy HH:mm:ss

# Put the start time stamp in logs instead of the end
sampleresult.timestamp.start=true


# legitimate values: none, first, all
#jmeter.save.saveservice.assertion_results=none

# For use with Comma-separated value (CSV) files or other formats
# where the fields' values are separated by specified delimiters.
# Default:
#jmeter.save.saveservice.default_delimiter=,
# For TAB, since JMeter 2.3 one can use:
#jmeter.save.saveservice.default_delimiter=\t

#jmeter.save.saveservice.print_field_names=false

# Optional list of JMeter variable names whose values are to be saved in the
# result data files.
# Use commas to separate the names. For example:
#sample_variables=SESSION_ID,REFERENCE
# N.B. The current implementation saves the values in XML as attributes,
# so the names must be valid XML names.
# Versions of JMeter after 2.3.2 send the variable to all servers
# to ensure that the correct data is available at the client.

# Optional xml processing instruction for line 2 of the file:
#jmeter.save.saveservice.xml_pi=<?xml-stylesheet type="text/xsl" href="sample.xsl"?>

Re: [Benchmarking] Running the tests on the command line and summarizing results

Posted by sebb <se...@gmail.com>.
On 15/09/2009, Shanmugam Ganeshkumar <sh...@gmail.com> wrote:
> List,
>  This was posted in one of the list that does special benchmarking.
>
>  Can some expert on jmeter help us to get it right.
>
>  What we need is a table output when we run it from command prompt. How does
>  this gets executed ?
>
>  basically, how does the listener summary_report is saved while running the
>  test on command?

Sorry, that's not possible. The GUI part of listeners is deliberately not
invoked in command-line (non-GUI) mode, and that is where the
summarising is done.

---------------------------------------------------------------------
To unsubscribe, e-mail: jmeter-user-unsubscribe@jakarta.apache.org
For additional commands, e-mail: jmeter-user-help@jakarta.apache.org