Posted to user@jmeter.apache.org by chaitanya bhatt <bh...@gmail.com> on 2015/04/11 09:43:16 UTC

Automatic Grafana/InfluxDB dashboard generation for JMeter scripts

JMeter Users,

I have created an open-source Grafana dashboard generator tool for JMeter
scripts. The program automatically generates a graph panel, backed by an
InfluxDB query, for every HTTP sampler in a JMeter .jmx script.
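
Roughly speaking, each generated panel wraps one InfluxDB query per sampler,
something along these lines (the series name, value column and time window
here are only placeholders, and the exact syntax depends on your InfluxDB
version):

    select mean(value) from "jmeter.Login_Request.response_time"
    where time > now() - 1h group by time(10s)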

Download Binary:
https://github.com/bhattchaitanya/Grafana-Dashboard-Generator/releases/download/V.1.0/GenerateDashboard.jar

Screenshot:
https://github.com/bhattchaitanya/Grafana-Dashboard-Generator/releases

Source Code: https://github.com/bhattchaitanya/Grafana-Dashboard-Generator

Wiki: https://github.com/bhattchaitanya/Grafana-Dashboard-Generator/wiki

Let me know your thoughts on this tool, or any ideas to improve it.

Thanks
Chaitanya M Bhatt
http://www.performancecompetence.com/


On Fri, Apr 10, 2015 at 1:45 PM, Glenn Caccia <ga...@yahoo.com.invalid>
wrote:

>
> Thinking about this more, you could use a dynamic rootMetricsPrefix,
> something like:
>
> jmeter.${__TestPlanName}.${__time}.
>
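> For example, with a format string (the __time function takes an optional
> SimpleDateFormat pattern) each run gets a readable, distinct prefix,
> something like:
>
> jmeter.${__TestPlanName}.${__time(yyyyMMdd-HHmmss)}.
>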
> That could then be used across all scripts and would satisfy the basic
> requirement from a storage perspective, but Grafana itself still can't
> easily handle the requirement from a display perspective.  Since queries
> are hard-coded into a graph, you'd be stuck either making a new dashboard
> for each test run or manually editing an existing one for each run.  It
> would be a mess to work with.
>        From: Glenn Caccia <ga...@yahoo.com.INVALID>
>  To: JMeter Users List <us...@jmeter.apache.org>
>  Sent: Friday, April 10, 2015 1:23 PM
>  Subject: Re: Thoughts on InfluxDB/Grafana integration
>
>
> You could do that, but it would require remembering to change the root
> value each time you did a new run, and then changing your dashboard
> queries each time to pick up the new run.  I
> don't think that's a solution I would want to maintain.  I would definitely
> use variations on the rootMetricsPrefix to distinguish between test
> scripts, however.  The InfluxDB/Grafana solution is great for real-time
> analysis, which is certainly important, but seems to fall short on the need
> to easily compare runs.
>        From: Philippe Mouawad <ph...@gmail.com>
>
>
>  To: JMeter Users List <us...@jmeter.apache.org>
>  Sent: Friday, April 10, 2015 11:54 AM
>  Subject: Re: Thoughts on InfluxDB/Grafana integration
>
> Hi,
> What about playing with rootMetricsPrefix to do that?
>
> Regarding SQL, did you know that you can now easily build a JDBC backend
> to store results in a database? You could contribute this to core.
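>
> A minimal sketch of what such a JDBC backend could look like, assuming the
> BackendListenerClient API added in JMeter 2.13 (the class name, table and
> parameters below are hypothetical, and the results table is assumed to
> exist already):
>
>     import java.sql.Connection;
>     import java.sql.DriverManager;
>     import java.sql.PreparedStatement;
>     import java.util.List;
>
>     import org.apache.jmeter.config.Arguments;
>     import org.apache.jmeter.samplers.SampleResult;
>     import org.apache.jmeter.visualizers.backend.AbstractBackendListenerClient;
>     import org.apache.jmeter.visualizers.backend.BackendListenerContext;
>
>     public class JdbcBackendListenerClient extends AbstractBackendListenerClient {
>
>         private Connection connection;
>         private PreparedStatement insert;
>
>         @Override
>         public Arguments getDefaultParameters() {
>             // Parameters shown in the Backend Listener GUI; values are examples only
>             Arguments args = new Arguments();
>             args.addArgument("jdbcUrl", "jdbc:mysql://localhost:3306/jmeter");
>             args.addArgument("jdbcUser", "jmeter");
>             args.addArgument("jdbcPassword", "");
>             args.addArgument("runId", "project-A_test-plan-B_run-1");
>             return args;
>         }
>
>         @Override
>         public void setupTest(BackendListenerContext context) throws Exception {
>             connection = DriverManager.getConnection(
>                     context.getParameter("jdbcUrl"),
>                     context.getParameter("jdbcUser"),
>                     context.getParameter("jdbcPassword"));
>             // Assumes a pre-created table: jmeter_results(run_id, label, ts, elapsed_ms, success)
>             insert = connection.prepareStatement(
>                     "insert into jmeter_results (run_id, label, ts, elapsed_ms, success)"
>                     + " values (?, ?, ?, ?, ?)");
>         }
>
>         @Override
>         public void handleSampleResults(List<SampleResult> results, BackendListenerContext context) {
>             try {
>                 for (SampleResult result : results) {
>                     insert.setString(1, context.getParameter("runId"));
>                     insert.setString(2, result.getSampleLabel());
>                     insert.setLong(3, result.getTimeStamp());
>                     insert.setLong(4, result.getTime());
>                     insert.setBoolean(5, result.isSuccessful());
>                     insert.addBatch();
>                 }
>                 insert.executeBatch();
>             } catch (Exception e) {
>                 throw new RuntimeException("Failed to write sample results", e);
>             }
>         }
>
>         @Override
>         public void teardownTest(BackendListenerContext context) throws Exception {
>             insert.close();
>             connection.close();
>         }
>     }
>
> Storing a run id with every row is what would make "compare the prior run
> with the current run" a plain SQL query later on.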
>
>
> Regards
>
>
>
> On Friday, April 10, 2015, Glenn Caccia <ga...@yahoo.com.invalid>
> wrote:
>
> >  I've successfully installed InfluxDB and Grafana and did some basic
> > testing where I can now see results in Grafana.  I'm beginning to wonder
> > about the benefits of this system.  A while ago I had toyed around with
> > the idea of using Elasticsearch as a backend for JMeter test results and
> > using Kibana to view results.  I ultimately dropped the idea because of
> > the limitations of how data is structured.  I see the exact same issue
> > with InfluxDB and Grafana (either that, or I don't fully understand what
> > can be done in these tools).
> >
> > What I want when viewing results is the ability to work with results in
> > terms of projects, test plans, and results from a particular test run.
> > For example, I want to see results for project A, test plan B and compare
> > results from the prior run with the current run.  With the
> > InfluxDB/Grafana solution, there is no concept of a run.  If I run a test
> > one day and then run the same test the subsequent day, I can't compare
> > the results using the same view.  I can certainly change my time filter
> > to see both inline (with a big gap in between) or view one and then view
> > the other, but I can't stack them in separate graphs and see them at the
> > same time or display them in the same graph.  Likewise, if I want to see
> > what performance was like the last time a test was run and I don't know
> > when the last test was run, I have to do a bit of searching by playing
> > with the time filter.
> >
> > A while ago I worked for a company that used SQL Server for a lot of
> > their data storage needs.  This gave me access to the SQL Server Report
> > Builder tool.  I was able to create a solution where JMeter results were
> > loaded into SQL Server and we had a report interface where you could
> > choose your project, choose your test plan and then see the dates/times
> > for all prior runs.  From this, you could choose which run(s) to view.  I
> > don't have access to tools like that with my current company, but I miss
> > that kind of ability to structure and access test results.  A similar
> > approach to storing and presenting results can be seen with Loadosophia.
> >
> > In short, it seems like this new solution is primarily useful for
> > analyzing results from a current test run (which can already be done with
> > existing listeners) but is not as useful a tool for comparing results or
> > checking on results from prior runs.  Am I missing something or is that a
> > fair conclusion?
> >
>
>
> --
> Regards.
> Philippe Mouawad.
>