Posted to dev@lucene.apache.org by "Ishan Chattopadhyaya (JIRA)" <ji...@apache.org> on 2017/03/19 22:16:41 UTC

[jira] [Updated] (SOLR-10317) Solr Nightly Benchmarks

     [ https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ishan Chattopadhyaya updated SOLR-10317:
----------------------------------------
    Description: 
Solr needs nightly benchmark reporting. Similar benchmarks for Lucene can be found here: https://home.apache.org/~mikemccand/lucenebench/.

Ideally, we need:
# A suite of benchmarks that builds Solr from a given commit point, starts Solr nodes in both SolrCloud and standalone mode, and records timing information for operations such as indexing, querying, faceting, grouping, replication, etc. (a minimal sketch of such a timed operation follows this list).
# It should be possible to run them either as an independent suite or as a Jenkins job, and we should be able to report timings as graphs (Jenkins has some charting plugins; see the reporting sketch at the end of this description).
# The code should eventually be integrated into the Solr codebase so that it never goes out of date.
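
As a rough illustration of the kind of timing harness item 1 implies (a sketch only, not code from any of the existing frameworks; the base URL, collection name "bench", field names, and document count are placeholder assumptions), a benchmark could use SolrJ along these lines:

{code:java}
import java.util.concurrent.TimeUnit;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class IndexingQueryBenchmark {

  public static void main(String[] args) throws Exception {
    // Assumes a Solr node is already running and a collection/core named
    // "bench" exists; both the URL and the name are placeholders.
    try (SolrClient client =
        new HttpSolrClient.Builder("http://localhost:8983/solr/bench").build()) {

      // Time a simple bulk-indexing pass.
      long start = System.nanoTime();
      for (int i = 0; i < 100_000; i++) {
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", Integer.toString(i));
        doc.addField("title_s", "document " + i);
        client.add(doc);
      }
      client.commit();
      long indexMs = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);

      // Time a simple query against the freshly built index.
      start = System.nanoTime();
      client.query(new SolrQuery("title_s:document*"));
      long queryMs = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);

      System.out.println("indexing_ms=" + indexMs + " query_ms=" + queryMs);
    }
  }
}
{code}

A real suite would of course batch updates, vary document sizes, exercise faceting/grouping/replication, and repeat runs to reduce noise; the point here is only the shape of a timed operation.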

There is some prior work / discussion:
# https://github.com/shalinmangar/solr-perf-tools (Shalin)
# https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md (Ishan/Vivek)
# SOLR-2646 (Mark Miller)
# https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)

Some of the frameworks above already support building, starting, indexing/querying, and stopping Solr; however, the benchmarks they run are very limited. Any of them can serve as a starting point, or a new framework can be written instead. The motivation is to cover every piece of Solr functionality with a corresponding benchmark that runs every night.
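
On the reporting side (item 2 above), one simple way to make nightly timings graphable is for each run to append a row to a flat CSV file that a Jenkins charting plugin, or a small static page like the Lucene charts linked above, can plot over time. A minimal sketch, assuming an illustrative file name and column layout (not an agreed format):

{code:java}
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.time.LocalDate;

public class NightlyReport {

  /** Appends one row per nightly run; the columns here are only examples. */
  static void appendRun(Path csv, long indexMs, long queryMs) throws IOException {
    if (Files.notExists(csv)) {
      // Write a header row the first time the file is created.
      Files.write(csv, "date,indexing_ms,query_ms\n".getBytes(StandardCharsets.UTF_8));
    }
    String row = LocalDate.now() + "," + indexMs + "," + queryMs + "\n";
    Files.write(csv, row.getBytes(StandardCharsets.UTF_8),
        StandardOpenOption.CREATE, StandardOpenOption.APPEND);
  }

  public static void main(String[] args) throws IOException {
    // Placeholder numbers; in practice these would come from the benchmark run.
    appendRun(Paths.get("nightly-benchmarks.csv"), 123456L, 87L);
  }
}
{code}

One data point per night, keyed by date, roughly mirrors how the Lucene nightly charts are built and keeps the reporting format independent of any particular charting tool.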

  was:
Solr needs nightly benchmark reporting. Similar benchmarks for Lucene can be found here: https://home.apache.org/~mikemccand/lucenebench/.

Ideally, we need:
# A suite of benchmarks that builds Solr from a given commit point, starts Solr nodes in both SolrCloud and standalone mode, and records timing information for operations such as indexing, querying, faceting, grouping, replication, etc.
# It should be possible to run them via Jenkins, and we should be able to leverage some reporting/charting plugins.
# The code should eventually be integrated into the Solr codebase so that it never goes out of date.

There is some prior work / discussion:
# https://github.com/shalinmangar/solr-perf-tools (Shalin)
# https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md (Ishan/Vivek)
# SOLR-2646 (Mark Miller)
# https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)

Some of the frameworks above already support building, starting, indexing/querying, and stopping Solr; however, the benchmarks they run are very limited. Any of them can serve as a starting point, or a new framework can be written instead. The motivation is to cover every piece of Solr functionality with a corresponding benchmark that runs every night.


> Solr Nightly Benchmarks
> -----------------------
>
>                 Key: SOLR-10317
>                 URL: https://issues.apache.org/jira/browse/SOLR-10317
>             Project: Solr
>          Issue Type: Task
>      Security Level: Public (Default Security Level. Issues are Public)
>            Reporter: Ishan Chattopadhyaya
>              Labels: gsoc2017, mentor
>



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
For additional commands, e-mail: dev-help@lucene.apache.org