Posted to issues@beam.apache.org by "Yichi Zhang (Jira)" <ji...@apache.org> on 2020/05/07 16:20:00 UTC
[jira] [Updated] (BEAM-9905) python nexmark benchmark suite metrics
[ https://issues.apache.org/jira/browse/BEAM-9905?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yichi Zhang updated BEAM-9905:
------------------------------
Description:
Ensure we can collect output metrics from the query pipelines with the Jenkins test infra, such as:
execution time, processing event rate, number of results, and number of invalid auctions/bids, …
was:
Ensure we can collect output metrics from the query pipelines such as:
execution time, processing event rate, number of results, also invalid auctions/bids, …
> python nexmark benchmark suite metrics
> --------------------------------------
>
> Key: BEAM-9905
> URL: https://issues.apache.org/jira/browse/BEAM-9905
> Project: Beam
> Issue Type: Sub-task
> Components: benchmarking-py, testing-nexmark
> Reporter: Yichi Zhang
> Priority: Major
>
> Ensure we can collect output metrics from the query pipelines with the Jenkins test infra, such as:
> execution time, processing event rate, number of results, and number of invalid auctions/bids, …
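> The metrics listed above reduce to simple arithmetic over per-run pipeline data. A minimal sketch, in plain Python and independent of the Beam SDK, of how one query run could be summarized for publishing; the `QueryRunResult` class and all field names are hypothetical, not part of Beam:

```python
from dataclasses import dataclass


@dataclass
class QueryRunResult:
    """Hypothetical container for one Nexmark query run (names are illustrative)."""
    query: str
    start: float            # wall-clock start time, seconds
    end: float              # wall-clock end time, seconds
    num_events: int         # events fed into the pipeline
    num_results: int        # results produced by the query
    invalid_auctions: int = 0
    invalid_bids: int = 0

    @property
    def runtime_sec(self) -> float:
        """Execution time of the run."""
        return self.end - self.start

    @property
    def event_rate(self) -> float:
        """Processing rate in events per second; 0.0 guards a zero runtime."""
        return self.num_events / self.runtime_sec if self.runtime_sec > 0 else 0.0


def summarize(run: QueryRunResult) -> dict:
    """Flatten a run into a dict of the metrics a Jenkins job could publish."""
    return {
        "query": run.query,
        "runtime_sec": run.runtime_sec,
        "event_rate_per_sec": run.event_rate,
        "num_results": run.num_results,
        "invalid_auctions": run.invalid_auctions,
        "invalid_bids": run.invalid_bids,
    }


if __name__ == "__main__":
    run = QueryRunResult(query="query0", start=0.0, end=2.0,
                         num_events=100_000, num_results=100_000)
    print(summarize(run))
```

> In the actual query pipelines, the raw counts would more naturally come from Beam's own metrics facility (counters registered via `apache_beam.metrics.Metrics` and queried off the `PipelineResult` after `wait_until_finish()`), with a helper like the one above only shaping them for the test infra.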
--
This message was sent by Atlassian Jira
(v8.3.4#803005)