Posted to issues@beam.apache.org by "Ismaël Mejía (Jira)" <ji...@apache.org> on 2020/05/07 15:17:00 UTC
[jira] [Commented] (BEAM-9905) python nexmark benchmark suite metrics
[ https://issues.apache.org/jira/browse/BEAM-9905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17101765#comment-17101765 ]
Ismaël Mejía commented on BEAM-9905:
------------------------------------
Mmm, isn't this part of just implementing the queries?
> python nexmark benchmark suite metrics
> --------------------------------------
>
> Key: BEAM-9905
> URL: https://issues.apache.org/jira/browse/BEAM-9905
> Project: Beam
> Issue Type: Sub-task
> Components: benchmarking-py, testing-nexmark
> Reporter: Yichi Zhang
> Priority: Major
>
> Ensure we can collect output metrics from the query pipelines, such as:
> execution time, processing event rate, number of results, as well as invalid auctions/bids, …
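The metrics listed in the description could be tracked per query run and derived from a handful of raw counts and timestamps. As a rough, runner-independent sketch (all class and attribute names here are hypothetical, not taken from the Beam codebase; a real implementation would likely report these through Beam's Metrics API instead):

```python
import time


class QueryMetrics:
    """Hypothetical per-query metrics collector for the values named in
    the issue: execution time, processing event rate, number of results,
    and invalid auctions/bids."""

    def __init__(self):
        self.start = None          # wall-clock start of the pipeline run
        self.end = None            # wall-clock end of the pipeline run
        self.num_events = 0        # events fed into the query
        self.num_results = 0       # results emitted by the query
        self.invalid_events = 0    # malformed auctions/bids dropped

    def start_timer(self):
        self.start = time.monotonic()

    def stop_timer(self):
        self.end = time.monotonic()

    @property
    def execution_time(self):
        """Total pipeline runtime in seconds."""
        return self.end - self.start

    @property
    def event_rate(self):
        """Events processed per second of pipeline runtime."""
        return self.num_events / self.execution_time
```

After a run, the suite could print or export `execution_time`, `event_rate`, `num_results`, and `invalid_events` for each query, which matches the set of outputs the issue asks for.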
--
This message was sent by Atlassian Jira
(v8.3.4#803005)