Posted to issues@beam.apache.org by "Kenneth Knowles (Jira)" <ji...@apache.org> on 2021/05/15 17:59:02 UTC

[jira] [Updated] (BEAM-10306) Add latency measurement to Python benchmarks

     [ https://issues.apache.org/jira/browse/BEAM-10306?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kenneth Knowles updated BEAM-10306:
-----------------------------------
    Resolution: Fixed
        Status: Resolved  (was: Resolved)

Hello! Due to a bug in our Jira configuration, this issue had status:Resolved but resolution:Unresolved.

I am bulk-editing these issues to have resolution:Fixed.

If a different resolution is appropriate, please change it: click the "Resolve" button (this works even for closed issues) and set the Resolution field to the right value.

> Add latency measurement to Python benchmarks
> --------------------------------------------
>
>                 Key: BEAM-10306
>                 URL: https://issues.apache.org/jira/browse/BEAM-10306
>             Project: Beam
>          Issue Type: Task
>          Components: benchmarking-py, build-system, runner-flink
>            Reporter: Maximilian Michels
>            Assignee: Maximilian Michels
>            Priority: P2
>             Fix For: 2.24.0
>
>          Time Spent: 4h 50m
>  Remaining Estimate: 0h
>
> There are currently no latency metrics in the load tests, which makes it impossible to monitor latency regressions. A rough sketch of the idea follows below.
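
The actual change is in the linked issue above; purely as an illustration of the idea, here is a minimal sketch of how per-element latency could be recorded in a Beam Python pipeline with a Distribution metric. The DoFn name, metric namespace, and metric name are hypothetical, not the ones used by Beam's load tests.

    import time

    import apache_beam as beam
    from apache_beam.metrics import Metrics


    class MeasureLatency(beam.DoFn):
        """Hypothetical DoFn: reports event-time-to-processing-time lag."""

        def __init__(self):
            # 'loadtest'/'latency_ms' are illustrative names, not Beam's own.
            self.latency_ms = Metrics.distribution('loadtest', 'latency_ms')

        def process(self, element, timestamp=beam.DoFn.TimestampParam):
            # Latency = wall-clock time now minus the element's event timestamp.
            now_ms = int(time.time() * 1000)
            self.latency_ms.update(now_ms - timestamp.micros // 1000)
            yield element

Dropped in after the source read (e.g. ... | beam.ParDo(MeasureLatency()) | ...), the distribution's min/max/mean can then be read from the runner's metrics (e.g. on Flink) and published to the benchmark dashboards to track regressions over time.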



--
This message was sent by Atlassian Jira
(v8.3.4#803005)