Posted to issues@beam.apache.org by "ASF GitHub Bot (Jira)" <ji...@apache.org> on 2021/10/05 00:29:00 UTC

[jira] [Work logged] (BEAM-12950) Missing events when using Python WriteToFiles in streaming pipeline

     [ https://issues.apache.org/jira/browse/BEAM-12950?focusedWorklogId=660003&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660003 ]

ASF GitHub Bot logged work on BEAM-12950:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 05/Oct/21 00:28
            Start Date: 05/Oct/21 00:28
    Worklog Time Spent: 10m 
      Work Description: pabloem commented on pull request #15576:
URL: https://github.com/apache/beam/pull/15576#issuecomment-933959656


   Sorry about the delay on this. Currently we're removing files at an erroneous time. I think we need to delete orphaned files in a subsequent step (after we're sure that there won't be retries). But for now, and since Beam 2.34.0 is around the corner, I think we can move forward with this, because we want to make sure no data is dropped. I will review this by tomorrow and merge it if it looks good.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


Issue Time Tracking
-------------------

    Worklog Id:     (was: 660003)
    Time Spent: 1h 50m  (was: 1h 40m)

> Missing events when using Python WriteToFiles in streaming pipeline
> -------------------------------------------------------------------
>
>                 Key: BEAM-12950
>                 URL: https://issues.apache.org/jira/browse/BEAM-12950
>             Project: Beam
>          Issue Type: Bug
>          Components: io-py-files
>    Affects Versions: 2.32.0
>            Reporter: David
>            Priority: P1
>          Time Spent: 1h 50m
>  Remaining Estimate: 0h
>
> We have a Python streaming pipeline consuming events from PubSub and writing them into GCS, in Dataflow.
> After performing some tests, we realized that we were missing events. The reason was that some files were being deleted from the temporary folder before moving them to the destination.
> In the logs we can see that there might be a race condition: the code checks whether the file exists in the temp folder before it has actually been created, so the file is not moved to the destination. Afterwards, the file is considered orphaned and is deleted from the temp folder in this line: [https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/fileio.py#L677].
> Since files are moved from the temp folder to the final destination, they shouldn’t be deleted in any case; otherwise we lose events. What purpose do “orphaned files” serve? Should they exist at all?
> We know that the Python SDK's WriteToFiles is still experimental, but missing events looks like a big deal, which is why I've filed this as P1. If you think the priority should be lowered, let me know.
> The easiest and safest approach right now would be not to delete any files, and instead to log a message warning that some files might be left orphaned in the temporary folder. Eventually, on the next execution, an orphaned file will be moved to the destination, so we don’t lose any events. In addition, the log level should always be INFO, because Dataflow doesn’t accept DEBUG-level logging.
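
The mitigation proposed above (log instead of delete) could be sketched roughly as follows. Note this is an illustrative stand-in, not actual Beam code: `handle_orphaned_files` is a hypothetical replacement for the deletion step at fileio.py#L677.

```python
import logging


def handle_orphaned_files(orphaned_files):
    """Hypothetical replacement for the temp-file deletion step.

    Instead of deleting temp files that look orphaned (which can race with
    file creation and drop data), log them at INFO level so that a later
    pipeline execution can still move them to the final destination.
    """
    if orphaned_files:
        logging.info(
            'Some files may be left orphaned in the temporary folder: %s. '
            'They will be moved to the destination on a later execution.',
            orphaned_files)
    # Intentionally no delete call here: deleting risks losing events.
```

The key design point is that leaving a stray temp file behind is recoverable (a later run moves it), whereas deleting a file that was about to be finalized is permanent data loss.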



--
This message was sent by Atlassian Jira
(v8.3.4#803005)