Posted to issues@beam.apache.org by "Pablo Estrada (Jira)" <ji...@apache.org> on 2020/05/12 01:12:00 UTC

[jira] [Commented] (BEAM-6514) Dataflow Batch Job Failure is leaving Datasets/Tables behind in BigQuery

    [ https://issues.apache.org/jira/browse/BEAM-6514?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17104994#comment-17104994 ] 

Pablo Estrada commented on BEAM-6514:
-------------------------------------

This seems to have been noticed by others here: [https://stackoverflow.com/questions/61658242/dataprep-is-leaving-datasets-tables-behind-in-bigquery]

> Dataflow Batch Job Failure is leaving Datasets/Tables behind in BigQuery
> ------------------------------------------------------------------------
>
>                 Key: BEAM-6514
>                 URL: https://issues.apache.org/jira/browse/BEAM-6514
>             Project: Beam
>          Issue Type: Bug
>          Components: io-java-gcp
>            Reporter: Rumeshkrishnan Mohan
>            Assignee: Chamikara Madhusanka Jayalath
>            Priority: Major
>
> Dataflow is leaving Datasets/Tables behind in BigQuery when the pipeline is cancelled or when it fails. I cancelled a job or it failed at run time, and it left behind a dataset and table in BigQuery.
>  # The `cleanupTempResource` method cleans up the temporary tables and dataset after the batch job succeeds.
>  # If the job fails midway or is cancelled explicitly, the temporary dataset and tables are left behind. I do see a table expiration period of 1 day set in the `getTableToExtract` function in BigQueryQuerySource.java.
>  # I understand that keeping the temp tables and dataset on failure can be useful for debugging.
>  # Can we have an optional pipeline or job parameter that cleans up the temporary dataset and tables when the job is cancelled or fails? (See the sketch below this quoted description.)
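As a stopgap until such an option exists, below is a minimal sketch of a manual cleanup using the google-cloud-bigquery Java client. The "temp_dataset_" prefix used to identify Beam's leftover temporary datasets is an assumption for illustration, not something confirmed in this issue; verify the actual naming convention in your project before deleting anything.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Dataset;

public class CleanupTempDatasets {
  public static void main(String[] args) {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
    // Scan all datasets in the default project and delete the ones that look
    // like leftover temporary datasets. The prefix below is an assumption.
    for (Dataset dataset : bigquery.listDatasets().iterateAll()) {
      String id = dataset.getDatasetId().getDataset();
      if (id.startsWith("temp_dataset_")) {
        // deleteContents() also removes any temp tables still inside the dataset.
        bigquery.delete(dataset.getDatasetId(), BigQuery.DatasetDeleteOption.deleteContents());
        System.out.println("Deleted leftover temporary dataset: " + id);
      }
    }
  }
}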



--
This message was sent by Atlassian Jira
(v8.3.4#803005)