Posted to issues@beam.apache.org by "Beam JIRA Bot (Jira)" <ji...@apache.org> on 2021/05/16 17:20:02 UTC

[jira] [Commented] (BEAM-9917) BigQueryBatchFileLoads dynamic destination

    [ https://issues.apache.org/jira/browse/BEAM-9917?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17345756#comment-17345756 ] 

Beam JIRA Bot commented on BEAM-9917:
-------------------------------------

This issue is assigned but has not received an update in 30 days so it has been labeled "stale-assigned". If you are still working on the issue, please give an update and remove the label. If you are no longer working on the issue, please unassign so someone else may work on it. In 7 days the issue will be automatically unassigned.

> BigQueryBatchFileLoads dynamic destination
> ------------------------------------------
>
>                 Key: BEAM-9917
>                 URL: https://issues.apache.org/jira/browse/BEAM-9917
>             Project: Beam
>          Issue Type: Bug
>          Components: io-py-gcp
>    Affects Versions: 2.17.0
>            Reporter: Tord Sætren
>            Assignee: Pablo Estrada
>            Priority: P2
>              Labels: beginner, documentation, stale-assigned
>
> I am trying to use BigQueryBatchFileLoads to load data from Pub/Sub into BigQuery. It works fine for a single table, but when I try to use a dynamic destination such as
> destination=lambda elem: "my_project:my_dataset." + elem["sensor_key"],
> it just creates a new set of tables each time the triggering_frequency fires. I know it creates temporary tables before loading everything into the final destination, but that load step never happens; it just keeps creating more and more tables.
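
For context, a minimal sketch of the kind of pipeline described above, written against the public beam.io.WriteToBigQuery transform with method=FILE_LOADS (which wraps BigQueryBatchFileLoads internally) rather than the internal transform itself. The project, dataset, topic, bucket and schema names are placeholders chosen for illustration, not values from the report:

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Streaming pipeline; a real run would also need --runner, --project,
    # --temp_location, etc. on the command line or in the options.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (p
         | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
               topic="projects/my_project/topics/my_topic")  # placeholder topic
         | "ParseJson" >> beam.Map(json.loads)
         | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
               # Dynamic destination: one table per sensor_key value.
               table=lambda elem: "my_project:my_dataset." + elem["sensor_key"],
               schema="sensor_key:STRING,value:FLOAT,timestamp:TIMESTAMP",  # placeholder schema
               method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
               triggering_frequency=300,  # seconds between load jobs
               custom_gcs_temp_location="gs://my_bucket/bq_temp",  # placeholder bucket
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))

In streaming mode, triggering_frequency controls how often the accumulated files are submitted as BigQuery load jobs; the behaviour described in the report is that the intermediate tables created on each trigger are never consolidated into the final per-sensor tables.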



--
This message was sent by Atlassian Jira
(v8.3.4#803005)