Posted to user@beam.apache.org by Mark Striebeck <ma...@gmail.com> on 2022/06/16 20:18:06 UTC

Credentials for BigQuery IO Connector

Hi,

We are using the BigQuery IO connector to read and write data in our
data pipeline. The pipeline runs on Dataflow.
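
For context, here is roughly what the pipeline looks like in Python
(the project, dataset, and table names below are placeholders):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner="DataflowRunner",
        project="our-project",                # placeholder
        region="us-central1",                 # placeholder
        temp_location="gs://our-bucket/tmp",  # placeholder
    )

    with beam.Pipeline(options=options) as p:
        (p
         | "Read from provider" >> beam.io.ReadFromBigQuery(
               table="provider-project:provider_dataset.events")
         | "Write to our table" >> beam.io.WriteToBigQuery(
               "our-project:our_dataset.events",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))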

We want to use two different service accounts: one for reading (from a
data provider) and one for writing (into our own tables). But it seems
as if the only way to pass credentials to a Dataflow worker is to bake
the service account key into the Docker image and set
GOOGLE_APPLICATION_CREDENTIALS in that image to point at the key file.

That doesn't give us the ability to use different credentials for
different operations.
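
To illustrate, this is what the workers end up doing today, as far as
we can tell (the key path is just a placeholder):

    from google.auth import default

    # The worker image sets (illustrative path):
    #   GOOGLE_APPLICATION_CREDENTIALS=/secrets/service-account.json
    # so Application Default Credentials resolves every BigQuery
    # client, for reads and writes alike, to that single identity:
    credentials, project = default()
    print(credentials.service_account_email)

Both ReadFromBigQuery and WriteToBigQuery appear to fall back to these
worker-level credentials, so we see no obvious hook for a second
service account.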

Any ideas or pointers?

     Mark