Posted to issues@beam.apache.org by "ahalya (Jira)" <ji...@apache.org> on 2021/11/06 10:56:00 UTC

[jira] [Created] (BEAM-13194) Call the stored procedure in dataflow pipeline

ahalya created BEAM-13194:
-----------------------------

             Summary: Call the stored procedure in dataflow pipeline
                 Key: BEAM-13194
                 URL: https://issues.apache.org/jira/browse/BEAM-13194
             Project: Beam
          Issue Type: Bug
          Components: examples-python
            Reporter: ahalya


[stackoverflow|https://stackoverflow.com/questions/69863008/call-the-bigquery-stored-procedure-in-dataflow-pipeline]

I have written a stored procedure in BigQuery and am trying to call it within a Dataflow pipeline. This works for plain {{SELECT}} queries but not for the stored procedure:
import apache_beam as beam

pipeLine = beam.Pipeline(options=options)  # `options` holds the pipeline's PipelineOptions
rawdata = (
    pipeLine
    | beam.io.ReadFromBigQuery(
        query="CALL my_dataset.create_customer()", use_standard_sql=True)
)
pipeLine.run().wait_until_finish()

Stored procedure:
CREATE OR REPLACE PROCEDURE my_dataset.create_customer()
BEGIN
    SELECT * 
    FROM `project_name.my_dataset.my_table` 
    WHERE customer_name LIKE "%John%"
    ORDER BY created_time
    LIMIT 5;
END;
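
For reference, the same {{ReadFromBigQuery}} call reads without error when the query argument is the plain {{SELECT}} that the procedure wraps rather than the {{CALL}} statement; a minimal sketch, reusing the pipeline and names from the snippet above:

# Same pipeline as above, but passing the procedure's SELECT directly; this form works.
rawdata = (
    pipeLine
    | beam.io.ReadFromBigQuery(
        query="""
            SELECT *
            FROM `project_name.my_dataset.my_table`
            WHERE customer_name LIKE "%John%"
            ORDER BY created_time
            LIMIT 5
        """,
        use_standard_sql=True)
)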

I am able to create the stored procedure and call it from the BigQuery console. In the Dataflow pipeline, however, it throws an error:
{quote}
"code": 400,
"message": "configuration.query.destinationEncryptionConfiguration cannot be set for scripts",
"domain": "global",
"reason": "invalid",
"status": "INVALID_ARGUMENT"
{quote}
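
A possible workaround, sketched here only as an untested assumption: execute the {{CALL}} with the google-cloud-bigquery client outside of {{ReadFromBigQuery}} and hand the materialized rows to the pipeline with {{beam.Create}}. This assumes the result set is small enough to hold in memory and that the script's final {{SELECT}} rows are returned by {{result()}}:

import apache_beam as beam
from google.cloud import bigquery

# Run the CALL as a BigQuery script via the client library instead of the I/O connector.
client = bigquery.Client(project="project_name")  # project id reused from the example above
rows = [dict(row) for row in client.query("CALL my_dataset.create_customer()").result()]

# Feed the materialized rows into the pipeline as an in-memory PCollection.
pipeLine = beam.Pipeline(options=options)  # same `options` as in the snippet above
rawdata = pipeLine | beam.Create(rows)
pipeLine.run().wait_until_finish()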