Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2022/06/03 20:26:01 UTC

[GitHub] [beam] kennknowles opened a new issue, #18783: Maximum recursion depth with Apache Beam and Google Cloud SDK

kennknowles opened a new issue, #18783:
URL: https://github.com/apache/beam/issues/18783

   I am working on a project which uses the Google Cloud SDK and Cloud Engine. This project populates a Datastore that I need to access in order to copy parts of it into a BigQuery instance, similar to what is described in [https://blog.papercut.com/google-cloud-dataflow-data-migration/](https://blog.papercut.com/google-cloud-dataflow-data-migration/).
   
   I installed Apache Beam using pip install --upgrade apache_beam. However, once I have set everything up to access the Datastore using the ndb models and try to import Beam with from airflow_beam import pipeline, I get a maximum recursion depth error raised from inside the SDK.
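   A note on the error class involved: "maximum recursion depth exceeded" is Python's RecursionError, raised when the interpreter's frame limit (sys.getrecursionlimit(), typically 1000) is hit, for example via circular imports or deeply nested calls at module import time. The sketch below only illustrates that mechanism; it is not Beam-specific and does not reproduce this exact issue.

```python
import sys

# Python guards against unbounded recursion with a configurable limit
# (the default is typically 1000 stack frames).
limit = sys.getrecursionlimit()

def recurse(n):
    # Unbounded self-call: will exhaust the recursion limit.
    return recurse(n + 1)

try:
    recurse(0)
except RecursionError:
    # Same error class behind "maximum recursion depth exceeded"
    # messages, including ones triggered during imports.
    print(f"hit RecursionError at limit {limit}")
```

Raising the limit with sys.setrecursionlimit() is rarely the right fix for an import-time error like the one reported here; the recursion usually points to a cyclic dependency between modules.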
   
   Imported from Jira [BEAM-4437](https://issues.apache.org/jira/browse/BEAM-4437). Original Jira may contain additional context.
   Reported by: jonperron.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org