Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/12/19 20:26:38 UTC

[GitHub] [airflow] potiuk commented on issue #20408: Logs Are Not Written When Using Remote Logging with K8s Executor

potiuk commented on issue #20408:
URL: https://github.com/apache/airflow/issues/20408#issuecomment-997455534


   I believe the problem is with understanding the logging authorization:
   
   See: https://airflow.apache.org/docs/apache-airflow-providers-google/stable/logging/gcs.html
   
   ``By default Application Default Credentials are used to obtain credentials. You can also set google_key_path option in [logging] section, if you want to use your own service account``
   
   As I read that - the ``remote_log_conn_id`` is used only to *read* the logs. In order to *write* the logs, the pods launched by the Kubernetes executor need to have "Application Default Credentials" with "write" access to the storage (you can also provide the `google_key_path` parameter, but you'd have to make sure the key is available in the pod, which is generally less secure). Since your pods are likely running in GKE, you should ideally have Workload Identity configured so that your pods can have write access to your bucket: https://cloud.google.com/kubernetes-engine/docs/how-to/workload-identity.
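   For reference, the split between reading and writing shows up in the ``[logging]`` section of ``airflow.cfg``. This is just a sketch - the bucket name is a placeholder, and ``gcs-conn-dev`` is the connection from your report:

   ```
   [logging]
   remote_logging = True
   # placeholder bucket - replace with your own
   remote_base_log_folder = gs://my-airflow-logs
   # used by the webserver to READ logs back
   remote_log_conn_id = gcs-conn-dev
   # optional key file for WRITING; if unset, Application Default
   # Credentials (e.g. Workload Identity on GKE) are used instead
   # google_key_path = /path/to/key.json
   ```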
   
   Note that the "gcs-conn-dev" in this case only needs "read" access to the bucket.
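   If you go the Workload Identity route, the setup is roughly the following (sketch only - ``PROJECT_ID``, ``GSA``, ``NAMESPACE`` and ``KSA`` are placeholders for your project id, Google service account, namespace and Kubernetes service account; the GSA additionally needs a Storage role such as ``roles/storage.objectAdmin`` on the log bucket):

   ```shell
   # Allow the Kubernetes service account to impersonate the Google one
   gcloud iam service-accounts add-iam-policy-binding \
       GSA@PROJECT_ID.iam.gserviceaccount.com \
       --role roles/iam.workloadIdentityUser \
       --member "serviceAccount:PROJECT_ID.svc.id.goog[NAMESPACE/KSA]"

   # Annotate the Kubernetes service account your worker pods run as
   kubectl annotate serviceaccount KSA \
       --namespace NAMESPACE \
       iam.gke.io/gcp-service-account=GSA@PROJECT_ID.iam.gserviceaccount.com
   ```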
   
   I am closing it for now, assuming that your problem will be solved with this advice. Please test it and comment here with your results, and we can reopen it if needed.
   
   

