Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/05/17 22:45:35 UTC

[GitHub] [airflow] TRReeve edited a comment on issue #8212: Can't read S3 remote logs using Airflow 1.10.9

TRReeve edited a comment on issue #8212:
URL: https://github.com/apache/airflow/issues/8212#issuecomment-629872414


   @marclamberti Your answer matches how I understood it should work as well, and it half worked for me: the logs uploaded fine into the S3 bucket, but when I went to "view logs" in the UI it gave the "logs not found" error, with no output in the logs to indicate it was using the s3 connection or the read_key function to retrieve anything. 
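   One way to separate the upload path from the read path is to call read_key directly from the webserver container. A minimal sketch, assuming Airflow 1.10.x, a remote-log connection ID of s3_uri, and a placeholder bucket and key (substitute a log file you know was uploaded):
   
       # Run in a Python shell inside the webserver container.
       # Checks that the connection Airflow uses for remote logging
       # can read a log object back, independently of the UI.
       from airflow.hooks.S3_hook import S3Hook
   
       hook = S3Hook(aws_conn_id="s3_uri")  # conn ID is an assumption
       body = hook.read_key(
           key="my_dag/my_task/2020-05-17T00:00:00+00:00/1.log",  # placeholder key
           bucket_name="airflow-stuff",  # placeholder bucket
       )
       print(body[:200])
   
   If that read succeeds but the UI still reports "logs not found", the connection itself is probably not the problem and the task-log reader configuration is the place to look.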
   
   It would be really nice if I could just define AIRFLOW_CONN_S3_URI=s3://user:pass@S3 and REMOTE_BASE_LOG_FOLDER=s3://airflow-stuff/logs and have the UI build the path, but with that setup I could only get logs uploading. My working helm template for Airflow on k8s builds the connection as s3://access_key:secret_key@{{ mys3path }} and then sets the remote base log folder to s3://{{ mys3path }}. Aside from that it's exactly the same as you defined above, with the same variables defined under AIRFLOW__KUBERNETES__ENVIRONMENT_VARIABLES. 
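   For reference, the shape of that configuration as environment variables would be roughly the following. This is a hedged sketch, not my exact chart values: the connection ID (s3_uri), the keys, and the bucket path are placeholders, and the real bucket would come from whatever {{ mys3path }} renders to:
   
       # Connection parsed from the environment; the login:password pair in the
       # URI becomes the AWS access key and secret key (URL-encode any special
       # characters in the secret).
       AIRFLOW_CONN_S3_URI=s3://ACCESS_KEY:SECRET_KEY@my-log-bucket
       # Remote-logging settings, [core] section in Airflow 1.10.x.
       AIRFLOW__CORE__REMOTE_LOGGING=True
       AIRFLOW__CORE__REMOTE_LOG_CONN_ID=s3_uri
       AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://my-log-bucket/logs
   
   With the KubernetesExecutor, the same variables also have to reach the worker pods, which is what passing them through the kubernetes environment-variables config above accomplishes.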

