Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/01/06 06:18:10 UTC

[GitHub] [airflow] SakuraSound commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

SakuraSound commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-755106802


   I am also seeing this issue.
   
   **Apache Airflow version:** 2.0.0
   
   **Kubernetes version (if you are using kubernetes) (use kubectl version):**
   
   ```
   $ kubectl version
   Client Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.6", GitCommit:"dff82dc0de47299ab66c83c626e08b245ab19037", GitTreeState:"clean", BuildDate:"2020-07-16T00:04:31Z", GoVersion:"go1.14.4", Compiler:"gc", Platform:"darwin/amd64"}
   Server Version: version.Info{Major:"1", Minor:"16+", GitVersion:"v1.16.15-eks-ad4801", GitCommit:"ad4801fd44fe0f125c8d13f1b1d4827e8884476d", GitTreeState:"clean", BuildDate:"2020-10-20T23:27:12Z", GoVersion:"go1.13.15", Compiler:"gc", Platform:"linux/amd64"}
   ```
   
   **Environment:**
   
   **Cloud provider or hardware configuration:** AWS
   
   **What happened:**
   
   Remote logging is configured as follows:
   
   ```
   export AIRFLOW__LOGGING__REMOTE_LOGGING=True
   export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://arn:aws:s3:us-west-2:<AWS_ACCOUNT_ID>:accesspoint:<OUR_ACCESS_POINT>/logs
   export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=S3LogConn
   ```
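   
   A quick way to confirm those overrides are actually being picked up (a minimal sketch, assuming the standard Airflow 2.0 `airflow.configuration.conf` API, where `AIRFLOW__LOGGING__<KEY>` env vars override the `[logging]` section):
   
   ```
   # Sanity check of the remote-logging settings; run this inside the
   # scheduler/worker container so the same env vars are visible.
   from airflow.configuration import conf

   print(conf.getboolean("logging", "remote_logging"))   # expect: True
   print(conf.get("logging", "remote_base_log_folder"))  # expect: the s3:// access-point URL
   print(conf.get("logging", "remote_log_conn_id"))      # expect: S3LogConn
   ```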
   
   This was also working for us on 1.10.14, but it stopped working in Airflow 2.0.0.
   
   What we noticed specifically is that if we run
   `airflow tasks run test_dag test now` manually from the web server or scheduler container, the logs show up in S3. But when we let Airflow run DAGs on its own, we see:
   
   ```
   [2021-01-05 19:12:41,205] {s3_task_handler.py:193} ERROR - Could not write logs to s3://arn:aws:s3:us-west-2:<AWS_ACCOUNT_ID>:accesspoint:<OUR_ACCESS_POINT>/logs/test_dag/test/2021-01-05T19:12:23.797669+00:00/1.log
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 186, in s3_write
       self.hook.load_string(
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper
       connection = self.get_connection(self.aws_conn_id)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/hooks/base.py", line 63, in get_connection
       conn = Connection.get_connection_from_secrets(conn_id)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets
       conn = secrets_backend.get_connection(conn_id=conn_id)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 64, in wrapper
       with create_session() as session:
     File "/usr/local/lib/python3.8/contextlib.py", line 113, in __enter__
       return next(self.gen)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 29, in create_session
       session = settings.Session()
   TypeError: 'NoneType' object is not callable
   ```
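   
   If it helps narrow this down: the `TypeError` means `settings.Session` is `None` by the time the log handler flushes. Here is a minimal sketch of that failure mode, based on my (possibly wrong) reading of the 2.0.0 sources, where `airflow.settings.Session` is a module-level name that stays `None` until `configure_orm()` binds it and is reset to `None` by `dispose_orm()`:
   
   ```
   # Simplified stand-ins for airflow.settings / airflow.utils.session to
   # illustrate the traceback above; names mirror Airflow's but this is not
   # the actual implementation.
   from contextlib import contextmanager

   Session = None  # stands in for airflow.settings.Session before configure_orm()
                   # (or after dispose_orm() has run in the task process)

   @contextmanager
   def create_session():
       session = Session()  # TypeError: 'NoneType' object is not callable
       try:
           yield session
           session.commit()
       except Exception:
           session.rollback()
           raise
       finally:
           session.close()

   try:
       with create_session() as session:
           pass
   except TypeError as err:
       print(err)  # 'NoneType' object is not callable
   ```
   
   So by the time `S3TaskHandler` tries to look up `S3LogConn`, the session factory has apparently already been torn down in the task process, which would explain why a manual `airflow tasks run` (where the ORM is still configured) uploads fine.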
   
   

