Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/01/11 11:14:41 UTC

[GitHub] [airflow] bllach edited a comment on issue #12236: Logging to S3 stops working when MLFlow import added to file

bllach edited a comment on issue #12236:
URL: https://github.com/apache/airflow/issues/12236#issuecomment-757876226


   I have the same issue with remote logging:
   **Apache Airflow version:** 2.0.0
   
   **Environment:**
   - **Cloud provider or hardware configuration:** AWS S3
   - **OS (e.g. from /etc/os-release):** Ubuntu 20.10 (Groovy Gorilla) [1PC] and Ubuntu 18.04.4 LTS (Bionic Beaver) [2PC]
   - **Kernel (e.g. uname -a):** 5.8.0-36-generic (Ubuntu 20.10) [1PC] and 5.3.0-62-generic (Ubuntu 18.04) [2PC]
   - **Install tools:** [airflow-quick-start](https://airflow.apache.org/docs/apache-airflow/stable/start.html#quick-start) and [airflow-config](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#logging) and [airflow-Example Pipeline definition](https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html#example-pipeline-definition)
   - **Others:** Set up logging to S3 via env vars and airflow.cfg
     env:
     ```bash
     AIRFLOW__CORE__REMOTE_LOGGING: "True"
     AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER: "s3://bucket/airflow/logs"
     AIRFLOW__CORE__REMOTE_LOG_CONN_ID: "custom_s3_id"
     ```
     airflow.cfg
     ```ini
     [logging]
     remote_logging = True
     remote_log_conn_id = custom_s3_id
     remote_base_log_folder = s3://bucket/airflow/logs
     ```
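     A hedged side note (not part of the original report): since the `airflow.cfg` options above live in the `[logging]` section, Airflow 2.0 also accepts the matching `AIRFLOW__LOGGING__*` environment variables; the `AIRFLOW__CORE__*` forms used above are the older 1.10-era names, which 2.0 still honors via its deprecated-option mapping:
     ```bash
     AIRFLOW__LOGGING__REMOTE_LOGGING: "True"
     AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER: "s3://bucket/airflow/logs"
     AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID: "custom_s3_id"
     ```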
   - **Python:** 3.7.9 [1PC] and 3.6.9 [2PC]
   
   **What happened:**
   With the [Example Pipeline definition](https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html#example-pipeline-definition), remote logging works fine, but after adding `import mlflow` to the DAG code, Airflow stops sending logs to S3 storage.
   
   **What you expected to happen:**
   Remote logging should keep working with `import mlflow` present.
   
   **How to reproduce it:**
   - Install Airflow: [airflow-quick-start](https://airflow.apache.org/docs/apache-airflow/stable/start.html#quick-start);
   - Install MLflow: [mlflow-quick-start](https://www.mlflow.org/docs/latest/quickstart.html#installing-mlflow);
   - Set up the config with the above env vars or airflow.cfg;
   - Add an S3 connection to Airflow;
   - Paste the DAG code from the [airflow-Example Pipeline definition](https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html#example-pipeline-definition) into the DAG directory;
   - Add `import mlflow` to the DAG code;
   - Run the DAG.
   
   **Anything else we need to know:** Tested on 2 PCs with different OS and hardware.
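   One plausible mechanism (an assumption, not confirmed in this report) is that importing a library reconfigures Python logging at import time and detaches the remote handler Airflow attached to the `airflow.task` logger. A minimal stdlib-only sketch of that failure mode, using a `StreamHandler` as a stand-in for Airflow's S3 task handler:
   ```python
   import logging

   # Stand-in for Airflow attaching its S3 task handler to "airflow.task".
   task_logger = logging.getLogger("airflow.task")
   s3_handler = logging.StreamHandler()  # hypothetical stand-in for S3TaskHandler
   task_logger.addHandler(s3_handler)
   assert s3_handler in task_logger.handlers

   # What a third-party import *could* do at import time (e.g. via
   # logging.config or basicConfig(force=True)): wipe existing handlers.
   task_logger.handlers.clear()

   # The remote handler is now silently gone, so task logs never reach S3.
   print(s3_handler in task_logger.handlers)  # False
   ```
   If this is the cause, comparing `logging.getLogger("airflow.task").handlers` before and after `import mlflow` inside a task would show the handler disappearing.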


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org