Posted to commits@airflow.apache.org by "Aidar Mamytov (Jira)" <ji...@apache.org> on 2019/11/21 19:36:00 UTC
[jira] [Commented] (AIRFLOW-5935) Logs are not sent to S3 by S3TaskHandler
[ https://issues.apache.org/jira/browse/AIRFLOW-5935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16979555#comment-16979555 ]
Aidar Mamytov commented on AIRFLOW-5935:
----------------------------------------
Figured out the problem. We're using Django in the project, and its logging is somehow interfering with Airflow's ability to write logs remotely. Closing this issue.
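For anyone who hits the same symptom: my reading (an assumption, not something I traced line by line) is that Python's logging.config.dictConfig(), which Django uses to apply the LOGGING setting, defaults *disable_existing_loggers* to True, so it silently disables the airflow.task logger that *S3TaskHandler* is attached to, and the handler's *close* (which does the S3 upload) never runs. A sketch of the shape of the problem - the dict below is a placeholder, not our real settings:
{code:python}
# settings.py (hypothetical fragment, for illustration only)
LOGGING = {
    "version": 1,
    # dictConfig() treats a missing "disable_existing_loggers" as True,
    # which disables loggers configured before Django applied this dict,
    # including Airflow's "airflow.task" logger.
    "handlers": {
        "console": {"class": "logging.StreamHandler"},
    },
    "root": {"handlers": ["console"], "level": "INFO"},
}

# Keeping previously configured loggers (Airflow's among them) alive:
LOGGING["disable_existing_loggers"] = False
{code}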
> Logs are not sent to S3 by S3TaskHandler
> ----------------------------------------
>
> Key: AIRFLOW-5935
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5935
> Project: Apache Airflow
> Issue Type: Task
> Components: logging
> Affects Versions: 1.10.6
> Reporter: Aidar Mamytov
> Assignee: Aidar Mamytov
> Priority: Major
>
> When exactly is *S3TaskHandler* supposed to have its *s3_write* or *close* method called? The logs are written locally but are not appearing in S3. I've pdb-debugged my custom log_config.py file: Airflow reads the config and loads the *S3TaskHandler* settings successfully. I also checked another path with pdb and print statements: whenever I open "_View Log_" for any task in the admin dashboard, it definitely calls *S3TaskHandler.s3_read* and *S3TaskHandler.s3_log_exists* and successfully connects to S3. Finally, I verified in a Python console that Airflow can reach S3: I imported *S3Hook* and *S3TaskHandler*, connected, read objects, and wrote new ones to my bucket - all good.
> The problem is that although Airflow can connect to the S3 bucket and perform read/write operations against it, it just does not upload logs to it. What might I be doing wrong, or what am I misunderstanding about Airflow remote logging?
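For later readers, the timing question above has a concrete answer in 1.10.x: *S3TaskHandler* extends *FileTaskHandler*, so individual log lines only go to the local file while the task runs; *s3_write* is called from *close()*, which fires when the task process tears down its logging at the end of the task. If anything detaches or disables the airflow.task logger before then, the upload never happens. A minimal sketch of the usual wiring in a custom log_config.py (the folder values and names below are placeholders, not a verified config):
{code:python}
# log_config.py (sketch; adapt paths and bucket to your deployment)
import os

BASE_LOG_FOLDER = "/usr/local/airflow/logs"             # local folder, assumed
REMOTE_BASE_LOG_FOLDER = "s3://my-bucket/airflow/logs"  # S3 prefix, assumed
FILENAME_TEMPLATE = "{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log"

LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "airflow": {
            "format": "[%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s",
        },
    },
    "handlers": {
        # Writes locally while the task runs; uploads the whole file to S3
        # from close() when task logging shuts down.
        "task": {
            "class": "airflow.utils.log.s3_task_handler.S3TaskHandler",
            "formatter": "airflow",
            "base_log_folder": os.path.expanduser(BASE_LOG_FOLDER),
            "s3_log_folder": REMOTE_BASE_LOG_FOLDER,
            "filename_template": FILENAME_TEMPLATE,
        },
    },
    "loggers": {
        # If another framework's logging setup disables or replaces this
        # logger, close() -- and therefore the S3 upload -- never runs.
        "airflow.task": {
            "handlers": ["task"],
            "level": "INFO",
            "propagate": False,
        },
    },
}
{code}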
--
This message was sent by Atlassian Jira
(v8.3.4#803005)