Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/08/18 07:22:35 UTC

[GitHub] [airflow] aioannoa commented on issue #17605: Airflow stdout not working/console problem

aioannoa commented on issue #17605:
URL: https://github.com/apache/airflow/issues/17605#issuecomment-900882673


   Hi there, 
   
   I have rechecked the code and I honestly cannot see any typos. I have also tried this config a couple of times, so I doubt that is the issue. Regarding the pod: I always build a new Docker image which I then load into my cluster, so this is a new pod running the new image version. I have also checked the Airflow configuration, which is displayed as follows:
   [logging]
   base_log_folder = /autoupgr_orch/airflow/logs
   remote_logging = False
   remote_log_conn_id = 
   google_key_path = 
   remote_base_log_folder = 
   encrypt_s3_logs = False
   logging_level = INFO
   fab_logging_level = WARN
   **logging_config_class = log_config.LOGGING_CONFIG**
   colored_console_log = True
   colored_log_format = [%(blue)s%(asctime)s%(reset)s] {%(blue)s%(filename)s:%(reset)s%(lineno)d} %(log_color)s%(levelname)s%(reset)s - %(log_color)s%(message)s%(reset)s
   colored_formatter_class = airflow.utils.log.colored_log.CustomTTYColoredFormatter
   log_format = [%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s
   simple_log_format = %(asctime)s %(levelname)s - %(message)s
   task_log_prefix_template = 
   log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log
   log_processor_filename_template = {{ filename }}.log
   dag_processor_manager_log_location = /autoupgr_orch/airflow/logs/dag_processor_manager/dag_processor_manager.log
   **task_log_reader = stdouttask**
   extra_loggers = 
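   
   For completeness, here is roughly what my log_config.py looks like. It follows the documented pattern of copying Airflow's DEFAULT_LOGGING_CONFIG and adding a stdout handler; the handler name stdouttask matches the task_log_reader setting above (this is a simplified sketch of my file, not the full thing):
   
   # log_config.py (simplified sketch; the handler body uses the standard
   # dictConfig way of pointing a StreamHandler at stdout)
   from copy import deepcopy
   
   from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG
   
   LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
   
   # Route task logs to stdout; "stdouttask" is the name referenced by
   # task_log_reader in the [logging] section above.
   LOGGING_CONFIG["handlers"]["stdouttask"] = {
       "class": "logging.StreamHandler",
       "formatter": "airflow",
       "stream": "ext://sys.stdout",
   }
   LOGGING_CONFIG["loggers"]["airflow.task"]["handlers"] = ["stdouttask"]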
   
   Shouldn't this be enough to verify that my configuration has been applied? I can also see some related logs when using "kubectl logs -f pod_name".
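   
   On top of reading the config dump above, a quick sanity check I can run from inside the pod (kubectl exec, then python) is to query the values the running process actually sees; just an illustration, these lines assume nothing beyond the settings shown above:
   
   from airflow.configuration import conf
   
   print(conf.get("logging", "task_log_reader"))       # expect: stdouttask
   print(conf.get("logging", "logging_config_class"))  # expect: log_config.LOGGING_CONFIG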
   At this stage, perhaps I should clarify what I mean when I say I can still see the logs in the file. I do not have any logs under airflow/logs related to the dag's name, yet I can see all the logs from the airflow scheduler and the individual tasks in a file named airflow-scheduler.out. I have tried to search for information on this file and could not find anything. I also have a .err and a .log file. My intention is to have everything that goes into the .out file sent to stdout instead, so that I can access it by using "kubectl logs -f pod_name". Am I still missing something?
   
   Thanks.

