Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/05/27 04:09:08 UTC

[GitHub] [airflow] halilduygulu opened a new issue #9006: Provide a way to config log level for airflow processes

halilduygulu opened a new issue #9006:
URL: https://github.com/apache/airflow/issues/9006


   
   
   **Apache Airflow version**: 1.10.9
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.15
   
   **Environment**: 
   
   - **Cloud provider or hardware configuration**: AWS EKS
   - **OS** (e.g. from /etc/os-release): Amazon Linux
   - **Kernel** (e.g. `uname -a`):
   - **Install tools**:
   - **Others**:
   
   **What happened**:
   Trying to set the logging level for Airflow processes: they keep writing DEBUG-level logs even though I configured `logging_level=INFO` in the config file.
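   For reference, a minimal sketch of the relevant config section (in 1.10.x the option lives under `[core]`; 2.x moved it to `[logging]`):
   ```ini
   [core]
   # Expected to apply to all Airflow processes, but DEBUG lines still appear
   logging_level = INFO
   ```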
   
   One example is settings.py in the scheduler, which always prints the log line below even though it is at DEBUG level. Looking at the code, there is a line
   `LOGGING_LEVEL = logging.INFO`, but it is never used:
   https://github.com/apache/airflow/blob/master/airflow/settings.py#L39
   **What you expected to happen**:
   Please provide a simple, centralized way to set the log level for dag_bag, settings, etc., so we can stop lines like these from appearing in the logs:
   `{dagbag.py:370} DEBUG - Loaded DAG `
   `[2020-05-25 15:10:27,799] {settings.py:278} DEBUG - Disposing DB connection pool (PID 3166)`
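   As a stopgap, the only workaround I can think of is raising the level on the offending loggers by hand. A sketch (the logger names are assumptions derived from the module paths; verify them against `logging.Logger.manager.loggerDict` in your environment):
   ```python
   import logging

   # Quiet the module loggers emitting the DEBUG lines above.
   for name in ("airflow.settings", "airflow.models.dagbag"):
       logging.getLogger(name).setLevel(logging.INFO)
   ```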
   
   **How to reproduce it**:
   
   
   **Anything else we need to know**:
   
   





[GitHub] [airflow] halilduygulu commented on issue #9006: Provide a way to config log level for airflow processes

halilduygulu commented on issue #9006:
URL: https://github.com/apache/airflow/issues/9006#issuecomment-987667949


   Hi @blag, I was not asking for a workaround in this issue. I mentioned that I already have `logging_level=INFO` in the airflow.cfg file, and it does not stop debug logs from the scheduler, as one would expect from this configuration. As you mentioned, there are multiple places to configure the log level: `logging_level` applies to DAG logs, but as far as I know the process logs cannot be configured in airflow.cfg, hence I created this issue.
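   For what it's worth, this is how I checked what level the scheduler process actually runs at (a sketch; it assumes that importing `airflow` triggers its logging setup):
   ```python
   import logging

   import airflow  # noqa: F401  (importing applies Airflow's logging config)

   root = logging.getLogger()
   # Expected INFO, but the scheduler still emits DEBUG records.
   print(logging.getLevelName(root.getEffectiveLevel()))
   ```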





[GitHub] [airflow] ghaggart commented on issue #9006: Provide a way to config log level for airflow processes

ghaggart commented on issue #9006:
URL: https://github.com/apache/airflow/issues/9006#issuecomment-1050279304


   I think you're right. I looked further into some classes used by the task, and one of them is setting its own logger and hardcoding the DEBUG level by default...! My mistake... thank you!
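   For anyone hitting the same thing, the offending pattern looked roughly like this (reconstructed; all names are hypothetical):
   ```python
   import logging

   # Anti-pattern: the helper builds its own logger and pins the level,
   # so Airflow's logging_level setting never applies to these records.
   logger = logging.getLogger("my_helper")   # hypothetical logger name
   logger.addHandler(logging.StreamHandler())
   logger.setLevel(logging.DEBUG)            # hardcoded; ignores airflow.cfg
   ```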





[GitHub] [airflow] potiuk commented on issue #9006: Provide a way to config log level for airflow processes

potiuk commented on issue #9006:
URL: https://github.com/apache/airflow/issues/9006#issuecomment-1050266169


   I think you should open a separate issue for it (or, before that, take a look at what logging setup your "script.py" does, and fix that).
   
   I believe your script.py does some manipulation with loggers. Your task should do nothing with the loggers except emit log records. If you manipulate loggers and handlers, you can expect unexpected behaviours.
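   In other words, the safe pattern in task code is just this (a sketch; the function name is illustrative):
   ```python
   import logging

   # Take a module-level logger and only emit records; levels and
   # handlers stay under Airflow's logging configuration.
   log = logging.getLogger(__name__)

   def my_task_callable():
       log.info("doing work")
   ```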





[GitHub] [airflow] blag commented on issue #9006: Provide a way to config log level for airflow processes

blag commented on issue #9006:
URL: https://github.com/apache/airflow/issues/9006#issuecomment-987504611


   I think this is already possible by setting the log level of the root logger. To do so, follow the [Advanced Configuration guide](https://airflow.apache.org/docs/apache-airflow/stable/logging-monitoring/logging-tasks.html#advanced-configuration).
   
   * [`logging.debug`](https://github.com/apache/airflow/blob/fc0fb22c120a7c12426ecaf6254159e7daf2de9e/airflow/settings.py#L442) logs to the root logger
   * The log level for the [root logger](https://github.com/apache/airflow/blob/fc0fb22c120a7c12426ecaf6254159e7daf2de9e/airflow/config_templates/airflow_local_settings.py#L114) is configured in the [`DEFAULT_LOGGING_CONFIG` variable](https://github.com/apache/airflow/blob/fc0fb22c120a7c12426ecaf6254159e7daf2de9e/airflow/config_templates/airflow_local_settings.py#L57) of `airflow_local_settings.py`
   * The log level for the root logger is read from `airflow.cfg`, so ensure you have a `[logging]` section with a `logging_level` option:
     ```ini
     [logging]
     logging_level = INFO
     ```
   
   Possibly related: #12414
   
   @halilduygulu Can you try out the above and see if that works for you? If so, this issue can be closed.
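   If the cfg option alone does not take effect, the advanced route from that guide looks roughly like this. A sketch, assuming Airflow 2.x module paths and a hypothetical `log_config.py` on the PYTHONPATH:
   ```python
   # log_config.py -- wired up from airflow.cfg via:
   #   [logging]
   #   logging_config_class = log_config.LOGGING_CONFIG
   from copy import deepcopy

   from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

   # Start from Airflow's default dictConfig and force the root logger level.
   LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
   LOGGING_CONFIG["root"]["level"] = "INFO"
   ```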





[GitHub] [airflow] potiuk commented on issue #9006: Provide a way to config log level for airflow processes

potiuk commented on issue #9006:
URL: https://github.com/apache/airflow/issues/9006#issuecomment-987874128


   First of all, Airflow 1.10 has been end-of-life since June 2021. Not only will there be no new features, there will not even be critical bug fixes released for it. You should migrate to Airflow 2 ASAP, otherwise you put your business at risk, because there are known security vulnerabilities that have been addressed in Airflow 2 and will never be addressed in 1.10 now that it has reached end-of-life. See https://www.youtube.com/watch?v=86uqFr3LIc8 for the discussion we had about it at the Airflow Summit in summer 2021.
   
   Secondly, @blag is right. Airflow is a distributed system, and its logging is done differently for different components. There is no single place where you can configure logging for everything. Logging is configured using the standard Python logging configuration described here: https://airflow.apache.org/docs/apache-airflow/stable/logging-monitoring/logging-tasks.html#advanced-configuration
   
   If you really want, you can replace it with a single logging configuration that sets the same logging for everything - feel free to do so; the "advanced configuration" guide describes how to approach it.
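   Concretely, the wiring in airflow.cfg looks like this (assuming a `log_config.py` module on the PYTHONPATH that exposes a `LOGGING_CONFIG` dict, as in the sketch in the earlier comment):
   ```ini
   [logging]
   logging_config_class = log_config.LOGGING_CONFIG
   ```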





[GitHub] [airflow] ghaggart edited a comment on issue #9006: Provide a way to config log level for airflow processes

ghaggart edited a comment on issue #9006:
URL: https://github.com/apache/airflow/issues/9006#issuecomment-1050208433


   I have the same issue with a recent Docker Airflow image. Image version: [v2.2.3](https://pypi.python.org/pypi/apache-airflow/2.2.3)
   Git version: .release:2.2.3+06c82e17e9d7ff1bf261357e84c6013ccdb3c241
   
   I am running Airflow on a single host (scheduler/worker/webserver) using docker-compose with LocalExecutor.
   
   I set `AIRFLOW__LOGGING__LOGGING_LEVEL=INFO` in my Docker .env file, and inside the scheduler container airflow.cfg reports INFO, but the tasks still write DEBUG messages to the Airflow logs.
   
   Airflow Docker .env config:
   ```
   AIRFLOW__LOGGING__BASE_LOG_FOLDER=/opt/airflow/logs
   AIRFLOW__LOGGING__LOGGING_LEVEL=INFO
   AIRFLOW__LOGGING__FAB_LOGGING_LEVEL=WARNING
   AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://databases:password@postgres:port/airflow
   AIRFLOW__CORE__EXECUTOR=LocalExecutor
   AIRFLOW__CORE__PLUGINS_FOLDER=/opt/airflow/plugins
   AIRFLOW__WEBSERVER__WORKERS=2
   AIRFLOW__WEBSERVER__BASE_URL=base_url
   AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL=60
   AIRFLOW__SCHEDULER__DAG_DIR_LIST_INTERVAL=120
   AIRFLOW__SCHEDULER__CHILD_PROCESS_LOG_DIRECTORY=/opt/airflow/logs/scheduler
   AIRFLOW__API__AUTH_BACKEND=airflow.api.auth.backend.basic_auth
   ```
   
   airflow.cfg inside the running container:
   
   ```
   # Logging level.
   #
   # Supported values: ``CRITICAL``, ``ERROR``, ``WARNING``, ``INFO``, ``DEBUG``.
   logging_level = INFO
   
   # Logging level for Flask-appbuilder UI.
   #
   # Supported values: ``CRITICAL``, ``ERROR``, ``WARNING``, ``INFO``, ``DEBUG``.
   fab_logging_level = WARNING
   
   # Logging class
   # Specify the class that will specify the logging configuration
   # This class has to be on the python classpath
   # Example: logging_config_class = my.path.default_local_settings.LOGGING_CONFIG
   logging_config_class =
   
   ```
   
   DEBUG logs from a task:
   `[2022-02-24 16:11:59,335] {script.py:1385} DEBUG - 'debug message' `
   
   My understanding is that this should configure the root logger to be INFO?
   
   From airflow_local_settings.py:
   ```
   # TODO: Logging format and level should be configured
   # in this file instead of from airflow.cfg. Currently
   # there are other log format and level configurations in
   # settings.py and cli.py. Please see AIRFLOW-1455.
   LOG_LEVEL: str = conf.get('logging', 'LOGGING_LEVEL').upper()
   
   # Flask appbuilder's info level log is very verbose,
   # so it's set to 'WARN' by default.
   FAB_LOG_LEVEL: str = conf.get('logging', 'FAB_LOGGING_LEVEL').upper()
   
   
   
   DEFAULT_LOGGING_CONFIG: Dict[str, Any] = {
       'version': 1,
       'disable_existing_loggers': False,
       'formatters': {
           'airflow': {'format': LOG_FORMAT},
           'airflow_coloured': {
               'format': COLORED_LOG_FORMAT if COLORED_LOG else LOG_FORMAT,
               'class': COLORED_FORMATTER_CLASS if COLORED_LOG else 'logging.Formatter',
           },
       },
       'filters': {
           'mask_secrets': {
               '()': 'airflow.utils.log.secrets_masker.SecretsMasker',
           },
       },
       'handlers': {
           'console': {
               'class': 'airflow.utils.log.logging_mixin.RedirectStdHandler',
               'formatter': 'airflow_coloured',
               'stream': 'sys.stdout',
               'filters': ['mask_secrets'],
           },
           'task': {
               'class': 'airflow.utils.log.file_task_handler.FileTaskHandler',
               'formatter': 'airflow',
               'base_log_folder': os.path.expanduser(BASE_LOG_FOLDER),
               'filename_template': FILENAME_TEMPLATE,
               'filters': ['mask_secrets'],
           },
           'processor': {
               'class': 'airflow.utils.log.file_processor_handler.FileProcessorHandler',
               'formatter': 'airflow',
               'base_log_folder': os.path.expanduser(PROCESSOR_LOG_FOLDER),
               'filename_template': PROCESSOR_FILENAME_TEMPLATE,
               'filters': ['mask_secrets'],
           },
       },
       'loggers': {
           'airflow.processor': {
               'handlers': ['processor'],
               'level': LOG_LEVEL,
               'propagate': False,
           },
           'airflow.task': {
               'handlers': ['task'],
               'level': LOG_LEVEL,
               'propagate': False,
               'filters': ['mask_secrets'],
           },
           'flask_appbuilder': {
               'handlers': ['console'],
               'level': FAB_LOG_LEVEL,
               'propagate': True,
           },
       },
       'root': {
           'handlers': ['console'],
           'level': LOG_LEVEL,
           'filters': ['mask_secrets'],
       },
   }
   ```
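   One way to sanity-check what the processes actually end up with (a sketch; run inside the scheduler or worker container, and it assumes importing `airflow` applies this dictConfig):
   ```python
   import logging

   import airflow  # noqa: F401  (import applies the logging config)

   for name in ("", "airflow.task", "airflow.processor"):
       lg = logging.getLogger(name)
       print(name or "root", logging.getLevelName(lg.getEffectiveLevel()))
   ```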
   





[GitHub] [airflow] boring-cyborg[bot] commented on issue #9006: Provide a way to config log level for airflow processes

boring-cyborg[bot] commented on issue #9006:
URL: https://github.com/apache/airflow/issues/9006#issuecomment-633616341


   Thanks for opening your first issue here! Be sure to follow the issue template!
   





[GitHub] [airflow] potiuk closed issue #9006: Provide a way to config log level for airflow processes

potiuk closed issue #9006:
URL: https://github.com/apache/airflow/issues/9006


   




