Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/10/05 16:56:12 UTC

[GitHub] [airflow] prakshalj0512 opened a new issue #11286: Configs under `config` in values.yaml aren't applying to worker pods

prakshalj0512 opened a new issue #11286:
URL: https://github.com/apache/airflow/issues/11286


   **Apache Airflow version**:
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   ```
   Client Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.2", GitCommit:"f5743093fd1c663cb0cbc89748f730662345d44d", GitTreeState:"clean", BuildDate:"2020-09-16T21:51:49Z", GoVersion:"go1.15.2", Compiler:"gc", Platform:"darwin/amd64"}
   Server Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.0", GitCommit:"9e991415386e4cf155a24b1da15becaa390438d8", GitTreeState:"clean", BuildDate:"2020-03-25T14:50:46Z", GoVersion:"go1.13.8", Compiler:"gc", Platform:"linux/amd64"}
   ```
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: N/A
   - **OS** (e.g. from /etc/os-release): minikube
   - **Kernel** (e.g. `uname -a`):
   - **Install tools**:
   - **Others**:
   
   **What happened**:
   
   The settings under the `config:` section aren't being applied to the worker pods. For example, I have the following values set up.
   ```
   config:
     core:
       dags_folder: '{{ include "airflow_dags" . }}'
       load_examples: "True"
       colored_console_log: "False"
       executor: "{{ .Values.executor }}"
       remote_log_conn_id: "s3_conn"
       remote_logging: "True"
       remote_base_log_folder: "s3://prakshal-test-bucket/"
   ```
   
   The worker pods don't pick up the remote logging values. I have to either pass them again as environment variables or bake them into the Docker image used for the workers, as sketched below.
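   
   For reference, the environment-variable workaround looks roughly like this in `values.yaml` (a minimal sketch assuming the chart exposes a top-level `env:` list of name/value pairs, which may differ between chart versions; the connection id and bucket are simply the ones from my config above):
   ```
   env:
     - name: AIRFLOW__CORE__REMOTE_LOGGING
       value: "True"
     - name: AIRFLOW__CORE__REMOTE_LOG_CONN_ID
       value: "s3_conn"
     - name: AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER
       value: "s3://prakshal-test-bucket/"
   ```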
   
   CC: @dimberman 
   





[GitHub] [airflow] mik-laj closed issue #11286: Configs under `config` in values.yaml aren't applying to worker pods

Posted by GitBox <gi...@apache.org>.
mik-laj closed issue #11286:
URL: https://github.com/apache/airflow/issues/11286


   





[GitHub] [airflow] mik-laj commented on issue #11286: Configs under `config` in values.yaml aren't applying to worker pods

Posted by GitBox <gi...@apache.org>.
mik-laj commented on issue #11286:
URL: https://github.com/apache/airflow/issues/11286#issuecomment-705213994


   Overall, logger setup is one of the nightmares faced by people who want to start using Airflow. We have defects in the documentation, and debugging is also painful.
   To improve this situation, I added a new field to the `airflow info` command.
   https://github.com/apache/airflow/pull/10771
   This command shows which task handler you currently have configured. It is only available in Airflow 2.0, so if you want a similar effect on Airflow 1.10 you have to start a Python shell and run the following script.
   ```
   >>> import airflow # Initialize airflow and logger configuration
   >>> import logging
   >>> logging.getLogger('airflow.task').handlers
   [<GCSTaskHandler (NOTSET)>]
   ```
   If you see output similar to the one below, you still have the default configuration.
   `[<FileTaskHandler (NOTSET)>]`
   I also see that the `airflow config` command is not working for us, so you can use the following workaround to inspect the current configuration.
   ```
   >>> from airflow import conf
   >>> import pprint
   >>> pprint.pprint(conf.getsection('core'))
   OrderedDict([('dags_folder', '/opt/airflow/dags/repo/dags'),
                ('base_log_folder', '/opt/airflow/logs'),
                ('remote_logging', False),
                ('remote_log_conn_id', ''),
                ('remote_base_log_folder', ''),
                ('encrypt_s3_logs', False),
                ('logging_level', 'INFO'),
                ('fab_logging_level', 'WARN'),
                ('logging_config_class', ''),
                ('colored_console_log', False),
   ```
   I use environment variables for configuration and everything works fine.
   ```
       AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT = "google-cloud-platform://"
       // Configure remote logging
       // https://airflow.readthedocs.io/en/latest/logging-monitoring/logging-tasks.html#writing-logs-to-google-cloud-storage
       AIRFLOW__CORE__REMOTE_LOGGING         = "True"
       AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER = "gs://${var.gcs_logging_bucket}/"
       AIRFLOW__CORE__REMOTE_LOG_CONN_ID     = "google_cloud_default"
   ```
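   For the S3 case in this issue, note that whatever connection `remote_log_conn_id` points at also has to be resolvable on the worker pods; one way is to define it through an environment variable as well. A rough sketch, expressed here as chart `env:` entries (the exact `aws://...` URI is a placeholder and depends on how you supply credentials):
   ```
   env:
     # Resolves the connection id "s3_conn" referenced by remote_log_conn_id.
     # "aws://" with no credentials in the URI falls back to the default boto
     # credential chain (instance role, mounted secrets, etc.).
     - name: AIRFLOW_CONN_S3_CONN
       value: "aws://"
   ```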

