Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/04/12 07:53:35 UTC

[GitHub] [airflow] ephraimbuddy commented on issue #14974: KubernetesJobWatcher does not delete worker pods

ephraimbuddy commented on issue #14974:
URL: https://github.com/apache/airflow/issues/14974#issuecomment-817577477


   @mrpowerus could you please give more detailed steps to reproduce this?
   
   That said, I don't think your configuration is right.
   These lines in your config
   ```
   worker_container_repository = apache/airflow
   worker_container_tag = 2.0.1-python3.8
   ```
   make every task pull the 2.0.1-python3.8 image afresh before it can create a container, if `images.airflow.repository` and `images.airflow.tag` are configured differently. The 2.0.1-python3.8 image takes a long time to pull, and you can hit a network error in the process.
   
   Interestingly, when I configured `worker_container_repository` & `worker_container_tag` as you did and ran against the airflow master repository (using breeze), the images were pulled correctly but I got errors that the dag I ran could not be found.
   
   When you use breeze to start a kubernetes cluster, it loads the example dags. Using your configuration, and also making sure that `images.airflow.repository` & `images.airflow.tag` correspond to what the workers are using, I got the same error:
   
   ```
   BACKEND=postgresql
   DB_HOST=airflow-postgresql.airflow.svc.cluster.local
   DB_PORT=5432
   [2021-04-12 06:36:06,121] {settings.py:210} DEBUG - Setting up DB connection pool (PID 7)
   [2021-04-12 06:36:06,121] {settings.py:275} DEBUG - settings.prepare_engine_args(): Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=7
   [2021-04-12 06:36:06,192] {cli_action_loggers.py:40} DEBUG - Adding <function default_action_log at 0x7f50eaa82670> to pre execution callback
   [2021-04-12 06:36:08,064] {cli_action_loggers.py:66} DEBUG - Calling callbacks: [<function default_action_log at 0x7f50eaa82670>]
   [2021-04-12 06:36:08,075] {settings.py:210} DEBUG - Setting up DB connection pool (PID 7)
   [2021-04-12 06:36:08,075] {settings.py:243} DEBUG - settings.prepare_engine_args(): Using NullPool
   [2021-04-12 06:36:08,075] {dagbag.py:448} INFO - Filling up the DagBag from /opt/airflow/dags/example_bash_operator.py
   [2021-04-12 06:36:08,076] {cli_action_loggers.py:84} DEBUG - Calling callbacks: []
   Traceback (most recent call last):
     File "/home/airflow/.local/bin/airflow", line 8, in <module>
       sys.exit(main())
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
       args.func(args)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
       return func(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 89, in wrapper
       return f(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 217, in task_run
       dag = get_dag(args.subdir, args.dag_id)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 187, in get_dag
       raise AirflowException(
   airflow.exceptions.AirflowException: dag_id could not be found: example_bash_operator. Either the dag did not exist or it failed to parse.
   [2021-04-12 06:36:08,076] {settings.py:292} DEBUG - Disposing DB connection pool (PID 7)
   ```
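   
   For illustration, here is a minimal sketch of Helm values where the worker image is kept in sync with `images.airflow`, so workers don't pull a different image for every task. The exact keys are assumptions based on the official Airflow Helm chart and may differ in your chart version:
   
   ```
   # values.yaml (sketch; keys assumed, check against your chart version)
   images:
     airflow:
       repository: apache/airflow
       tag: 2.0.1-python3.8
   
   config:
     kubernetes:
       # keep these matching images.airflow above, otherwise each worker
       # pod pulls a second, different image before it can start the task
       worker_container_repository: apache/airflow
       worker_container_tag: 2.0.1-python3.8
   ```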
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org