Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/11/18 21:29:25 UTC
[GitHub] [airflow] andormarkus commented on issue #13824: Cloudwatch Integration: SIGTERM/SIGKILL Sent Following DAG Completion, Causing Errors in Worker Logs
andormarkus commented on issue #13824:
URL: https://github.com/apache/airflow/issues/13824#issuecomment-973286690
Hi @ephraimbuddy
I have tested this with Airflow 2.2.2 and the problem still exists.
Example logs 1:
```shell
[2021-11-18 21:19:01,060: INFO/MainProcess] Task airflow.executors.celery_executor.execute_command[53ca9d3f-b59e-4a3f-9f21-009c32db5473] received
[2021-11-18 21:19:01,082: INFO/ForkPoolWorker-16] Executing command in Celery: ['airflow', 'tasks', 'run', 'simple_dag', 'sleep', 'scheduled__2021-11-18T21:18:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/simple_dag.py']
[2021-11-18 21:19:01,082: INFO/ForkPoolWorker-16] Celery task ID: 53ca9d3f-b59e-4a3f-9f21-009c32db5473
[2021-11-18 21:19:01,123: INFO/ForkPoolWorker-16] Filling up the DagBag from /opt/airflow/dags/repo/dags/simple_dag.py
[2021-11-18 21:19:01,234: WARNING/ForkPoolWorker-16] Running <TaskInstance: simple_dag.sleep scheduled__2021-11-18T21:18:00+00:00 [queued]> on host airflow-worker-5997488b78-t2ftx
[2021-11-18 21:19:17,631: INFO/ForkPoolWorker-15] Task airflow.executors.celery_executor.execute_command[4ccd30db-274f-4f2d-a750-3df8ec6e856c] succeeded in 76.68239335156977s: None
[2021-11-18 21:19:17,852: ERROR/ForkPoolWorker-16] Failed to execute task Task received SIGTERM signal.
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/executors/celery_executor.py", line 121, in _execute_in_fork
    args.func(args)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/cli_parser.py", line 48, in command
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/utils/cli.py", line 92, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/commands/task_command.py", line 292, in task_run
    _run_task_by_selected_method(args, dag, ti)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/commands/task_command.py", line 105, in _run_task_by_selected_method
    _run_task_by_local_task_job(args, ti)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/commands/task_command.py", line 163, in _run_task_by_local_task_job
    run_job.run()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/base_job.py", line 245, in run
    self._execute()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/local_task_job.py", line 103, in _execute
    self.task_runner.start()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/task/task_runner/standard_task_runner.py", line 41, in start
    self.process = self._start_by_fork()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/task/task_runner/standard_task_runner.py", line 97, in _start_by_fork
    logging.shutdown()
  File "/usr/local/lib/python3.9/logging/__init__.py", line 2141, in shutdown
    h.flush()
  File "/home/airflow/.local/lib/python3.9/site-packages/watchtower/__init__.py", line 297, in flush
    q.join()
  File "/usr/local/lib/python3.9/queue.py", line 90, in join
    self.all_tasks_done.wait()
  File "/usr/local/lib/python3.9/threading.py", line 312, in wait
    waiter.acquire()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1413, in signal_handler
    raise AirflowException("Task received SIGTERM signal")
airflow.exceptions.AirflowException: Task received SIGTERM signal
```
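For context, the traceback above shows a specific interaction: the forked task process is blocked in `queue.Queue.join()` while watchtower's `flush()` drains its CloudWatch upload queue during `logging.shutdown()`, and Airflow's SIGTERM handler raises mid-wait. The sketch below is a minimal, standalone reproduction of that mechanism only; it is not Airflow's or watchtower's actual code, and the `TaskTerminated` exception is a hypothetical stand-in for `airflow.exceptions.AirflowException`:

```python
import os
import queue
import signal
import threading

# Hypothetical stand-in for airflow.exceptions.AirflowException.
class TaskTerminated(Exception):
    pass

def signal_handler(signum, frame):
    # Mirrors the pattern in taskinstance.py's signal_handler: raising here
    # unwinds whatever the main thread was doing, including a blocking
    # log-flush inside logging.shutdown().
    raise TaskTerminated("Task received SIGTERM signal")

signal.signal(signal.SIGTERM, signal_handler)

# An unflushed item keeps join() blocked, like a pending CloudWatch batch.
q = queue.Queue()
q.put("pending log record")

# Deliver SIGTERM to ourselves shortly after join() starts blocking.
threading.Timer(0.2, lambda: os.kill(os.getpid(), signal.SIGTERM)).start()

caught = None
try:
    q.join()  # blocks: nothing ever calls q.task_done()
except TaskTerminated as exc:
    caught = str(exc)

print(f"flush interrupted: {caught}")
```

Signal handlers run on the main thread in CPython, so the SIGTERM interrupts the `waiter.acquire()` inside `Queue.join()` exactly as in the traceback; the exception then propagates out of the flush instead of the log queue ever draining.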
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org