Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/01/14 07:18:44 UTC
[GitHub] [airflow] andormarkus commented on issue #13824: Cloudwatch Integration: SIGTERM/SIGKILL Sent Following DAG Completion, Causing Errors in Worker Logs
andormarkus commented on issue #13824:
URL: https://github.com/apache/airflow/issues/13824#issuecomment-1012838195
Hi @o-nikolas
We are on Helm chart `1.3.0` and Airflow `2.2.2-python3.9`. Airflow is running on AWS EKS with the Celery executor (KEDA enabled). The logs attached in my previous comment were from the Celery worker logs.
Please let me know if you need more information.
Relevant section from the Helm chart configuration:
```yaml
config:
  logging:
    colored_console_log: "False"
    remote_logging: "True"
    remote_log_conn_id: aws_default
    remote_base_log_folder: "cloudwatch://${log_group_arn}"
```
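For reference, the official Airflow Helm chart exposes these `config` values to the pods as environment variables following Airflow's standard `AIRFLOW__<SECTION>__<KEY>` convention. A sketch of the equivalent settings (the `${log_group_arn}` template placeholder is left as-is and would need to be filled in):

```shell
# Equivalent environment variables derived from the Helm `config` block above,
# using Airflow's AIRFLOW__<SECTION>__<KEY> mapping.
export AIRFLOW__LOGGING__COLORED_CONSOLE_LOG="False"
export AIRFLOW__LOGGING__REMOTE_LOGGING="True"
export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID="aws_default"
# ${log_group_arn} is a template placeholder, not a shell variable here.
export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER='cloudwatch://${log_group_arn}'
```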
`simple_dag`:
```python
"""Sample DAG."""
import time
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator
default_args = {
"owner": "airflow",
"depends_on_past": False,
"start_date": datetime(2020, 1, 1),
"email": ["support@airflow.com"],
"email_on_failure": False,
"email_on_retry": False,
"retries": 1,
"retry_delay": timedelta(minutes=5),
}
def sleep() -> bool:
"""Sleep.
Returns:
bool: True
"""
time.sleep(10)
return True
with DAG("simple_dag", default_args=default_args, schedule_interval="* * * * *", catchup=False) as dag:
t1 = PythonOperator(task_id="sleep", python_callable=sleep)
```