Posted to commits@airflow.apache.org by "sanjayp (Jira)" <ji...@apache.org> on 2020/12/03 06:31:00 UTC

[jira] [Comment Edited] (AIRFLOW-2516) Deadlock found when trying to update task_instance table

    [ https://issues.apache.org/jira/browse/AIRFLOW-2516?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17242949#comment-17242949 ] 

sanjayp edited comment on AIRFLOW-2516 at 12/3/20, 6:30 AM:
------------------------------------------------------------

I am hitting this issue on version 1.10.12 while running the example_subdag_operator DAG from the bundled examples. It is a clean deployment, and this is the only DAG configured.

It happens fairly consistently.

Environment:

Airflow Version: 1.10.12

Database: MySQL 8.0.22

Deployed on Kubernetes with CeleryExecutor using airflow-helm/charts
{code}
*** Log file does not exist: /opt/airflow/logs/example_subdag_operator/section-2/2020-12-03T05:55:04.861187+00:00/1.log
*** Fetching from: http://airflow-demo-worker-0.airflow-demo-worker.airflow.svc.cluster.local:8793/log/example_subdag_operator/section-2/2020-12-03T05:55:04.861187+00:00/1.log

[2020-12-03 05:56:13,403] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: example_subdag_operator.section-2 2020-12-03T05:55:04.861187+00:00 [queued]>
[2020-12-03 05:56:13,430] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: example_subdag_operator.section-2 2020-12-03T05:55:04.861187+00:00 [queued]>
[2020-12-03 05:56:13,430] {taskinstance.py:880} INFO - 
--------------------------------------------------------------------------------
[2020-12-03 05:56:13,430] {taskinstance.py:881} INFO - Starting attempt 1 of 1
[2020-12-03 05:56:13,430] {taskinstance.py:882} INFO - 
--------------------------------------------------------------------------------
[2020-12-03 05:56:13,459] {taskinstance.py:901} INFO - Executing <Task(SubDagOperator): section-2> on 2020-12-03T05:55:04.861187+00:00
[2020-12-03 05:56:13,461] {standard_task_runner.py:54} INFO - Started process 1540 to run task
[2020-12-03 05:56:13,481] {standard_task_runner.py:77} INFO - Running: ['airflow', 'run', 'example_subdag_operator', 'section-2', '2020-12-03T05:55:04.861187+00:00', '--job_id', '39', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/example_subdag_operator.py', '--cfg_path', '/tmp/tmpkdem7boi']
[2020-12-03 05:56:13,483] {standard_task_runner.py:78} INFO - Job 39: Subtask section-2
[2020-12-03 05:56:13,658] {logging_mixin.py:112} INFO - Running %s on host %s <TaskInstance: example_subdag_operator.section-2 2020-12-03T05:55:04.861187+00:00 [running]> airflow-demo-worker-0.airflow-demo-worker.airflow.svc.cluster.local
[2020-12-03 05:56:14,265] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_subdag_operator.section-2', 'section-2-task-1', '2020-12-03T05:55:04.861187+00:00', '--local', '--pool', 'default_pool', '-sd', 'DAGS_FOLDER/example_subdag_operator.py', '--cfg_path', '/tmp/tmp8kky3o06']
[2020-12-03 05:56:14,539] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_subdag_operator.section-2', 'section-2-task-2', '2020-12-03T05:55:04.861187+00:00', '--local', '--pool', 'default_pool', '-sd', 'DAGS_FOLDER/example_subdag_operator.py', '--cfg_path', '/tmp/tmp3uhiaf8x']
[2020-12-03 05:56:14,820] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_subdag_operator.section-2', 'section-2-task-3', '2020-12-03T05:55:04.861187+00:00', '--local', '--pool', 'default_pool', '-sd', 'DAGS_FOLDER/example_subdag_operator.py', '--cfg_path', '/tmp/tmpjwipf0vo']
[2020-12-03 05:56:14,836] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_subdag_operator.section-2', 'section-2-task-1', '2020-12-03T05:55:04.861187+00:00', '--local', '--pool', 'default_pool', '-sd', 'DAGS_FOLDER/example_subdag_operator.py', '--cfg_path', '/tmp/tmp8kky3o06']
[2020-12-03 05:56:22,546] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_subdag_operator.section-2', 'section-2-task-2', '2020-12-03T05:55:04.861187+00:00', '--local', '--pool', 'default_pool', '-sd', 'DAGS_FOLDER/example_subdag_operator.py', '--cfg_path', '/tmp/tmp3uhiaf8x']
[2020-12-03 05:56:35,125] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_subdag_operator.section-2', 'section-2-task-3', '2020-12-03T05:55:04.861187+00:00', '--local', '--pool', 'default_pool', '-sd', 'DAGS_FOLDER/example_subdag_operator.py', '--cfg_path', '/tmp/tmpjwipf0vo']
[2020-12-03 05:56:42,893] {taskinstance.py:1150} ERROR - (_mysql_exceptions.OperationalError) (1213, 'Deadlock found when trying to get lock; try restarting transaction')
[SQL: UPDATE task_instance SET state=%s, queued_dttm=%s WHERE task_instance.task_id = %s AND task_instance.dag_id = %s AND task_instance.execution_date = %s]
[parameters: ('queued', datetime.datetime(2020, 12, 3, 5, 56, 14, 601700), 'section-2-task-3', 'example_subdag_operator.section-2', datetime.datetime(2020, 12, 3, 5, 55, 4, 861187))]
(Background on this error at: http://sqlalche.me/e/13/e3q8)
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
    cursor.execute(statement, parameters)
  File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 255, in execute
    self.errorhandler(self, exc, value)
  File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/connections.py", line 50, in defaulterrorhandler
    raise errorvalue
  File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 252, in execute
    res = self._query(query)
  File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 378, in _query
    db.query(q)
  File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/connections.py", line 280, in query
    _mysql.connection.query(self, query)
_mysql_exceptions.OperationalError: (1213, 'Deadlock found when trying to get lock; try restarting transaction')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/operators/subdag_operator.py", line 102, in execute
    executor=self.executor)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/dag.py", line 1432, in run
    job.run()
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/base_job.py", line 218, in run
    self._execute()
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/backfill_job.py", line 787, in _execute
    session=session)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/backfill_job.py", line 717, in _execute_for_run_dates
    session=session)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/backfill_job.py", line 586, in _process_backfill_task_instances
    _per_task_process(task, key, ti)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/backfill_job.py", line 509, in _per_task_process
    session.commit()
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 1042, in commit
    self.transaction.commit()
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 504, in commit
    self._prepare_impl()
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 483, in _prepare_impl
    self.session.flush()
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2523, in flush
    self._flush(objects)
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2664, in _flush
    transaction.rollback(_capture_exception=True)
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 69, in __exit__
    exc_value, with_traceback=exc_tb,
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2624, in _flush
    flush_context.execute()
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
    rec.execute(self)
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 589, in execute
    uow,
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 236, in save_obj
    update,
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 995, in _emit_update_statements
    statement, multiparams
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
    return meth(self, multiparams, params)
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1130, in _execute_clauseelement
    distilled_params,
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
    e, statement, parameters, cursor, context
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
    sqlalchemy_exception, with_traceback=exc_info[2], from_=e
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
    cursor.execute(statement, parameters)
  File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 255, in execute
    self.errorhandler(self, exc, value)
  File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/connections.py", line 50, in defaulterrorhandler
    raise errorvalue
  File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 252, in execute
    res = self._query(query)
  File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 378, in _query
    db.query(q)
  File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/connections.py", line 280, in query
    _mysql.connection.query(self, query)
sqlalchemy.exc.OperationalError: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found when trying to get lock; try restarting transaction')
[SQL: UPDATE task_instance SET state=%s, queued_dttm=%s WHERE task_instance.task_id = %s AND task_instance.dag_id = %s AND task_instance.execution_date = %s]
[parameters: ('queued', datetime.datetime(2020, 12, 3, 5, 56, 14, 601700), 'section-2-task-3', 'example_subdag_operator.section-2', datetime.datetime(2020, 12, 3, 5, 55, 4, 861187))]
(Background on this error at: http://sqlalche.me/e/13/e3q8)
[2020-12-03 05:56:42,899] {taskinstance.py:1194} INFO - Marking task as FAILED. dag_id=example_subdag_operator, task_id=section-2, execution_date=20201203T055504, start_date=20201203T055613, end_date=20201203T055642
[2020-12-03 05:56:43,465] {local_task_job.py:102} INFO - Task exited with return code 1


{code}
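The error text itself points at the standard client-side mitigation: restart the transaction. A minimal sketch of that idea (my own hypothetical helper, not anything from Airflow; assumes a SQLAlchemy session factory):
{code:python}
# Hypothetical retry helper for MySQL error 1213 ("Deadlock found when
# trying to get lock; try restarting transaction"). The whole unit of work
# is re-run on each attempt, because rolling back a SQLAlchemy session
# discards its pending changes.
import time

from sqlalchemy.exc import OperationalError

MYSQL_DEADLOCK = 1213

def run_with_retry(session_factory, work, max_retries=3, backoff=0.5):
    for attempt in range(1, max_retries + 1):
        session = session_factory()
        try:
            work(session)       # re-apply the changes on every attempt
            session.commit()
            return
        except OperationalError as err:
            session.rollback()  # the transaction is dead; start over
            deadlock = getattr(err.orig, "args", (None,))[0] == MYSQL_DEADLOCK
            if not deadlock or attempt == max_retries:
                raise
            time.sleep(backoff * attempt)  # back off a little before retrying
        finally:
            session.close()
{code}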
Output of {{SHOW ENGINE INNODB STATUS}}:
{code}
mysql> SHOW ENGINE INNODB STATUS \G
*************************** 1. row ***************************
  Type: InnoDB
  Name:
Status:
=====================================
2020-12-03 06:28:01 0x7f23791d4700 INNODB MONITOR OUTPUT
=====================================
Per second averages calculated from the last 18 seconds
-----------------
BACKGROUND THREAD
-----------------
srv_master_thread loops: 64332 srv_active, 0 srv_shutdown, 507795 srv_idle
srv_master_thread log flush and writes: 0
----------
SEMAPHORES
----------
OS WAIT ARRAY INFO: reservation count 16669
OS WAIT ARRAY INFO: signal count 15166
RW-shared spins 56, rounds 61, OS waits 5
RW-excl spins 179, rounds 4346, OS waits 127
RW-sx spins 1, rounds 30, OS waits 0
Spin rounds per wait: 1.09 RW-shared, 24.28 RW-excl, 30.00 RW-sx
------------------------
LATEST DETECTED DEADLOCK
------------------------
2020-12-03 05:56:14 0x7f237f4f8700
*** (1) TRANSACTION:
TRANSACTION 706710, ACTIVE 0 sec fetching rows
mysql tables in use 2, locked 2
LOCK WAIT 10 lock struct(s), heap size 1136, 13 row lock(s)
MySQL thread id 287250, OS thread handle 139790332544768, query id 7702769 10.240.0.64 airflow_demo executing
SELECT task_instance.try_number AS task_instance_try_number, task_instance.task_id AS task_instance_task_id, task_instance.dag_id AS task_instance_dag_id, task_instance.execution_date AS task_instance_execution_date, task_instance.start_date AS task_instance_start_date, task_instance.end_date AS task_instance_end_date, task_instance.duration AS task_instance_duration, task_instance.state AS task_instance_state, task_instance.max_tries AS task_instance_max_tries, task_instance.hostname AS task_instance_hostname, task_instance.unixname AS task_instance_unixname, task_instance.job_id AS task_instance_job_id, task_instance.pool AS task_instance_pool, task_instance.pool_slots AS task_instance_pool_slots, task_instance.queue AS task_instance_queue, task_instance.priority_weight AS task_instance_priority_weight, task_instance.operator AS task_instance_operator, task_instance.queued_dttm AS task_instance_que
*** (1) HOLDS THE LOCK(S):
RECORD LOCKS space id 108 page no 8 n bits 184 index ti_state of table `airflow_demo`.`task_instance` trx id 706710 lock_mode X
Record lock, heap no 19 PHYSICAL RECORD: n_fields 4; compact format; info bits 0
 0: len 9; hex 7363686564756c6564; asc scheduled;;
 1: len 16; hex 73656374696f6e2d312d7461736b2d35; asc section-1-task-5;;
 2: len 30; hex 6578616d706c655f7375626461675f6f70657261746f722e73656374696f; asc example_subdag_operator.sectio; (total 33 bytes);
 3: len 7; hex 5fc58780000000; asc _      ;;
Record lock, heap no 107 PHYSICAL RECORD: n_fields 4; compact format; info bits 0
 0: len 6; hex 717565756564; asc queued;;
 1: len 16; hex 73656374696f6e2d322d7461736b2d32; asc section-2-task-2;;
 2: len 30; hex 6578616d706c655f7375626461675f6f70657261746f722e73656374696f; asc example_subdag_operator.sectio; (total 33 bytes);
 3: len 7; hex 5fc87db80d2403; asc _ }  $ ;;
Record lock, heap no 109 PHYSICAL RECORD: n_fields 4; compact format; info bits 0
 0: len 9; hex 7363686564756c6564; asc scheduled;;
 1: len 16; hex 73656374696f6e2d322d7461736b2d33; asc section-2-task-3;;
 2: len 30; hex 6578616d706c655f7375626461675f6f70657261746f722e73656374696f; asc example_subdag_operator.sectio; (total 33 bytes);
 3: len 7; hex 5fc87db80d2403; asc _ }  $ ;;
Record lock, heap no 112 PHYSICAL RECORD: n_fields 4; compact format; info bits 0
 0: len 6; hex 717565756564; asc queued;;
 1: len 16; hex 73656374696f6e2d322d7461736b2d31; asc section-2-task-1;;
 2: len 30; hex 6578616d706c655f7375626461675f6f70657261746f722e73656374696f; asc example_subdag_operator.sectio; (total 33 bytes);
 3: len 7; hex 5fc87db80d2403; asc _ }  $ ;;
*** (1) WAITING FOR THIS LOCK TO BE GRANTED:
RECORD LOCKS space id 108 page no 4 n bits 128 index PRIMARY of table `airflow_demo`.`task_instance` trx id 706710 lock_mode X locks rec but not gap waiting
Record lock, heap no 55 PHYSICAL RECORD: n_fields 22; compact format; info bits 128
 0: len 16; hex 73656374696f6e2d322d7461736b2d33; asc section-2-task-3;;
 1: len 30; hex 6578616d706c655f7375626461675f6f70657261746f722e73656374696f; asc example_subdag_operator.sectio; (total 33 bytes);
 2: len 7; hex 5fc87db80d2403; asc _ }  $ ;;
 3: len 6; hex 0000000ac894; asc       ;;
 4: len 7; hex 02000001880f85; asc        ;;
 5: len 7; hex 5fc87dfd0d981f; asc _ }    ;;
 6: len 7; hex 5fc87dfd0d982a; asc _ }   *;;
 7: SQL NULL;
 8: len 6; hex 717565756564; asc queued;;
 9: len 4; hex 80000000; asc     ;;
 10: len 0; hex ; asc ;;
 11: len 7; hex 616972666c6f77; asc airflow;;
 12: SQL NULL;
 13: len 12; hex 64656661756c745f706f6f6c; asc default_pool;;
 14: len 7; hex 64656661756c74; asc default;;
 15: len 4; hex 80000001; asc     ;;
 16: len 13; hex 44756d6d794f70657261746f72; asc DummyOperator;;
 17: len 7; hex 5fc87dfe092e64; asc _ }  .d;;
 18: SQL NULL;
 19: len 4; hex 80000000; asc     ;;
 20: len 14; hex 80049503000000000000007d942e; asc            } .;;
 21: len 4; hex 80000001; asc     ;;
*** (2) TRANSACTION:
TRANSACTION 706708, ACTIVE 0 sec updating or deleting
mysql tables in use 1, locked 1
LOCK WAIT 3 lock struct(s), heap size 1136, 2 row lock(s), undo log entries 1
MySQL thread id 292393, OS thread handle 139790334904064, query id 7702770 10.240.0.108 airflow_demo updating
UPDATE task_instance SET state='queued', queued_dttm='2020-12-03 05:56:14.601700' WHERE task_instance.task_id = 'section-2-task-3' AND task_instance.dag_id = 'example_subdag_operator.section-2' AND task_instance.execution_date = '2020-12-03 05:55:04.861187'
*** (2) HOLDS THE LOCK(S):
RECORD LOCKS space id 108 page no 4 n bits 120 index PRIMARY of table `airflow_demo`.`task_instance` trx id 706708 lock_mode X locks rec but not gap
Record lock, heap no 55 PHYSICAL RECORD: n_fields 22; compact format; info bits 128
 0: len 16; hex 73656374696f6e2d322d7461736b2d33; asc section-2-task-3;;
 1: len 30; hex 6578616d706c655f7375626461675f6f70657261746f722e73656374696f; asc example_subdag_operator.sectio; (total 33 bytes);
 2: len 7; hex 5fc87db80d2403; asc _ }  $ ;;
 3: len 6; hex 0000000ac894; asc       ;;
 4: len 7; hex 02000001880f85; asc        ;;
 5: len 7; hex 5fc87dfd0d981f; asc _ }    ;;
 6: len 7; hex 5fc87dfd0d982a; asc _ }   *;;
 7: SQL NULL;
 8: len 6; hex 717565756564; asc queued;;
 9: len 4; hex 80000000; asc     ;;
 10: len 0; hex ; asc ;;
 11: len 7; hex 616972666c6f77; asc airflow;;
 12: SQL NULL;
 13: len 12; hex 64656661756c745f706f6f6c; asc default_pool;;
 14: len 7; hex 64656661756c74; asc default;;
 15: len 4; hex 80000001; asc     ;;
 16: len 13; hex 44756d6d794f70657261746f72; asc DummyOperator;;
 17: len 7; hex 5fc87dfe092e64; asc _ }  .d;;
 18: SQL NULL;
 19: len 4; hex 80000000; asc     ;;
 20: len 14; hex 80049503000000000000007d942e; asc            } .;;
 21: len 4; hex 80000001; asc     ;;
*** (2) WAITING FOR THIS LOCK TO BE GRANTED:
RECORD LOCKS space id 108 page no 8 n bits 184 index ti_state of table `airflow_demo`.`task_instance` trx id 706708 lock_mode X locks rec but not gap waiting
Record lock, heap no 109 PHYSICAL RECORD: n_fields 4; compact format; info bits 0
 0: len 9; hex 7363686564756c6564; asc scheduled;;
 1: len 16; hex 73656374696f6e2d322d7461736b2d33; asc section-2-task-3;;
 2: len 30; hex 6578616d706c655f7375626461675f6f70657261746f722e73656374696f; asc example_subdag_operator.sectio; (total 33 bytes);
 3: len 7; hex 5fc87db80d2403; asc _ }  $ ;;
*** WE ROLL BACK TRANSACTION (2)
------------
TRANSACTIONS
------------
Trx id counter 734549
Purge done for trx's n:o < 734548 undo n:o < 0 state: running but idle
History list length 15
LIST OF TRANSACTIONS FOR EACH SESSION:
---TRANSACTION 421265880635296, not started
0 lock struct(s), heap size 1136, 0 row lock(s)
---TRANSACTION 421265880639576, not started
0 lock struct(s), heap size 1136, 0 row lock(s)
---TRANSACTION 421265880638720, not started
0 lock struct(s), heap size 1136, 0 row lock(s)
---TRANSACTION 421265880637864, not started
0 lock struct(s), heap size 1136, 0 row lock(s)
---TRANSACTION 421265880637008, not started
0 lock struct(s), heap size 1136, 0 row lock(s)
---TRANSACTION 421265880633584, not started
0 lock struct(s), heap size 1136, 0 row lock(s)
---TRANSACTION 421265880632728, not started
0 lock struct(s), heap size 1136, 0 row lock(s)
---TRANSACTION 421265880634440, not started
0 lock struct(s), heap size 1136, 0 row lock(s)
---TRANSACTION 421265880631872, not started
0 lock struct(s), heap size 1136, 0 row lock(s)
---TRANSACTION 421265880631016, not started
0 lock struct(s), heap size 1136, 0 row lock(s)
--------
FILE I/O
--------
I/O thread 0 state: waiting for completed aio requests (insert buffer thread)
I/O thread 1 state: waiting for completed aio requests (log thread)
I/O thread 2 state: waiting for completed aio requests (read thread)
I/O thread 3 state: waiting for completed aio requests (read thread)
I/O thread 4 state: waiting for completed aio requests (read thread)
I/O thread 5 state: waiting for completed aio requests (read thread)
I/O thread 6 state: waiting for completed aio requests (write thread)
I/O thread 7 state: waiting for completed aio requests (write thread)
I/O thread 8 state: waiting for completed aio requests (write thread)
I/O thread 9 state: waiting for completed aio requests (write thread)
Pending normal aio reads: [0, 0, 0, 0] , aio writes: [0, 0, 0, 0] ,
 ibuf aio reads:, log i/o's:, sync i/o's:
Pending flushes (fsync) log: 0; buffer pool: 1
1452 OS file reads, 1279009 OS file writes, 851822 OS fsyncs
0.00 reads/s, 0 avg bytes/read, 22.94 writes/s, 14.78 fsyncs/s
-------------------------------------
INSERT BUFFER AND ADAPTIVE HASH INDEX
-------------------------------------
Ibuf: size 1, free list len 0, seg size 2, 0 merges
merged operations:
 insert 0, delete mark 0, delete 0
discarded operations:
 insert 0, delete mark 0, delete 0
Hash table size 34679, node heap has 1 buffer(s)
Hash table size 34679, node heap has 2 buffer(s)
Hash table size 34679, node heap has 1 buffer(s)
Hash table size 34679, node heap has 2 buffer(s)
Hash table size 34679, node heap has 1 buffer(s)
Hash table size 34679, node heap has 2 buffer(s)
Hash table size 34679, node heap has 2 buffer(s)
Hash table size 34679, node heap has 4 buffer(s)
37.55 hash searches/s, 84.55 non-hash searches/s
---
LOG
---
Log sequence number          258819344
Log buffer assigned up to    258819344
Log buffer completed up to   258819344
Log written up to            258819344
Log flushed up to            258819344
Added dirty pages up to      258819344
Pages flushed up to          252108186
Last checkpoint at           252108186
1181956 log i/o's done, 24.29 log i/o's/second
----------------------
BUFFER POOL AND MEMORY
----------------------
Total large memory allocated 137363456
Dictionary memory allocated 811480
Buffer pool size   8192
Free buffers       5787
Database pages     2390
Old database pages 862
Modified db pages  816
Pending reads      0
Pending writes: LRU 0, flush list 0, single page 0
Pages made young 1830, not young 3088
0.00 youngs/s, 0.00 non-youngs/s
Pages read 1419, created 989, written 67975
0.00 reads/s, 0.00 creates/s, 0.00 writes/s
Buffer pool hit rate 1000 / 1000, young-making rate 0 / 1000 not 0 / 1000
Pages read ahead 0.00/s, evicted without access 0.00/s, Random read ahead 0.00/s
LRU len: 2390, unzip_LRU len: 0
I/O sum[14]:cur[0], unzip sum[0]:cur[0]
--------------
ROW OPERATIONS
--------------
0 queries inside InnoDB, 0 queries in queue
0 read views open inside InnoDB
Process ID=1, Main thread ID=139790402930432 , state=sleeping
Number of rows inserted 53405, updated 249949, deleted 51905, read 1496610
0.94 inserts/s, 4.89 updates/s, 0.94 deletes/s, 32.66 reads/s
Number of system rows inserted 4717, updated 8012, deleted 3768, read 272867
0.00 inserts/s, 0.00 updates/s, 0.00 deletes/s, 3.89 reads/s
----------------------------
END OF INNODB MONITOR OUTPUT
============================
1 row in set (0.00 sec)
{code}
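Reading the LATEST DETECTED DEADLOCK section above: transaction (1) holds X locks on the {{ti_state}} secondary index and waits for the PRIMARY record of section-2-task-3, while transaction (2) holds that PRIMARY record and waits for the same {{ti_state}} entry, so the two sessions acquire the same locks in opposite order. A rough sketch of that collision (illustrative only, not a guaranteed repro: the DSN is a placeholder and the interleaving is timing-dependent):
{code:python}
# Two sessions taking task_instance locks in opposite order, mirroring the
# monitor output: one scans via the ti_state index (index record first,
# PRIMARY record second), the other updates by primary key (PRIMARY record
# first, then the ti_state entry it has to move).
import threading

from sqlalchemy import create_engine, text

engine = create_engine("mysql+mysqldb://user:pass@host/airflow_demo")  # placeholder DSN

def scan_scheduled():
    with engine.connect() as conn:
        with conn.begin():
            conn.execute(text(
                "SELECT 1 FROM task_instance "
                "WHERE state = 'scheduled' FOR UPDATE"))

def queue_task():
    with engine.connect() as conn:
        with conn.begin():
            conn.execute(text(
                "UPDATE task_instance SET state = 'queued' "
                "WHERE task_id = 'section-2-task-3' "
                "AND dag_id = 'example_subdag_operator.section-2'"))

threads = [threading.Thread(target=scan_scheduled),
           threading.Thread(target=queue_task)]
for t in threads:
    t.start()
for t in threads:
    t.join()
{code}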



> Deadlock found when trying to update task_instance table
> --------------------------------------------------------
>
>                 Key: AIRFLOW-2516
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-2516
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DagRun
>    Affects Versions: 1.8.0, 1.9.0, 1.10.0, 1.10.1, 1.10.2, 1.10.3, 1.10.4, 1.10.5, 1.10.6, 1.10.7
>            Reporter: Jeff Liu
>            Assignee: Jarek Potiuk
>            Priority: Major
>             Fix For: 1.10.11
>
>         Attachments: Screenshot 2019-12-30 at 10.42.52.png, image-2019-12-30-10-48-41-313.png, image-2019-12-30-10-58-02-610.png, jobs.py, jobs_fixed_deadlock_possibly_1.9.py, scheduler_job_fixed_deadlock_possibly_1.10.6.py
>
>
>  
>  
> {code:java}
> [2018-05-23 17:59:57,218] {base_task_runner.py:98} INFO - Subtask: [2018-05-23 17:59:57,217] {base_executor.py:49} INFO - Adding to queue: airflow run production_wipeout_wipe_manager.Carat Carat_20180227 2018-05-23T17:41:18.815809 --local -sd DAGS_FOLDER/wipeout/wipeout.py
> [2018-05-23 17:59:57,231] {base_task_runner.py:98} INFO - Subtask: Traceback (most recent call last):
> [2018-05-23 17:59:57,232] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/bin/airflow", line 27, in <module>
> [2018-05-23 17:59:57,232] {base_task_runner.py:98} INFO - Subtask: args.func(args)
> [2018-05-23 17:59:57,232] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
> [2018-05-23 17:59:57,232] {base_task_runner.py:98} INFO - Subtask: pool=args.pool,
> [2018-05-23 17:59:57,233] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
> [2018-05-23 17:59:57,233] {base_task_runner.py:98} INFO - Subtask: result = func(*args, **kwargs)
> [2018-05-23 17:59:57,233] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 1532, in _run_raw_task
> [2018-05-23 17:59:57,234] {base_task_runner.py:98} INFO - Subtask: self.handle_failure(e, test_mode, context)
> [2018-05-23 17:59:57,234] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 1641, in handle_failure
> [2018-05-23 17:59:57,234] {base_task_runner.py:98} INFO - Subtask: session.merge(self)
> [2018-05-23 17:59:57,235] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 1920, in merge
> [2018-05-23 17:59:57,235] {base_task_runner.py:98} INFO - Subtask: _resolve_conflict_map=_resolve_conflict_map)
> [2018-05-23 17:59:57,235] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 1974, in _merge
> [2018-05-23 17:59:57,236] {base_task_runner.py:98} INFO - Subtask: merged = self.query(mapper.class_).get(key[1])
> [2018-05-23 17:59:57,236] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 882, in get
> [2018-05-23 17:59:57,236] {base_task_runner.py:98} INFO - Subtask: ident, loading.load_on_pk_identity)
> [2018-05-23 17:59:57,236] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 952, in _get_impl
> [2018-05-23 17:59:57,237] {base_task_runner.py:98} INFO - Subtask: return db_load_fn(self, primary_key_identity)
> [2018-05-23 17:59:57,237] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/loading.py", line 247, in load_on_pk_identity
> [2018-05-23 17:59:57,237] {base_task_runner.py:98} INFO - Subtask: return q.one()
> [2018-05-23 17:59:57,238] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2884, in one
> [2018-05-23 17:59:57,238] {base_task_runner.py:98} INFO - Subtask: ret = self.one_or_none()
> [2018-05-23 17:59:57,238] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2854, in one_or_none
> [2018-05-23 17:59:57,238] {base_task_runner.py:98} INFO - Subtask: ret = list(self)
> [2018-05-23 17:59:57,239] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2925, in __iter__
> [2018-05-23 17:59:57,239] {base_task_runner.py:98} INFO - Subtask: return self._execute_and_instances(context)
> [2018-05-23 17:59:57,239] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2946, in _execute_and_instances
> [2018-05-23 17:59:57,240] {base_task_runner.py:98} INFO - Subtask: close_with_result=True)
> [2018-05-23 17:59:57,240] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2955, in _get_bind_args
> [2018-05-23 17:59:57,240] {base_task_runner.py:98} INFO - Subtask: **kw
> [2018-05-23 17:59:57,240] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py", line 2937, in _connection_from_session
> [2018-05-23 17:59:57,241] {base_task_runner.py:98} INFO - Subtask: conn = self.session.connection(**kw)
> [2018-05-23 17:59:57,241] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 1035, in connection
> [2018-05-23 17:59:57,241] {base_task_runner.py:98} INFO - Subtask: execution_options=execution_options)
> [2018-05-23 17:59:57,241] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 1040, in _connection_for_bind
> [2018-05-23 17:59:57,242] {base_task_runner.py:98} INFO - Subtask: engine, execution_options)
> [2018-05-23 17:59:57,242] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 388, in _connection_for_bind
> [2018-05-23 17:59:57,242] {base_task_runner.py:98} INFO - Subtask: self._assert_active()
> [2018-05-23 17:59:57,243] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 276, in _assert_active
> [2018-05-23 17:59:57,243] {base_task_runner.py:98} INFO - Subtask: % self._rollback_exception
> [2018-05-23 17:59:57,244] {base_task_runner.py:98} INFO - Subtask: sqlalchemy.exc.InvalidRequestError: This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found when trying to get lock; try restarting transaction') [SQL: u'UPDATE task_instance SET state=%s WHERE task_instance.task_id = %s AND task_instance.dag_id = %s AND task_instance.execution_date = %s'] [parameters: (u'queued', 'Carat_20180227', 'production_wipeout_wipe_manager.Carat', datetime.datetime(2018, 5, 23, 17, 41, 18, 815809))] (Background on this error at: http://sqlalche.me/e/e3q8)
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)