Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/08/17 04:28:10 UTC

[GitHub] [airflow] DreamyWen edited a comment on issue #8181: Duplicate key value violates unique constraint "task_instance_pkey" error

DreamyWen edited a comment on issue #8181:
URL: https://github.com/apache/airflow/issues/8181#issuecomment-899983073


   I encountered this on Airflow 2.1.1. The DAGs were triggered by a Java app with the same dag_id and execution_date:
   
   2021-08-15 00:31:16 [pool-1-thread-1] INFO  c.s.e.bi.executor.MainExecutor - Waiting for dag script_dag jobId ad9435cd-459e-40c3-b9ef-2ee654641b06 to finish..
   2021-08-15 00:31:16 [pool-1-thread-1] INFO  c.s.e.bi.executor.MainExecutor - python_info:
   /data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/configuration.py:346 DeprecationWarning: The hide_sensitive_variable_fields option in [admin] has been moved to the hide_sensitive_var_conn_fields option in [core] - the old setting has been used, but please update your config.
   /data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/configuration.py:346 DeprecationWarning: The default_queue option in [celery] has been moved to the default_queue option in [operators] - the old setting has been used, but please update your config.
   /data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/configuration.py:346 DeprecationWarning: The default_queue option in [celery] has been moved to the default_queue option in [operators] - the old setting has been used, but please update your config.
   [2021-08-15 00:31:16,050] {__init__.py:38} INFO - Loaded API auth backend: <module 'airflow.api.auth.backend.deny_all' from '/data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/api/auth/backend/deny_all.py'>
   /data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/plugins_manager.py:239 DeprecationWarning: This decorator is deprecated.
   In previous versions, all subclasses of BaseOperator must use apply_default decorator for the `default_args` feature to work properly.
   In current version, it is optional. The decorator is applied automatically using the metaclass.
   Traceback (most recent call last):
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
       self.dialect.do_execute(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
       cursor.execute(statement, parameters)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/MySQLdb/cursors.py", line 255, in execute
       self.errorhandler(self, exc, value)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/MySQLdb/connections.py", line 50, in defaulterrorhandler
       raise errorvalue
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/MySQLdb/cursors.py", line 252, in execute
       res = self._query(query)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/MySQLdb/cursors.py", line 378, in _query
       db.query(q)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/MySQLdb/connections.py", line 280, in query
       _mysql.connection.query(self, query)
   _mysql_exceptions.IntegrityError: (1062, "Duplicate entry 'script_dag-2021-08-14 16:31:16.000000' for key 'dag_id'")

   The above exception was the direct cause of the following exception:

   Traceback (most recent call last):
     File "/data/anaconda3/envs/airflow/bin/airflow", line 8, in <module>
       sys.exit(main())
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
       args.func(args)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
       return func(*args, **kwargs)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/utils/cli.py", line 91, in wrapper
       return f(*args, **kwargs)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/cli/commands/dag_command.py", line 129, in dag_trigger
       message = api_client.trigger_dag(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/api/client/local_client.py", line 29, in trigger_dag
       dag_run = trigger_dag.trigger_dag(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/api/common/experimental/trigger_dag.py", line 117, in trigger_dag
       triggers = _trigger_dag(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/api/common/experimental/trigger_dag.py", line 83, in _trigger_dag
       trigger = _dag.create_dagrun(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/utils/session.py", line 70, in wrapper
       return func(*args, session=session, **kwargs)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/models/dag.py", line 1796, in create_dagrun
       session.flush()
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/orm/session.py", line 2540, in flush
       self._flush(objects)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/orm/session.py", line 2682, in _flush
       transaction.rollback(_capture_exception=True)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/util/langhelpers.py", line 68, in __exit__
       compat.raise_(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
       raise exception
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/orm/session.py", line 2642, in _flush
       flush_context.execute()
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
       rec.execute(self)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/orm/unitofwork.py", line 586, in execute
       persistence.save_obj(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py", line 239, in save_obj
       _emit_insert_statements(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py", line 1135, in _emit_insert_statements
       result = cached_connections[connection].execute(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
       return meth(self, multiparams, params)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
       return connection._execute_clauseelement(self, multiparams, params)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1124, in _execute_clauseelement
       ret = self._execute_context(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context
       self._handle_dbapi_exception(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1510, in _handle_dbapi_exception
       util.raise_(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
       raise exception
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
       self.dialect.do_execute(
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
       cursor.execute(statement, parameters)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/MySQLdb/cursors.py", line 255, in execute
       self.errorhandler(self, exc, value)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/MySQLdb/connections.py", line 50, in defaulterrorhandler
       raise errorvalue
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/MySQLdb/cursors.py", line 252, in execute
       res = self._query(query)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/MySQLdb/cursors.py", line 378, in _query
       db.query(q)
     File "/data/anaconda3/envs/airflow/lib/python3.8/site-packages/MySQLdb/connections.py", line 280, in query
       _mysql.connection.query(self, query)
   sqlalchemy.exc.IntegrityError: (_mysql_exceptions.IntegrityError) (1062, "Duplicate entry 'script_dag-2021-08-14 16:31:16.000000' for key 'dag_id'")
   [SQL: INSERT INTO dag_run (dag_id, execution_date, start_date, end_date, state, run_id, creating_job_id, external_trigger, run_type, conf, last_scheduling_decision, dag_hash) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)]
   [parameters: ('script_dag', datetime.datetime(2021, 8, 14, 16, 31, 16), datetime.datetime(2021, 8, 14, 16, 31, 16, 165205), None, 'running', 'ad9435cd-459e-40c3-b9ef-2ee654641b06', None, 1, <DagRunType.MANUAL: 'manual'>, b'\x80\x05\x95\xad\x03\x00\x00\x00\x00\x00\x00}\x94(\x8c\x04key1\x94X\xc0\x02\x00\x00{"biz_type":1,"create_time":"2021-08-1423:14:03","create_user_cn" ... (1057 characters truncated) ... 94\x8c\x04key3\x94\x8c\x06online\x94\x8c\tuse_queue\x94\x8c\x1aonline_script_10.41.143.17\x94\x8c\x0bredis_queue\x94\x8c\x13airflow_eval_status\x94u.', None, '21462d7982e52b0a4c60655939353519')]
   (Background on this error at: http://sqlalche.me/e/13/gkpj)
   2021-08-15 00:31:16 [pool-1-thread-1] INFO  c.s.e.b.s.service.AirflowService - For this request, local ip [10.41.1
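
   The MySQL error 1062 in the log is the database rejecting a second dag_run row with the same (dag_id, execution_date). Below is a minimal, self-contained sketch (using sqlite3, not Airflow's real schema; table and helper names are illustrative) that reproduces this failure mode and shows one client-side mitigation, namely giving each trigger request a distinct execution_date:

   ```python
   import sqlite3

   # Illustrative schema only: the real Airflow dag_run table differs, but it
   # likewise enforces uniqueness on (dag_id, execution_date).
   conn = sqlite3.connect(":memory:")
   conn.execute(
       "CREATE TABLE dag_run ("
       "  dag_id TEXT NOT NULL,"
       "  execution_date TEXT NOT NULL,"
       "  run_id TEXT NOT NULL,"
       "  UNIQUE (dag_id, execution_date)"
       ")"
   )

   def trigger_dag(dag_id, execution_date, run_id):
       """Insert a dag_run row; a duplicate (dag_id, execution_date) raises
       sqlite3.IntegrityError, analogous to MySQL error 1062 in the log."""
       with conn:
           conn.execute(
               "INSERT INTO dag_run (dag_id, execution_date, run_id) VALUES (?, ?, ?)",
               (dag_id, execution_date, run_id),
           )

   trigger_dag("script_dag", "2021-08-14 16:31:16", "run-1")
   try:
       # Second trigger with the same dag_id + execution_date: duplicate key.
       trigger_dag("script_dag", "2021-08-14 16:31:16", "run-2")
   except sqlite3.IntegrityError as exc:
       print("duplicate run rejected:", exc)

   # One mitigation for a client that may fire concurrent triggers: include
   # sub-second precision (or another unique component) in execution_date so
   # two requests in the same second do not collide.
   trigger_dag("script_dag", "2021-08-14 16:31:16.000001", "run-3")
   ```

   The same reasoning applies to the Java app in the report: two triggers for script_dag landing in the same second produce identical execution_date values, so the second INSERT into dag_run fails with the duplicate-key error.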


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org