Posted to commits@airflow.apache.org by "Kamil Choudhury (Jira)" <ji...@apache.org> on 2020/05/04 04:37:00 UTC

[jira] [Commented] (AIRFLOW-4761) Airflow Task Clear function throws error

    [ https://issues.apache.org/jira/browse/AIRFLOW-4761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17098649#comment-17098649 ] 

Kamil Choudhury commented on AIRFLOW-4761:
------------------------------------------

I'm still running into this on 1.10.10 with Python 2.7.10, and I can reproduce it programmatically.
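For what it's worth, here is a minimal, stdlib-only sketch of what I believe is happening, based on the last few frames of the traceback quoted below (copy.copy -> _reconstruct -> copy_reg.__newobj__). On Python 2.7, copy.copy() has no handler for bound methods (instancemethods), so it falls back to __reduce_ex__(2), whose reconstructor calls cls.__new__(cls) with no arguments and fails. The Helper class here is purely illustrative:

{code:python}
# Python 2.7 sketch of the failure; Helper is a hypothetical class.
import copy


class Helper(object):
    def run(self):
        pass


bound = Helper().run  # a bound method, i.e. an instancemethod on Python 2

# copy.copy() cannot reconstruct an instancemethod:
copy.copy(bound)
# TypeError: instancemethod expected at least 2 arguments, got 0
{code}

Since BaseOperator.__deepcopy__ shallow-copies every operator attribute (the airflow/models/__init__.py line 2492 frame below), any operator in the DAG that holds a bound method should hit this as soon as the web UI's Clear calls DAG.sub_dag().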


I'm not too familiar with Airflow internals, but I'm happy to walk through it with anyone more experienced. Thanks!
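In the meantime, one workaround that should sidestep it (my assumption, not an official fix) is to keep bound methods off operators entirely: copy.copy() returns plain module-level functions unchanged, so wrapping the method call in a function avoids the instancemethod copy. All names here (run_helper, my_task, dag) are hypothetical:

{code:python}
# Hypothetical workaround sketch: give the operator a module-level
# function instead of a bound method; copy.copy() returns functions as-is.
from airflow.operators.python_operator import PythonOperator


def run_helper():
    # Construct the object inside the task instead of binding a method
    # to the operator (Helper as in the sketch above).
    Helper().run()


task = PythonOperator(
    task_id='my_task',           # hypothetical task id
    python_callable=run_helper,  # plain function: safe to (deep)copy
    dag=dag,                     # assumes an existing DAG object named `dag`
)
{code}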

> Airflow Task Clear function throws error
> ----------------------------------------
>
>                 Key: AIRFLOW-4761
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-4761
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DAG, DagRun
>    Affects Versions: 1.10.3
>         Environment: CentOS 7, Python 2.7.10
>            Reporter: Ben Storrie
>            Priority: Major
>
> When using the airflow webserver to clear a task inside a dagrun, an error is thrown on certain types of tasks:
>  
> {code:python}
> Traceback (most recent call last):
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask/app.py", line 2311, in wsgi_app
>     response = self.full_dispatch_request()
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask/app.py", line 1834, in full_dispatch_request
>     rv = self.handle_user_exception(e)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask/app.py", line 1737, in handle_user_exception
>     reraise(exc_type, exc_value, tb)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask/app.py", line 1832, in full_dispatch_request
>     rv = self.dispatch_request()
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask/app.py", line 1818, in dispatch_request
>     return self.view_functions[rule.endpoint](**req.view_args)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask_admin/base.py", line 69, in inner
>     return self._run_view(f, *args, **kwargs)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask_admin/base.py", line 368, in _run_view
>     return fn(self, *args, **kwargs)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/flask_login/utils.py", line 261, in decorated_view
>     return func(*args, **kwargs)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/www/utils.py", line 275, in wrapper
>     return f(*args, **kwargs)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/www/utils.py", line 322, in wrapper
>     return f(*args, **kwargs)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/www/views.py", line 1202, in clear
>     include_upstream=upstream)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/models/__init__.py", line 3830, in sub_dag
>     dag = copy.deepcopy(self)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 174, in deepcopy
>     y = copier(memo)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/models/__init__.py", line 3815, in __deepcopy__
>     setattr(result, k, copy.deepcopy(v, memo))
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 163, in deepcopy
>     y = copier(x, memo)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 257, in _deepcopy_dict
>     y[deepcopy(key, memo)] = deepcopy(value, memo)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 174, in deepcopy
>     y = copier(memo)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/site-packages/airflow/models/__init__.py", line 2492, in __deepcopy__
>     setattr(result, k, copy.copy(v))
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 96, in copy
>     return _reconstruct(x, rv, 0)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy.py", line 329, in _reconstruct
>     y = callable(*args)
>   File "/opt/my-miniconda/miniconda/envs/my-hadoop-airflow/lib/python2.7/copy_reg.py", line 93, in __newobj__
>     return cls.__new__(cls, *args)
> TypeError: instancemethod expected at least 2 arguments, got 0
> {code}
>  
> I had expected the fix for AIRFLOW-2060 to resolve this after upgrading to 1.10.3:
> {code:java}
> (my-hadoop-airflow) [user@hostname ~]$ pip freeze | grep pendulum
> pendulum==1.4.4{code}
> 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)