Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/06/09 03:21:00 UTC

[GitHub] [airflow] Laydas opened a new issue #16344: Tried to upgrade to Airflow 2.1.0

Laydas opened a new issue #16344:
URL: https://github.com/apache/airflow/issues/16344


   
   
   
   **Apache Airflow version**: 2.1.0
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   
   **Environment**: 
   
   - **Cloud provider or hardware configuration**: AWS EC2 
   - **OS** (e.g. from /etc/os-release): Ubuntu 20.04
   - **Kernel** (e.g. `uname -a`): Linux ip-172-31-73-251 5.8.0-1035-aws #37~20.04.1-Ubuntu SMP Tue Jun 1 09:54:15 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
   - **Install tools**: pip
   - **Others**:
   
   **What happened**:
   
   Followed the upgrade instructions in the documentation and upgraded from Airflow 2.0.1 to 2.1.0.
   
   Started the webserver service and logged into the UI, and received the following screen:
   
   Something bad has happened.
   Please consider letting us know by creating a bug report using GitHub.
   
   Python version: 3.8.5
   Airflow version: 2.1.0
   Node: ip-172-31-73-251.ec2.internal
   -------------------------------------------------------------------------------
   Traceback (most recent call last):
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
       self.dialect.do_execute(
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
       cursor.execute(statement, parameters)
   psycopg2.errors.UndefinedColumn: column dag.last_parsed_time does not exist
   LINE 1: ...AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_p...
                                                                ^
   
   
   The above exception was the direct cause of the following exception:
   
   Traceback (most recent call last):
     File "/home/ubuntu/.local/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
       response = self.full_dispatch_request()
     File "/home/ubuntu/.local/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
       rv = self.handle_user_exception(e)
     File "/home/ubuntu/.local/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
       reraise(exc_type, exc_value, tb)
     File "/home/ubuntu/.local/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
       raise value
     File "/home/ubuntu/.local/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
       rv = self.dispatch_request()
     File "/home/ubuntu/.local/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
       return self.view_functions[rule.endpoint](**req.view_args)
     File "/home/ubuntu/.local/lib/python3.8/site-packages/airflow/www/auth.py", line 34, in decorated
       return func(*args, **kwargs)
     File "/home/ubuntu/.local/lib/python3.8/site-packages/airflow/www/views.py", line 547, in index
       filter_dag_ids = current_app.appbuilder.sm.get_accessible_dag_ids(g.user)
     File "/home/ubuntu/.local/lib/python3.8/site-packages/airflow/www/security.py", line 298, in get_accessible_dag_ids
       return {dag.dag_id for dag in accessible_dags}
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 3535, in __iter__
       return self._execute_and_instances(context)
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 3560, in _execute_and_instances
       result = conn.execute(querycontext.statement, self._params)
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
       return meth(self, multiparams, params)
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
       return connection._execute_clauseelement(self, multiparams, params)
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1124, in _execute_clauseelement
       ret = self._execute_context(
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context
       self._handle_dbapi_exception(
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1510, in _handle_dbapi_exception
       util.raise_(
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
       raise exception
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
       self.dialect.do_execute(
     File "/home/ubuntu/.local/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
       cursor.execute(statement, parameters)
   sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedColumn) column dag.last_parsed_time does not exist
   LINE 1: ...AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_p...
                                                                ^
   
   [SQL: SELECT dag.dag_id AS dag_dag_id, dag.root_dag_id AS dag_root_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_parsed_time AS dag_last_parsed_time, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners, dag.description AS dag_description, dag.default_view AS dag_default_view, dag.schedule_interval AS dag_schedule_interval, dag.concurrency AS dag_concurrency, dag.has_task_concurrency_limits AS dag_has_task_concurrency_limits, dag.next_dagrun AS dag_next_dagrun, dag.next_dagrun_create_after AS dag_next_dagrun_create_after 
   FROM dag]
   (Background on this error at: http://sqlalche.me/e/13/f405)
   
   **What you expected to happen**:
   
   Load the UI properly
   
   **How to reproduce it**:
   
   When running Airflow 2.0.1:
   ```
   wget https://raw.githubusercontent.com/apache/airflow/constraints-2.1.0/constraints-3.8.txt
   pip install --upgrade apache-airflow[postgres]==2.1.0 --constraint constraints-3.8.txt
   ```
   
   **Anything else we need to know**:
   





[GitHub] [airflow] mik-laj commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
mik-laj commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-857345953


   Did you do a db migration?





[GitHub] [airflow] stephenonethree commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
stephenonethree commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-929472726


   Update: In my case I was able to fix it by:
   
   1) Changing my Pipfile (which I use to control the environment where I have Airflow installed) to `apache-airflow-providers-postgres="==2.0.0"`. Previously I had it pinned at `1.0.2`.
   2) Rebuilding the local virtualenv (`pipenv --rm` and `pipenv install`).
   3) Rerunning `airflow db upgrade`.
   
   I'm not certain whether bumping the provider package helped (I'm not sure whether `airflow db upgrade` uses it), or whether the problem was simply solved by rerunning the upgrade.
   
   For what it's worth, I'm using `Postgres 12.7` locally.





[GitHub] [airflow] uranusjr commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
uranusjr commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-872650767


   Sorry I was too vague; I was trying to say the `alter_table` function triggering a foreign key constraint error sounds like a bug to me. Alembic explicitly supports abstraction over SQLite’s lack of `ALTER TABLE`, and that abstraction is failing here.
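   
   For context, the rename in question is performed through Alembic's batch mode, which on SQLite recreates the table and copies its rows; per the traceback above, the `DROP TABLE dag` issued during that recreate is where the foreign key check fires. A rough sketch of that kind of migration step (illustrative only, not necessarily the exact code in Airflow's `2e42bb497a22` revision):
   
   ```python
   # Sketch: renaming dag.last_scheduler_run -> dag.last_parsed_time via Alembic
   # batch mode; on SQLite this recreates the "dag" table behind the scenes.
   from alembic import op
   
   def upgrade():
       with op.batch_alter_table("dag") as batch_op:
           batch_op.alter_column("last_scheduler_run", new_column_name="last_parsed_time")
   ```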





[GitHub] [airflow] dimakaB commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
dimakaB commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-858598208


   @mik-laj I have the same issue and have tried running `airflow db init`, `upgrade`, and `reset`. Those columns are missing from the DB. It works fine for me in a local container, but not with Postgres in an ECS Fargate container.





[GitHub] [airflow] stephenonethree commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
stephenonethree commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-929414210


   I am also experiencing this issue when upgrading from Airflow 2.0.2 to 2.1.2. The error comes up when running `airflow db upgrade` itself.
   
   ```
   (airflow) szzz@MacBook-Pro airflow % airflow db upgrade
   --- Logging error ---
   Traceback (most recent call last):
     File "/Users/szzz/.local/share/virtualenvs/airflow-abcd/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
       self.dialect.do_execute(
     File "/Users/szzz/.local/share/virtualenvs/airflow-abcd/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
       cursor.execute(statement, parameters)
   psycopg2.errors.UndefinedColumn: column variable.description does not exist
   LINE 1: ...e_id, public.variable.key AS public_variable_key, public.var...
   ```





[GitHub] [airflow] potiuk commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-926827796


   Did you run `airflow db upgrade` as part of the upgrade? https://airflow.apache.org/docs/apache-airflow/stable/installation/setting-up-the-database.html





[GitHub] [airflow] uranusjr commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
uranusjr commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-872637674


   This sounds like a bug in Alembic. Would you mind reporting this on its issue tracker?





[GitHub] [airflow] boring-cyborg[bot] commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
boring-cyborg[bot] commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-857340943


   Thanks for opening your first issue here! Be sure to follow the issue template!
   





[GitHub] [airflow] stephenonethree commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
stephenonethree commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-930346798


   Here are some further details, although this might be difficult to track down.
   
   I have always had 1 copy of the Airflow database. I had Airflow 1.x in my copy of Python 3.6 in `pyenv` (no `virtualenv`). For Airflow 2.x I decided to create a `virtualenv` in my Airflow folder using Python 3.8. So I did that, and merged my settings into the new airflow.cfg. My upgrade from 1.10.15 to Airflow 2.0.2 went fine. It's only once I upgraded to 2.1.2 that I had problems.
   
   I think I was probably running the correct Airflow, because the `(airflow)` prefix in my prompt means I had run `pipenv shell` to enter my 2.1.2 virtualenv (`airflow` is the name of the virtualenv, which is shown as a shell prefix once `pipenv shell` is active; I didn't have a virtualenv for 1.10.15).
   
   I thought it was strange to see an error message `column variable.description does not exist` because my understanding was that `airflow db upgrade` was going to create that column, which is a new column. So the error message seemed a bit "backwards".
   
   Anyway, my own problem is resolved. Unfortunately I guess this might not be enough information to reproduce, since now I am successfully upgraded.
   
   I would be happy to send along the logs for `airflow db upgrade` but I'm not sure where they are stored (I looked in `logs` but didn't see anything). If Airflow doesn't currently store logs for that command, it might be good to start, to help with future user upgrade issues.





[GitHub] [airflow] potiuk edited a comment on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
potiuk edited a comment on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-930390265


   My wild guess is that you made the first upgrade as a different user, or with a different setting for HOME or AIRFLOW_HOME. Airflow (when you use sqlite) keeps its data in a sqlite database file stored at `${AIRFLOW_HOME}/airflow.db`, or, if AIRFLOW_HOME is not set, at `${HOME}/airflow/airflow.db`. It's very likely that when you ran it the first time you had it set differently or incorrectly. If you did, you will find a migrated airflow.db file somewhere, created more or less around the time you ran the upgrade for the first time.
   
   Another possibility is that you had AIRFLOW__CORE__UNIT_TEST_MODE (or the corresponding option in `airflow.cfg`) set to True. If you did, then Airflow uses a different file (unittest.db) in the same location as above, and again you will likely find that file lying around somewhere.
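   
   For illustration, the path resolution described above boils down to roughly the following (a simplified sketch, not Airflow's actual configuration code):
   
   ```python
   # Simplified sketch: which sqlite file a default configuration would point at.
   import os
   
   airflow_home = os.environ.get("AIRFLOW_HOME", os.path.join(os.path.expanduser("~"), "airflow"))
   unit_test_mode = os.environ.get("AIRFLOW__CORE__UNIT_TEST_MODE", "False").lower() == "true"
   db_file = os.path.join(airflow_home, "unittest.db" if unit_test_mode else "airflow.db")
   print(f"sqlite:///{db_file}")  # compare against the DB your other runs actually used
   ```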





[GitHub] [airflow] jj8huang commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
jj8huang commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-926773110


   Was there a solution to this issue? I have been trying to migrate to Airflow 2.1.4 and am getting this when running the upgrade in the webserver:
   
   `sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedColumn) column dag.last_scheduler_run does not exist`
   
   I'm not sure how to fix it.





[GitHub] [airflow] dmeibusch commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
dmeibusch commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-872634600


   Looking into this, `alembic` assumes that sqlite3 doesn't support the column rename and attempts to recreate the `dag` table, hitting the foreign key constraint. Newer versions of sqlite3 do support renaming a column; however, I couldn't determine whether newer `alembic` versions take advantage of that.
   
   My workaround was to apply the column rename manually:
   ```
   $ airflow db shell
   ALTER TABLE dag RENAME COLUMN last_scheduler_run TO last_parsed_time;
   ```
   Then comment out the upgrade in `2e42bb497a22_rename_last_scheduler_run_column.py` in Airflow, and then run `airflow db upgrade`. That seems to have worked.
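   
   For anyone following along, "comment out the upgrade" here means neutralising the `upgrade()` step of that migration locally, since the rename has already been applied by hand. A hypothetical sketch of such a local edit (a workaround only, not an upstream fix):
   
   ```python
   # Hypothetical local edit to 2e42bb497a22_rename_last_scheduler_run_column.py:
   # the column was already renamed manually, so skip the batch_alter_table block.
   def upgrade():
       pass  # rename applied manually via ALTER TABLE dag RENAME COLUMN ...
   ```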





[GitHub] [airflow] github-actions[bot] commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
github-actions[bot] commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-893056827


   This issue has been automatically marked as stale because it has been open for 30 days with no response from the author. It will be closed in next 7 days if no further activity occurs from the issue author.





[GitHub] [airflow] uranusjr commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
uranusjr commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-872845056


   Ah OK, that makes sense, thanks. We’ll need a PR to fix the migration file then.





[GitHub] [airflow] stephenonethree commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
stephenonethree commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-929652642


   @potiuk I think there may be a misunderstanding - what I'm saying is that I upgraded my Airflow package, ran `airflow db upgrade` after upgrading and got the error. Then I followed the 3 steps in my previous comment and it worked. 
   
   So the first time after upgrading it did not work, after that it started working.





[GitHub] [airflow] jj8huang commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
jj8huang commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-927963260


   Yes I did! And I ended up getting this error: `Can't locate revision identified by 'ccde3e26fe78'`. Also worth noting that I couldn't find any logs in AWS of the other migrations running...





[GitHub] [airflow] dimakaB edited a comment on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
dimakaB edited a comment on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-858598208


   @mik-laj I have the same issue and have tried running `airflow db init`, `upgrade`, and `reset`. Those columns are missing from the DB. It works fine for me in a local container, but not with Postgres in an ECS Fargate container.
   UPDATE: after dropping all tables in the DB and running `airflow db init`, the error no longer appears.





[GitHub] [airflow] dmeibusch commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
dmeibusch commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-872644175


   I'm not sure that is possible. The drop/recreate happens at the table level; it would be an optimisation (?) for alembic to detect that an alter operation contains only a column rename, which could then be done directly without recreating the table. One possibility is for Airflow to detect the dialect (and sqlite version) and run the `ALTER COLUMN` directly. I didn't understand all the ramifications of the drop/recreate process, but I think that with sqlite, alembic struggles to drop/recreate constraints when they are unnamed, which I believe is the default. So maybe another approach is to check whether Airflow is naming all of its constraints. This might be worthwhile for supporting any future migrations with sqlite databases.
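   
   To make the last idea concrete, here is a sketch of the kind of SQLAlchemy naming convention that gives every constraint a deterministic name (illustrative values, not Airflow's current metadata setup):
   
   ```python
   # Sketch: a naming convention attached to the model MetaData so constraints
   # get predictable names that later batch migrations can refer to.
   from sqlalchemy import MetaData
   
   naming_convention = {
       "ix": "ix_%(column_0_label)s",
       "uq": "uq_%(table_name)s_%(column_0_name)s",
       "ck": "ck_%(table_name)s_%(constraint_name)s",
       "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s",
       "pk": "pk_%(table_name)s",
   }
   metadata = MetaData(naming_convention=naming_convention)
   ```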





[GitHub] [airflow] dmeibusch commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
dmeibusch commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-872652467


   Ah, yes, I was also not clear. There are specific circumstances, documented at https://alembic.sqlalchemy.org/en/latest/batch.html , where the abstraction is leaky; see the section [Dealing with Constraints](https://alembic.sqlalchemy.org/en/latest/batch.html#dealing-with-constraints). It looks like Airflow will have to take specific steps.
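   
   One of the options in that section is passing a naming convention to the batch operation, so that unnamed constraints on the reflected table get names the recreate step can work with; roughly along these lines (a sketch based on the Alembic docs, not an actual Airflow migration):
   
   ```python
   # Sketch: batch_alter_table with a naming_convention so that constraints which
   # were created unnamed can still be addressed when SQLite recreates the table.
   from alembic import op
   
   naming_convention = {
       "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s",
   }
   
   def upgrade():
       with op.batch_alter_table("dag", naming_convention=naming_convention) as batch_op:
           batch_op.alter_column("last_scheduler_run", new_column_name="last_parsed_time")
   ```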





[GitHub] [airflow] potiuk commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-929641813


   It was `airflow db upgrade`. See https://airflow.apache.org/docs/apache-airflow/stable/installation/setting-up-the-database.html where it is specified that you should run `airflow db upgrade` every time you upgrade airflow.





[GitHub] [airflow] monti-python commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
monti-python commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-872965332


   @Laydas I had the same issue when I updated from 2.0.1 to 2.1.0; I fixed it by running `airflow db upgrade` from the scheduler.





[GitHub] [airflow] dmeibusch commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
dmeibusch commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-872622407


   Upgrading from Airflow 2.0.1 to 2.1.0 using a sqlite database, the db migration fails with a foreign key constraint error.
   
   ```
   $> pip install --index https://artifactory.oci.oraclecorp.com/api/pypi/global-dev-pypi/simple 'apache-airflow[apache.hive,apache.spark,jenkins,oracle,redis,virtualenv,http,ssh,slack,ldap]==2.1.0' --constraint constraints-airflow-2.1.0-python-3.8.txt
   $> airflow db upgrade
   DB: sqlite:////Users/dmeibusc/ws/osint/airflow-conf/airflow.db
   [2021-07-02 09:57:05,177] {db.py:695} INFO - Creating tables
   INFO  [alembic.runtime.migration] Context impl SQLiteImpl.
   INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
   INFO  [alembic.runtime.migration] Running upgrade 82b7c48c147f -> 449b4072c2da, Increase size of connection.extra field to handle multiple RSA keys
   INFO  [alembic.runtime.migration] Running upgrade 449b4072c2da -> 8646922c8a04, Change default pool_slots to 1
   INFO  [alembic.runtime.migration] Running upgrade 8646922c8a04 -> 2e42bb497a22, rename last_scheduler_run column
   Traceback (most recent call last):
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
       self.dialect.do_execute(
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
       cursor.execute(statement, parameters)
   sqlite3.IntegrityError: FOREIGN KEY constraint failed
   
   The above exception was the direct cause of the following exception:
   
   Traceback (most recent call last):
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/bin/airflow", line 8, in <module>
       sys.exit(main())
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
       args.func(args)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
       return func(*args, **kwargs)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/airflow/utils/cli.py", line 91, in wrapper
       return f(*args, **kwargs)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/airflow/cli/commands/db_command.py", line 48, in upgradedb
       db.upgradedb()
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/airflow/utils/db.py", line 705, in upgradedb
       command.upgrade(config, 'heads')
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/command.py", line 294, in upgrade
       script.run_env()
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/script/base.py", line 490, in run_env
       util.load_python_file(self.dir, "env.py")
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/util/pyfiles.py", line 97, in load_python_file
       module = load_module_py(module_id, path)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/util/compat.py", line 182, in load_module_py
       spec.loader.exec_module(module)
     File "<frozen importlib._bootstrap_external>", line 783, in exec_module
     File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/airflow/migrations/env.py", line 116, in <module>
       run_migrations_online()
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/airflow/migrations/env.py", line 107, in run_migrations_online
       context.run_migrations()
     File "<string>", line 8, in run_migrations
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/runtime/environment.py", line 813, in run_migrations
       self.get_context().run_migrations(**kw)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/runtime/migration.py", line 561, in run_migrations
       step.migration_fn(**kw)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/airflow/migrations/versions/2e42bb497a22_rename_last_scheduler_run_column.py", line 48, in upgrade
       batch_op.alter_column(
     File "/Users/dmeibusc/.pyenv/versions/3.8.6/lib/python3.8/contextlib.py", line 120, in __exit__
       next(self.gen)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/operations/base.py", line 336, in batch_alter_table
       impl.flush()
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/operations/batch.py", line 119, in flush
       batch_impl._create(self.impl)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/operations/batch.py", line 391, in _create
       op_impl.drop_table(self.table)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/ddl/impl.py", line 297, in drop_table
       self._exec(schema.DropTable(table))
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/alembic/ddl/impl.py", line 146, in _exec
       return conn.execute(construct, multiparams)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
       return meth(self, multiparams, params)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/sqlalchemy/sql/ddl.py", line 72, in _execute_on_connection
       return connection._execute_ddl(self, multiparams, params)
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1068, in _execute_ddl
       ret = self._execute_context(
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context
       self._handle_dbapi_exception(
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1510, in _handle_dbapi_exception
       util.raise_(
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
       raise exception
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
       self.dialect.do_execute(
     File "/Users/dmeibusc/ws/osint/airflow-conf/.venv/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
       cursor.execute(statement, parameters)
   sqlalchemy.exc.IntegrityError: (sqlite3.IntegrityError) FOREIGN KEY constraint failed
   [SQL: 
   DROP TABLE dag]
   ```





[GitHub] [airflow] github-actions[bot] closed issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
github-actions[bot] closed issue #16344:
URL: https://github.com/apache/airflow/issues/16344


   





[GitHub] [airflow] github-actions[bot] commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
github-actions[bot] commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-898053797


   This issue has been closed because it has not received response from the issue author.





[GitHub] [airflow] potiuk commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-929657878


   > So the first time after upgrading it did not work, after that it started working.
   
   Are you sure you ran it against the right DB and that it succeeded? I believe you might have upgraded a different DB the first time.





[GitHub] [airflow] potiuk commented on issue #16344: Tried to upgrade to Airflow 2.1.0

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #16344:
URL: https://github.com/apache/airflow/issues/16344#issuecomment-929659679


   There is no way that running `db upgrade` with the new Airflow and the right DB gives a different result than running it a second time, so what I am saying is that you could have run it in a different environment, against a different DB, or with a previous version of Airflow.
   
   It would be great if you could get to the bottom of it, because maybe the sequence of operations (upgrading airflow/db etc.) was simply wrong when you tried it the first time (or maybe our docs are wrong about it and we should correct them?)

