Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/01/18 19:12:41 UTC

[GitHub] [airflow] kaxil commented on a change in pull request #13728: Adds automated user creation in production image

kaxil commented on a change in pull request #13728:
URL: https://github.com/apache/airflow/pull/13728#discussion_r559756162



##########
File path: docs/apache-airflow/production-deployment.rst
##########
@@ -18,62 +18,70 @@
 Production Deployment
 ^^^^^^^^^^^^^^^^^^^^^
 
-It is time to deploy your DAG in production. To do this, first, you need to make sure that the Airflow is itself production-ready.
-Let's see what precautions you need to take.
+It is time to deploy your DAG in production. To do this, first, you need to make sure that Airflow
+itself is production-ready. Let's see what precautions you need to take.
 
 Database backend
 ================
 
-Airflow comes with an ``SQLite`` backend by default. This allows the user to run Airflow without any external database.
-However, such a setup is meant to be used for testing purposes only; running the default setup in production can lead to data loss in multiple scenarios.
-If you want to run production-grade Airflow, make sure you :doc:`configure the backend <howto/set-up-database>` to be an external database such as PostgreSQL or MySQL.
+Airflow comes with a ``SQLite`` backend by default. This allows the user to run Airflow without any external
+database. However, such a setup is meant to be used for testing purposes only; running the default setup
+in production can lead to data loss in multiple scenarios. If you want to run production-grade Airflow,
+make sure you :doc:`configure the backend <howto/set-up-database>` to be an external database
+such as PostgreSQL or MySQL.
 
 You can change the backend using the following config:
 
 .. code-block:: ini
 
- [core]
- sql_alchemy_conn = my_conn_string
+    [core]
+    sql_alchemy_conn = my_conn_string
 
 Once you have changed the backend, Airflow needs to create all the tables required for operation.
 Create an empty DB and give Airflow's user the permission to ``CREATE/ALTER`` it.
 Once that is done, you can run:
 
 .. code-block:: bash
 
- airflow db upgrade
+    airflow db upgrade
 
 ``upgrade`` keeps track of migrations already applied, so it's safe to run as often as you need.
 
 .. note::
 
- Do not use ``airflow db init`` as it can create a lot of default connections, charts, etc. which are not required in production DB.
+    Do not use ``airflow db init`` as it can create a lot of default connections, charts, etc. which are not
+    required in a production DB.
 
 
 Multi-Node Cluster
 ==================
 
-Airflow uses :class:`airflow.executors.sequential_executor.SequentialExecutor` by default. However, by its nature, the user is limited to executing at most
-one task at a time. ``Sequential Executor`` also pauses the scheduler when it runs a task, hence not recommended in a production setup.
-You should use the :class:`Local executor <airflow.executors.local_executor.LocalExecutor>` for a single machine.
-For a multi-node setup, you should use the :doc:`Kubernetes executor <../executor/kubernetes>` or the :doc:`Celery executor <../executor/celery>`.
+Airflow uses :class:`airflow.executors.sequential_executor.SequentialExecutor` by default. However, by it

Review comment:
   ```suggestion
   Airflow uses :class:`~airflow.executors.sequential_executor.SequentialExecutor` by default. However, by it
   ```
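
For context on the ``sql_alchemy_conn`` setting documented in the hunk above, a concrete value might look like the following sketch. The user, password, host, port, and database name are illustrative placeholders, not values from the PR:

```ini
[core]
# Example SQLAlchemy connection string for a PostgreSQL metadata database.
# "airflow_user", "airflow_pass", "localhost", and "airflow_db" are
# placeholders; substitute your own credentials and host.
sql_alchemy_conn = postgresql+psycopg2://airflow_user:airflow_pass@localhost:5432/airflow_db
```

The general shape is SQLAlchemy's ``dialect+driver://user:password@host:port/dbname`` URL format, so an equivalent MySQL string would swap in a MySQL dialect and driver.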




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org