Posted to commits@airflow.apache.org by ep...@apache.org on 2022/04/25 10:34:16 UTC

[airflow-site] branch new_2.3.0 created (now 251022861)

This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch new_2.3.0
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


      at 251022861 Add 2.3.0 blog post

This branch includes the following new commits:

     new 251022861 Add 2.3.0 blog post

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[airflow-site] 01/01: Add 2.3.0 blog post

Posted by ep...@apache.org.

ephraimanierobi pushed a commit to branch new_2.3.0
in repository https://gitbox.apache.org/repos/asf/airflow-site.git

commit 25102286119f251ad31791cca0b4d6575aa8faae
Author: Ephraim Anierobi <sp...@gmail.com>
AuthorDate: Mon Apr 25 11:33:13 2022 +0100

    Add 2.3.0 blog post
---
 .../content/en/blog/airflow-2.3.0/grid-view.png    | Bin 0 -> 1136078 bytes
 .../site/content/en/blog/airflow-2.3.0/index.md    | 140 +++++++++++++++++++++
 2 files changed, 140 insertions(+)

diff --git a/landing-pages/site/content/en/blog/airflow-2.3.0/grid-view.png b/landing-pages/site/content/en/blog/airflow-2.3.0/grid-view.png
new file mode 100644
index 000000000..59a0cfa4d
Binary files /dev/null and b/landing-pages/site/content/en/blog/airflow-2.3.0/grid-view.png differ
diff --git a/landing-pages/site/content/en/blog/airflow-2.3.0/index.md b/landing-pages/site/content/en/blog/airflow-2.3.0/index.md
new file mode 100644
index 000000000..9e221b54d
--- /dev/null
+++ b/landing-pages/site/content/en/blog/airflow-2.3.0/index.md
@@ -0,0 +1,140 @@
+---
+title: "Apache Airflow 2.3.0 is here"
+linkTitle: "Apache Airflow 2.3.0 is here"
+author: "Ephraim Anierobi"
+github: "ephraimbuddy"
+linkedin: "ephraimanierobi"
+description: "We're proud to announce that Apache Airflow 2.3.0 has been released."
+tags: [Release]
+date: "TBD"
+---
+
+Apache Airflow 2.3.0 contains over TBD commits since 2.2.0 and includes TBD new features, TBD improvements, TBD bug fixes, and several doc changes.
+
+**Details**:
+
+πŸ“¦ PyPI: https://pypi.org/project/apache-airflow/2.3.0/ \
+πŸ“š Docs: https://airflow.apache.org/docs/apache-airflow/2.3.0/ \
+πŸ› οΈ Changelog: https://airflow.apache.org/docs/apache-airflow/2.3.0/changelog.html \
+🐳 Docker Image: docker pull apache/airflow:2.3.0 \
+🚏 Constraints: https://github.com/apache/airflow/tree/constraints-2.3.0
+
+As the changelog is quite large, the following are some notable new features that shipped in this release.
+
+## Dynamic Task Mapping (AIP-42)
+
+There's now first-class support for dynamic tasks in Airflow: tasks can be generated dynamically at runtime. It works much like using a `for` loop to create a list of tasks, except that you don't need to know the exact number of tasks ahead of time.
+
+Crucially, an upstream `task` can generate the list to iterate over, which is not possible with a plain `for` loop in the DAG file.
+
+Here is an example:
+
+```python
+from datetime import datetime
+
+from airflow import DAG
+from airflow.decorators import task
+
+
+@task
+def make_list():
+    # This can also be from an API call, checking a database, -- almost anything you like, as long as the
+    # resulting list/dictionary can be stored in the current XCom backend.
+    return [1, 2, {"a": "b"}, "str"]
+
+
+@task
+def consumer(arg):
+    print(list(arg))
+
+
+with DAG(dag_id="dynamic-map", start_date=datetime(2022, 4, 2)) as dag:
+    consumer.expand(arg=make_list())
+```
+
+More information can be found here: [Dynamic Task Mapping](https://airflow.apache.org/docs/apache-airflow/latest/concepts/dynamic-task-mapping.html)
+
+## Grid View replaces Tree View
+
+The new Grid view replaces the Tree view in Airflow 2.3.0. It displays DAG runs as columns and the tasks within each run as rows, making run history much easier to scan.
+
+**Screenshots**:
+![The new grid view](grid-view.png)
+
+
+## LocalKubernetesExecutor
+
+Airflow 2.3.0 features a new executor, `LocalKubernetesExecutor`. It lets a single deployment run some tasks with the `LocalExecutor` and others with the `KubernetesExecutor`, using each task's `queue` parameter to decide which executor handles the task.
+
+More information can be found here: [LocalKubernetesExecutor](https://airflow.apache.org/docs/apache-airflow/latest/executor/local_kubernetes.html)
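+As a sketch, the switch is driven by configuration plus each task's `queue` parameter (the `kubernetes_queue` option name below is taken from the 2.3 configuration reference; treat this fragment as illustrative):
+
+```ini
+[core]
+executor = LocalKubernetesExecutor
+
+[local_kubernetes_executor]
+# Tasks whose queue matches this value run on the KubernetesExecutor;
+# every other task runs on the LocalExecutor.
+kubernetes_queue = kubernetes
+```
+
+A task opts into Kubernetes by passing `queue="kubernetes"` to its operator; all other tasks stay on the local executor.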
+
+
+## DagProcessorManager as standalone process (AIP-43)
+
+As of 2.3.0, you can run the DagProcessorManager as a standalone process. Because the DagProcessorManager executes user code, separating it from the scheduler and running it as an independent process, potentially on a different host, is a good idea.
+
+To do this, first set `AIRFLOW__SCHEDULER__STANDALONE_DAG_PROCESSOR` to `True`, then start the DagProcessorManager with the new `airflow dag-processor` CLI command.
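+A minimal sketch of the setup (a CLI/configuration fragment; adapt process supervision and paths to your deployment):
+
+```bash
+# Tell the scheduler that DAG parsing happens in a standalone process.
+export AIRFLOW__SCHEDULER__STANDALONE_DAG_PROCESSOR=True
+
+# Start the standalone DagProcessorManager; it needs access to the same
+# DAGs folder and metadata database as the scheduler.
+airflow dag-processor
+```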
+
+More information can be found here: [dag-processor CLI command](https://airflow.apache.org/docs/apache-airflow/latest/cli-and-env-variables-ref.html#dag-processor)
+
+## JSON serialization for connections
+
+You can now create connections using the `json` serialization format, in addition to the existing Airflow URI format.
+
+```bash
+airflow connections add 'my_prod_db' \
+    --conn-json '{
+        "conn_type": "my-conn-type",
+        "login": "my-login",
+        "password": "my-password",
+        "host": "my-host",
+        "port": 1234,
+        "schema": "my-schema",
+        "extra": {
+            "param1": "val1",
+            "param2": "val2"
+        }
+    }'
+```
+You can also use the `json` serialization format when defining a connection in an environment variable.
+
+More information can be found here: [JSON serialization for connections](https://airflow.apache.org/docs/apache-airflow/latest/howto/connection.html)
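+For example, the same connection can be supplied through an environment variable, since Airflow reads connections from variables named `AIRFLOW_CONN_<CONN_ID>`. Here is a small stdlib-only sketch that builds the JSON value for the `my_prod_db` connection from the CLI example above:
+
+```python
+import json
+import os
+
+# The same connection definition as the CLI example; "extra" carries
+# arbitrary provider-specific parameters.
+conn = {
+    "conn_type": "my-conn-type",
+    "login": "my-login",
+    "password": "my-password",
+    "host": "my-host",
+    "port": 1234,
+    "schema": "my-schema",
+    "extra": {"param1": "val1", "param2": "val2"},
+}
+
+# Airflow derives the connection id from the variable name:
+# AIRFLOW_CONN_MY_PROD_DB maps to conn_id "my_prod_db".
+os.environ["AIRFLOW_CONN_MY_PROD_DB"] = json.dumps(conn)
+```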
+
+## Airflow `db downgrade` and Offline generation of SQL scripts
+
+Airflow 2.3.0 introduces a new command, `airflow db downgrade`, that downgrades the metadata database to your chosen version.
+
+You can also generate the downgrade/upgrade SQL scripts for your database and run them manually, or simply view the SQL that the downgrade/upgrade command would execute.
+
+More information can be found here: [Airflow `db downgrade` and Offline generation of SQL scripts](https://airflow.apache.org/docs/apache-airflow/latest/usage-cli.html#downgrading-airflow)
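+A sketch of the workflow (flag names as documented for Airflow 2.3; confirm with `airflow db downgrade --help` on your installation):
+
+```bash
+# Preview the SQL a downgrade to the 2.2.0 schema would run, without executing it.
+airflow db downgrade --to-version "2.2.0" --show-sql-only
+
+# Apply the downgrade to the metadata database.
+airflow db downgrade --to-version "2.2.0"
+```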
+
+## Reuse of decorated tasks
+
+You can now reuse decorated tasks across your DAG files. A decorated task has an `override` method that allows you to override its arguments, such as `task_id`.
+
+Here's an example:
+
+```python
+from datetime import datetime
+
+from airflow.decorators import dag, task
+
+
+@task
+def add_task(x, y):
+    print(f"Task args: x={x}, y={y}")
+    return x + y
+
+
+@dag(start_date=datetime(2022, 1, 1))
+def mydag():
+    start = add_task.override(task_id="start")(1, 2)
+    for i in range(3):
+        start >> add_task.override(task_id=f"add_start_{i}")(start, i)
+```
+
+More information can be found here: [Reuse of decorated DAGs](https://airflow.apache.org/docs/apache-airflow/latest/tutorial_taskflow_api.html#reusing-a-decorated-task)
+
+## Other small features
+
+This isn’t a comprehensive list, but some noteworthy or interesting small features include:
+
+- Support for a different parsing timeout per DAG file
+- `airflow dags reserialize` command to reserialize DAGs
+- `db clean` CLI command for purging old data
+- Events Timetable
+- `SmoothOperator`, an operator that does literally nothing except log a
+    YouTube link to Sade's song "Smooth Operator"
+
+## Contributors
+Thanks to everyone who contributed to this release: TBD
+