Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/08/15 19:43:25 UTC

[GitHub] [airflow] josh-fell commented on a diff in pull request #25657: Update code examples from "classic" operators to TaskFlow decorators (faq, best practices, index)

josh-fell commented on code in PR #25657:
URL: https://github.com/apache/airflow/pull/25657#discussion_r946065430


##########
docs/apache-airflow/best-practices.rst:
##########
@@ -135,26 +135,21 @@ Bad example:
       catchup=False,
       tags=["example"],
   ) as dag:
-
+      @task()
       def print_array():

Review Comment:
   The example should show calling `print_array()` as well. Otherwise the task _technically_ doesn't get added to the DAG. Users, especially beginners I suspect, could copy/paste this code and not understand why the `print_array` task doesn't appear in the DAG.
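   For reference, a minimal sketch of what the corrected snippet could look like. The `dag_id`, `start_date`, and the body of `print_array` below are assumptions for illustration; `catchup` and `tags` come from the hunk context:
   ```python
   from datetime import datetime

   from airflow import DAG
   from airflow.decorators import task

   with DAG(
       dag_id="example_dag",  # assumed; the hunk does not show the DAG id
       start_date=datetime(2022, 1, 1),  # assumed
       catchup=False,
       tags=["example"],
   ) as dag:

       @task()
       def print_array():
           """Body assumed for illustration."""
           array = [1, 2, 3]
           print(array)
           return array

       # Calling the decorated function is what actually registers the
       # task on the DAG; the decorator alone only defines it.
       print_array()
   ```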



##########
docs/apache-airflow/best-practices.rst:
##########
@@ -163,7 +158,7 @@ Good example:
       catchup=False,
       tags=["example"],
   ) as dag:
-
+      @task()
       def print_array():

Review Comment:
   Same here. Let's make sure the code snippet actually calls `print_array()`.



##########
docs/apache-airflow/index.rst:
##########
@@ -40,24 +40,27 @@ Take a look at the following snippet of code:
     from datetime import datetime
 
     from airflow import DAG
+    from airflow.decorators import task
     from airflow.operators.bash import BashOperator
-    from airflow.operators.python import PythonOperator
 
     # A DAG represents a workflow, a collection of tasks
     with DAG(dag_id="demo", start_date=datetime(2022, 1, 1), schedule="0 0 * * *") as dag:
 
         # Tasks are represented as operators
         hello = BashOperator(task_id="hello", bash_command="echo hello")
-        airflow = PythonOperator(task_id="airflow", python_callable=lambda: print("airflow"))
+
+        @task()
+        def airflow():
+            print("airflow")
 
         # Set dependencies between tasks
-        hello >> airflow
+        hello >> airflow()
 
 
 Here you see:
 
 - A DAG named "demo", starting on Jan 1st 2022 and running once a day. A DAG is Airflow's representation of a workflow.
-- Two tasks, a BashOperator running a Bash script and a PythonOperator running a Python script
+- Two tasks, a BashOperator running a Bash script and a python function defined using the ``@task`` decorator

Review Comment:
   ```suggestion
   - Two tasks, a BashOperator running a Bash script and a Python function defined using the ``@task`` decorator
   ```
   Small nit here.
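   One more note for readers of the new snippet: calling a `@task`-decorated function returns an XComArg, so the instantiated task can also be captured in a variable before wiring dependencies. A minimal sketch, assuming the same `demo` DAG from the hunk; the variable name is hypothetical:
   ```python
   from datetime import datetime

   from airflow import DAG
   from airflow.decorators import task
   from airflow.operators.bash import BashOperator

   with DAG(dag_id="demo", start_date=datetime(2022, 1, 1), schedule="0 0 * * *") as dag:
       hello = BashOperator(task_id="hello", bash_command="echo hello")

       @task()
       def airflow():
           print("airflow")

       # The call instantiates the task; keeping the returned XComArg in a
       # variable (name hypothetical) makes the dependency line read like
       # the classic operator style.
       airflow_task = airflow()
       hello >> airflow_task
   ```
   Either form works; `hello >> airflow()` inlines the call, which is what the hunk above does.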



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org