Posted to commits@airflow.apache.org by po...@apache.org on 2024/02/29 08:45:50 UTC

(airflow) branch main updated: Refactor DatasetOrTimeSchedule timetable docs (#37771)

This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new 1e11443ed4 Refactor DatasetOrTimeSchedule timetable docs (#37771)
1e11443ed4 is described below

commit 1e11443ed479276fd765868c818bd59de96eeb86
Author: vatsrahul1001 <43...@users.noreply.github.com>
AuthorDate: Thu Feb 29 14:15:43 2024 +0530

    Refactor DatasetOrTimeSchedule timetable docs (#37771)
    
    
    ---------
    
    Co-authored-by: Ankit Chaurasia <86...@users.noreply.github.com>
---
 .../authoring-and-scheduling/timetable.rst         | 30 +++++-----------------
 1 file changed, 6 insertions(+), 24 deletions(-)

diff --git a/docs/apache-airflow/authoring-and-scheduling/timetable.rst b/docs/apache-airflow/authoring-and-scheduling/timetable.rst
index 5ea18b36c4..78234910a6 100644
--- a/docs/apache-airflow/authoring-and-scheduling/timetable.rst
+++ b/docs/apache-airflow/authoring-and-scheduling/timetable.rst
@@ -179,34 +179,15 @@ first) event for the data interval, otherwise manual runs will run with a ``data
 
 .. _dataset-timetable-section:
 
-DatasetTimetable
-^^^^^^^^^^^^^^^^
+Dataset event-based scheduling with time-based scheduling
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Combining conditional dataset expressions with time-based schedules enhances scheduling flexibility.
 
-The ``DatasetTimetable`` is a specialized timetable allowing for the scheduling of DAGs based on both time-based schedules and dataset events. It facilitates the creation of scheduled runs (as per traditional timetables) and dataset-triggered runs, which operate independently.
+``DatasetOrTimeSchedule`` is a specialized timetable that schedules DAGs on both a time-based schedule and dataset events. It creates scheduled runs (as with traditional timetables) and dataset-triggered runs, which operate independently.
 
 This feature is particularly useful in scenarios where a DAG needs to run on dataset updates and also at periodic intervals. It ensures that the workflow remains responsive to data changes and consistently runs regular checks or updates.
 
-Here's an example of a DAG using ``DatasetTimetable``:
-
-.. code-block:: python
-
-    from airflow.timetables.dataset import DatasetTimetable
-    from airflow.timetables.trigger import CronTriggerTimetable
-
-
-    @dag(
-        schedule=DatasetTimetable(time=CronTriggerTimetable("0 1 * * 3", timezone="UTC"), event=[dag1_dataset])
-        # Additional arguments here, replace this comment with actual arguments
-    )
-    def example_dag():
-        # DAG tasks go here
-        pass
-
-In this example, the DAG is scheduled to run every Wednesday at 01:00 UTC based on the ``CronTriggerTimetable``, and it is also triggered by updates to ``dag1_dataset``.
-
-Integrate conditional dataset with Time-Based Scheduling
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Combining conditional dataset expressions with time-based schedules enhances scheduling flexibility:
+Here's an example of a DAG using ``DatasetOrTimeSchedule``:
 
 .. code-block:: python
 
@@ -225,6 +206,7 @@ Combining conditional dataset expressions with time-based schedules enhances sch
         pass
 
 
+
 Timetables comparisons
 ----------------------
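
For readers following this refactor from the archive, here is a minimal, self-contained sketch of the pattern the updated section documents. It assumes ``DatasetOrTimeSchedule`` is importable from ``airflow.timetables.datasets`` and accepts ``timetable`` and ``datasets`` arguments (as documented for Airflow 2.9); the dataset URIs, DAG id, and task body are illustrative only and are not part of this commit:

.. code-block:: python

    from datetime import datetime

    from airflow.datasets import Dataset
    from airflow.decorators import dag, task
    from airflow.timetables.datasets import DatasetOrTimeSchedule
    from airflow.timetables.trigger import CronTriggerTimetable

    # Illustrative upstream datasets; any dataset URIs produced by other DAGs work here.
    dag1_dataset = Dataset("s3://bucket/dag1/output.txt")
    dag2_dataset = Dataset("s3://bucket/dag2/output.txt")


    @dag(
        # Run every Wednesday at 01:00 UTC, or whenever the dataset condition is met.
        # Parameter names assume the Airflow 2.9 signature of DatasetOrTimeSchedule.
        schedule=DatasetOrTimeSchedule(
            timetable=CronTriggerTimetable("0 1 * * 3", timezone="UTC"),
            # Conditional dataset expression: an update to either dataset triggers a run.
            datasets=(dag1_dataset | dag2_dataset),
        ),
        start_date=datetime(2024, 1, 1),
        catchup=False,
    )
    def example_dag():
        @task
        def process():
            print("Triggered by the weekly schedule or by a dataset update")

        process()


    example_dag()

As the updated section notes, the time-based runs and the dataset-triggered runs created by this timetable operate independently of each other.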