Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/12/12 11:12:28 UTC

[GitHub] [airflow] potiuk opened a new pull request, #28300: Add Public API description to Airflow documentation

potiuk opened a new pull request, #28300:
URL: https://github.com/apache/airflow/pull/28300

   <!--
   Thank you for contributing! Please make sure that your code changes
   are covered with tests. And in case of new features or big changes
   remember to adjust the documentation.
   
   Feel free to ping committers for the review!
   
   In case of an existing issue, reference it using one of the following:
   
   closes: #ISSUE
   related: #ISSUE
   
   How to write a good git commit message:
   http://chris.beams.io/posts/git-commit/
   -->
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] BasPH commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
BasPH commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046105195


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers,
+and are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.

Review Comment:
   This whole section is vague to me, I'd just remove it. I think it's self-evident that when something is not mentioned as part of the public API, it isn't public.
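
   For context on the "Operator APIs" bullet quoted in the hunk above, here is a minimal, hedged sketch of a custom operator built on ``BaseOperator``; the class, task and DAG names are illustrative (not from the PR), and the ``schedule`` argument assumes Airflow 2.4+ syntax:

   ```python
   # Sketch of a custom operator using the public BaseOperator API.
   # HelloOperator, hello_dag and hello_task are made-up names for illustration.
   from datetime import datetime

   from airflow import DAG
   from airflow.models.baseoperator import BaseOperator


   class HelloOperator(BaseOperator):
       """Toy operator that logs a greeting; replace execute() with real work."""

       def __init__(self, name: str, **kwargs):
           super().__init__(**kwargs)
           self.name = name

       def execute(self, context):
           self.log.info("Hello, %s!", self.name)
           return self.name  # returned value is pushed to XCom automatically


   with DAG(dag_id="hello_dag", start_date=datetime(2022, 12, 1), schedule=None) as dag:
       HelloOperator(task_id="hello_task", name="Airflow")
   ```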



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers,
+and are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.

Review Comment:
   Going forward, I suggest dropping "stable" and just refer to it as "the REST API".
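
   As an illustration of the REST API discussed in the hunk above, a hedged sketch of a simple client call follows; the endpoint path matches the stable REST API (``/api/v1``), while the host, credentials and the assumption that the basic-auth backend is enabled are deployment-specific:

   ```python
   # Sketch: listing DAGs over the REST API. Host and credentials are assumptions.
   import requests

   BASE_URL = "http://localhost:8080/api/v1"  # assumed local webserver

   resp = requests.get(f"{BASE_URL}/dags", auth=("admin", "admin"), timeout=10)
   resp.raise_for_status()
   for dag in resp.json()["dags"]:
       print(dag["dag_id"], "paused:", dag["is_paused"])
   ```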



##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,49 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are a number of ways to extend the capabilities of Apache Airflow. Some common methods
+for extending Airflow include:

Review Comment:
   "extending" has already been mentioned in the first sentence. So would leave out "for extending Airflow".
   
   Also, using "some common" suggests to me there are more methods. I'd make it explicit and write:
   
   ```suggestion
   There are several ways to extend the capabilities of Apache Airflow:
   ```



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow

Review Comment:
   To me "Public API" suggest a REST API and I think "Public interfaces" suits better.



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact

Review Comment:
   "Public" isn't a name so would stick to lowercase.
   
   ```suggestion
   The public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
   ```



##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,49 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are a number of ways to extend the capabilities of Apache Airflow. Some common methods
+for extending Airflow include:
+
+* Writing :doc:`Custom Operators <howto/custom-operator>`_ : Operators are the building blocks of DAGs in
+  Airflow, and can be used to perform specific tasks within a DAG. By writing custom operators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Writing :doc:`Custom Decorators <howto/create-custom-decorator>`_ : Custom decorators are a way to
+  extend the functionality of the Airflow ``@task`` decorator. By writing custom decorators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Creating custom :doc:`plugins`: Airflow plugins are extensions to the core system that provide additional
+  features and functionality. By creating custom plugins, you can add new capabilities to Airflow,
+  such as adding custom UI components, creating custom macros, creating your own TimeTables,
+  adding Operator Extra Links, Listeners.
+
+* Writing custom :doc:`provider packages <apache-airflow-providers:index>`: Providers are the components
+  of Airflow that are responsible for connecting to and interacting with external services. By writing
+  custom providers, you can add support for new types of external services, such as databases or
+  cloud services, to Airflow.
+
+With all the above methods, you can use the :doc:`public-airflow-api` to write your
+custom Python code to interact with Airflow.
+
+However, you can also extend Airflow without directly integrating your Python code - via the
+:doc:`stable-rest-api-ref` and the :doc:`stable-cli-ref`. These methods allow you to interact with
+Airflow as an external system:
+
+Particularly, the doc:`stable-rest-api-ref` of Apache Airflow provides a number of methods for programmatic
+accessing and manipulating various aspects of the system. By using the public API, you can build
+custom tools and integrations with other systems, and automate certain aspects of the Airflow workflow.
+
+Overall, there are many different ways to extend the capabilities of Apache Airflow, and the specific
+approach that you choose will depend on your specific needs and requirements.
+By combining these different methods, you can create a powerful and flexible workflow management
+system that can help you automate and optimize your data pipelines.

Review Comment:
   What message do you want to tell here? It's obvious from the list above that there are multiple options. If not necessary -> remove.
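
   For the "Creating custom plugins" bullet quoted in the hunk above, a minimal, hedged plugin sketch follows; the plugin name and menu link are made up for illustration, and such a file is picked up from the configured plugins folder:

   ```python
   # Sketch: a plugin that registers one extra menu link in the webserver UI.
   from airflow.plugins_manager import AirflowPlugin

   wiki_link = {
       "name": "Team wiki",                # illustrative label
       "href": "https://example.com/wiki", # illustrative URL
       "category": "Links",
   }


   class MyCompanyPlugin(AirflowPlugin):
       name = "my_company_plugin"
       appbuilder_menu_items = [wiki_link]
   ```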



##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,49 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are a number of ways to extend the capabilities of Apache Airflow. Some common methods
+for extending Airflow include:
+
+* Writing :doc:`Custom Operators <howto/custom-operator>`_ : Operators are the building blocks of DAGs in
+  Airflow, and can be used to perform specific tasks within a DAG. By writing custom operators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Writing :doc:`Custom Decorators <howto/create-custom-decorator>`_ : Custom decorators are a way to
+  extend the functionality of the Airflow ``@task`` decorator. By writing custom decorators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Creating custom :doc:`plugins`: Airflow plugins are extensions to the core system that provide additional
+  features and functionality. By creating custom plugins, you can add new capabilities to Airflow,
+  such as adding custom UI components, creating custom macros, creating your own TimeTables,
+  adding Operator Extra Links, Listeners.
+
+* Writing custom :doc:`provider packages <apache-airflow-providers:index>`: Providers are the components
+  of Airflow that are responsible for connecting to and interacting with external services. By writing
+  custom providers, you can add support for new types of external services, such as databases or
+  cloud services, to Airflow.
+
+With all the above methods, you can use the :doc:`public-airflow-api` to write your
+custom Python code to interact with Airflow.
+
+However, you can also extend Airflow without directly integrating your Python code - via the
+:doc:`stable-rest-api-ref` and the :doc:`stable-cli-ref`. These methods allow you to interact with
+Airflow as an external system:
+
+Particularly, the doc:`stable-rest-api-ref` of Apache Airflow provides a number of methods for programmatic

Review Comment:
   ```suggestion
   Particularly, the doc:`stable-rest-api-ref` of Apache Airflow provides a number of methods for programmatically
   ```
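
   Related to the "Custom Decorators" bullet in the hunk above, a short TaskFlow sketch shows the ``@task`` decorator that custom decorators extend; the DAG and function names are illustrative only:

   ```python
   # Sketch of the TaskFlow API: plain functions become tasks, and return
   # values flow between them through XCom.
   from datetime import datetime

   from airflow.decorators import dag, task


   @dag(start_date=datetime(2022, 12, 1), schedule=None, catchup=False)
   def taskflow_example():
       @task
       def extract():
           return [1, 2, 3]

       @task
       def summarize(numbers):
           return sum(numbers)

       summarize(extract())


   taskflow_example()
   ```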



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?

Review Comment:
   ```suggestion
   Public interfaces
   ```



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.

Review Comment:
   That's nice, but I don't think it's necessary to mention for the reader. Would remove.
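
   To make the XCom and Variables bullets in the hunk above concrete, here is a hedged sketch; the variable key ``reporting_bucket`` is illustrative and would need to exist in your Airflow Variables (or fall back to the default shown):

   ```python
   # Sketch: Variable.get() plus implicit XCom passing between TaskFlow tasks.
   from datetime import datetime

   from airflow.decorators import dag, task
   from airflow.models import Variable


   @dag(start_date=datetime(2022, 12, 1), schedule=None, catchup=False)
   def variables_and_xcom():
       @task
       def produce_path():
           bucket = Variable.get("reporting_bucket", default_var="s3://example-bucket")
           return f"{bucket}/daily/report.csv"   # return value becomes an XCom

       @task
       def consume_path(path):
           print("Would write report to", path)  # value pulled from XCom implicitly

       consume_path(produce_path())


   variables_and_xcom()
   ```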



##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,49 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are a number of ways to extend the capabilities of Apache Airflow. Some common methods
+for extending Airflow include:
+
+* Writing :doc:`Custom Operators <howto/custom-operator>`_ : Operators are the building blocks of DAGs in
+  Airflow, and can be used to perform specific tasks within a DAG. By writing custom operators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Writing :doc:`Custom Decorators <howto/create-custom-decorator>`_ : Custom decorators are a way to
+  extend the functionality of the Airflow ``@task`` decorator. By writing custom decorators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Creating custom :doc:`plugins`: Airflow plugins are extensions to the core system that provide additional
+  features and functionality. By creating custom plugins, you can add new capabilities to Airflow,
+  such as adding custom UI components, creating custom macros, creating your own TimeTables,
+  adding Operator Extra Links, Listeners.
+
+* Writing custom :doc:`provider packages <apache-airflow-providers:index>`: Providers are the components
+  of Airflow that are responsible for connecting to and interacting with external services. By writing
+  custom providers, you can add support for new types of external services, such as databases or
+  cloud services, to Airflow.
+
+With all the above methods, you can use the :doc:`public-airflow-api` to write your
+custom Python code to interact with Airflow.
+
+However, you can also extend Airflow without directly integrating your Python code - via the
+:doc:`stable-rest-api-ref` and the :doc:`stable-cli-ref`. These methods allow you to interact with
+Airflow as an external system:
+
+Particularly, the doc:`stable-rest-api-ref` of Apache Airflow provides a number of methods for programmatic

Review Comment:
   Remove redundant text
   
   ```suggestion
   Particularly, the doc:`stable-rest-api-ref` provides a number of methods for programmatic
   ```



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.

Review Comment:
   Add links to REST API & CLI docs.
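
   Relatedly, the "Secret Managers" bullet in the hunk above refers to the secrets backend interface; a hedged sketch follows. The method names follow the Airflow 2.3+ ``BaseSecretsBackend`` interface, and the environment-variable naming scheme is purely illustrative:

   ```python
   # Sketch of a custom secrets backend resolving values from MYORG_* env vars.
   import os
   from typing import Optional

   from airflow.secrets import BaseSecretsBackend


   class EnvPrefixSecretsBackend(BaseSecretsBackend):
       """Resolves Variables and Connections from environment variables."""

       def get_variable(self, key: str) -> Optional[str]:
           return os.environ.get(f"MYORG_VAR_{key.upper()}")

       def get_conn_value(self, conn_id: str) -> Optional[str]:
           # Expected to return a connection URI (or JSON) string, or None.
           return os.environ.get(f"MYORG_CONN_{conn_id.upper()}")
   ```

   Such a class would then be wired in via the ``[secrets] backend`` configuration option, assuming the module is importable by Airflow.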



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow

Review Comment:
   Why quotes? Suggests to me it isn't public.



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers,
+and are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.
+
+Which classes and packages are a public API of Apache Airflow?
+==============================================================
+
+The specific classes and packages that are considered to be part of the public API of Apache
+Airflow can vary depending on the version of Airflow that you are using. In general, however, the
+public API of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+Some examples of the classes and packages that may be considered as the public API of Apache Airflow include:
+
+* The :class:`~airflow.models.dag.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* :class:`~airflow.models.connection.Connection`, :class:`~airflow.models.variable.Variable` and
+  :class:`~airflow.models.xcom.XCom` classes that are used to access and manage environment of execution of
+  the tasks and allow inter-task communication.
+* :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of
+  Airflow that are responsible for executing tasks. There are a number of different executor implementations
+  available in Airflow, each with its own unique characteristics and capabilities. You can develop your own
+  executors in the way that suits your needs.
+
+.. note::
+
+    Originally Executors had a number of internal details and coupling with internal Airflow core logic
+    that made it difficult to write your own executors. But as of
+    Airflow 2.6.0 and `AIP-51 <https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-51+Removing+Executor+Coupling+from+Core+Airflow>`_
+    this is no longer the case. The Executors are now fully decoupled and are a part of the public API.

Review Comment:
   IMO it doesn't add value to mention the history of executor development. Would remove
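
   For the ``BaseHook`` entry in the class list quoted above, a minimal, hedged hook sketch follows; the connection id and the returned "client" dict are illustrative, and only ``BaseHook.get_connection()`` is Airflow's own API here:

   ```python
   # Sketch of a custom hook wrapping a hypothetical external service.
   from airflow.hooks.base import BaseHook


   class MyServiceHook(BaseHook):
       """Thin wrapper around a hypothetical HTTP service client."""

       def __init__(self, my_conn_id: str = "my_service_default"):
           super().__init__()
           self.my_conn_id = my_conn_id

       def get_conn(self):
           conn = self.get_connection(self.my_conn_id)  # reads an Airflow Connection
           # A real hook would build and return an API client or session here.
           return {"host": conn.host, "login": conn.login, "password": conn.password}
   ```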



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.

Review Comment:
   What message do you want to tell here? If not necessary -> remove.



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The

Review Comment:
   ```suggestion
   and extending Airflow's capabilities by writing new executors, plugins, operators, and providers. The
   ```



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers,
+and are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.

Review Comment:
   IMO this section can be written in one word: "yes".
   
   There's already a list above of components that are considered public, therefore I'd remove this section and add it there.
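
The "stable REST API" that the section above refers to can be exercised with nothing more than an HTTP client. Below is a minimal sketch in Python, assuming a webserver reachable at http://localhost:8080 with the basic-auth API backend enabled; the URL, credentials and DAG id are placeholders:

```python
import requests

# Placeholder values - adjust to your own deployment.
BASE_URL = "http://localhost:8080/api/v1"
AUTH = ("admin", "admin")  # requires the basic_auth API backend

# List the DAGs Airflow knows about.
dags = requests.get(f"{BASE_URL}/dags", auth=AUTH).json()
for dag in dags["dags"]:
    print(dag["dag_id"], "paused:", dag["is_paused"])

# Trigger a run for a (placeholder) DAG id.
response = requests.post(
    f"{BASE_URL}/dags/example_dag/dagRuns",
    auth=AUTH,
    json={"conf": {}},
)
response.raise_for_status()
print(response.json()["dag_run_id"])
```

The same two operations are also available from the CLI (``airflow dags list`` and ``airflow dags trigger``), which is why the REST API and the CLI are usually mentioned together as the non-Python ways of integrating with Airflow.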



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.
+
+Which classes and packages are a public API of Apache Airflow?
+==============================================================
+
+The specific classes and packages that are considered to be part of the public API of Apache
+Airflow can vary depending on the version of Airflow that you are using. In general, however, the
+public API of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+Some examples of the classes and packages that may be considered as the public API of Apache Airflow include:

Review Comment:
   Is it not possible to create a complete list? I think the list of interfaces remains fairly static over versions.
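
Whatever final shape that list takes, the most common way users touch it is by subclassing ``BaseOperator``, as described in the "Operator APIs" bullet above. A minimal sketch of a custom operator, assuming only the documented ``BaseOperator`` contract (the operator name and the greeting logic are illustrative):

```python
from airflow.models.baseoperator import BaseOperator


class HelloOperator(BaseOperator):
    """Toy operator: the name and greeting logic are illustrative only."""

    def __init__(self, name: str, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def execute(self, context):
        # `context` carries runtime information such as the logical date.
        message = f"Hello {self.name}!"
        self.log.info(message)
        return message  # the return value is pushed to XCom automatically
```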



##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,49 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are a number of ways to extend the capabilities of Apache Airflow. Some common methods
+for extending Airflow include:
+
+* Writing :doc:`Custom Operators <howto/custom-operator>`_ : Operators are the building blocks of DAGs in
+  Airflow, and can be used to perform specific tasks within a DAG. By writing custom operators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Writing :doc:`Custom Decorators <howto/create-custom-decorator>`_ : Custom decorators are a way to
+  extend the functionality of the Airflow ``@task`` decorator. By writing custom decorators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Creating custom :doc:`plugins`: Airflow plugins are extensions to the core system that provide additional
+  features and functionality. By creating custom plugins, you can add new capabilities to Airflow,
+  such as adding custom UI components, creating custom macros, creating your own TimeTables,
+  adding Operator Extra Links, Listeners.
+
+* Writing custom :doc:`provider packages <apache-airflow-providers:index>`: Providers are the components
+  of Airflow that are responsible for connecting to and interacting with external services. By writing
+  custom providers, you can add support for new types of external services, such as databases or
+  cloud services, to Airflow.
+
+With all the above methods, you can use the :doc:`public-airflow-api` to write your
+custom Python code to interact with Airflow.
+
+However, you can also extend Airflow without directly integrating your Python code - via the
+:doc:`stable-rest-api-ref` and the :doc:`stable-cli-ref`. These methods allow you to interact with
+Airflow as an external system:

Review Comment:
   Would structure this doc starting with the three key points:
   
   1. Extend Python code
   2. REST API
   3. CLI
   
   And then go deeper into each of the three. This gives the reader an immediate understanding of what's possible instead of having to read through all the options of point nr 1 first.
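
For the "extend Python code" point, the smallest end-to-end example is a TaskFlow DAG built with the plain ``@task`` decorator that the "Custom Decorators" bullet above extends. A sketch, assuming Airflow 2.x with the TaskFlow API; the DAG id, schedule and the toy transformation are placeholders:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule=None, start_date=datetime(2022, 1, 1), catchup=False)
def example_taskflow():
    @task
    def extract():
        return [1, 2, 3]

    @task
    def total(values):
        return sum(values)

    total(extract())  # return values travel between tasks via XCom


example_taskflow()
```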



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.

Review Comment:
   Instead of long stories, I would keep it concise and create a bullet list of items that are (and, if needed, are not) part of the public APIs. That is much easier to read than having to distill the correct information from running text.
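
One item any such list would carry is that Variables and Connections are read through the public models rather than through the metadata database. A short sketch of what that looks like inside a task; the variable key and connection id are placeholders that must exist in your environment:

```python
from airflow.decorators import task
from airflow.hooks.base import BaseHook
from airflow.models import Variable


@task
def use_public_accessors():
    # Read an Airflow Variable through the public model, not via SQL.
    target_env = Variable.get("target_env", default_var="dev")

    # Resolve connection credentials through the public hook/connection API.
    conn = BaseHook.get_connection("my_external_service")
    print(target_env, conn.host, conn.login)
```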





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046552591


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The

Review Comment:
   yep. Better when without links.





[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1363215643

   One other point (which very much concurs with what I described in my proposal about executors) is actually something that @Taragolis noticed.
   
    https://github.com/apache/airflow/pull/28538#issuecomment-1363130466
   
   Previously we could not really move the Celery and Kubernetes executors to a provider, while with AIP-51 implemented in 2.6 we actually could. I am not sure we would want to (CeleryKubernetesExecutor is a bit problematic), but in theory, once AIP-51 is implemented, both the Celery and Kubernetes executors **could** be moved entirely into providers.
   
   Whether we should do it or not is another story (and a completely different discussion that we might have).
   
   This is what I mean by a truly decoupled and well-defined interface.
   
   @BasPH -> does that make sense?
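
To make the executor-decoupling point a bit more tangible, here is a very rough skeleton of a custom executor built on the public interface. Only the ``BaseExecutor`` base class and the ``execute_async``/``sync``/``success``/``fail`` names come from Airflow; running task commands synchronously in-process, as done here, is purely illustrative and not how a real executor should work:

```python
import subprocess

from airflow.executors.base_executor import BaseExecutor


class InlineExecutor(BaseExecutor):
    """Illustrative only: runs each task command synchronously, in-process."""

    def execute_async(self, key, command, queue=None, executor_config=None):
        # `command` is the fully formed `airflow tasks run ...` command
        # for the single task instance identified by `key`.
        self.log.info("Running %s", command)
        returncode = subprocess.call(command)
        if returncode == 0:
            self.success(key)
        else:
            self.fail(key)

    def sync(self):
        # A real executor reconciles previously submitted work here;
        # nothing is ever pending for this synchronous toy example.
        pass
```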
   
   




[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058628066


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow, but
+   only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on the built-in set of executors.
+   You could implement custom executors before Airflow 2.6, and a number of people succeeded in doing so,
+   but there were some hard-coded behaviours that favoured the built-in executors, and custom executors
+   could not provide the full functionality that the built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   And yes, I want to eventually add tests to make sure that all our providers are only using what we consider "public" and nothing else. This will be partially done as part of AIP-44 (for DB methods), but it can be extended to all Python calls made by the providers.
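
Provider code written against only the classes listed as public above looks, in miniature, like this custom hook sketch. The connection id is a placeholder, and the returned "client" is just a dict standing in for a real SDK object:

```python
from airflow.hooks.base import BaseHook


class MyServiceHook(BaseHook):
    """Illustrative hook; the "client" is just a dict of resolved settings."""

    def __init__(self, conn_id: str = "my_service_default"):
        super().__init__()
        self.conn_id = conn_id

    def get_conn(self):
        # Credentials are resolved through the public Connection model,
        # which may be backed by the DB, env vars or a secrets backend.
        conn = self.get_connection(self.conn_id)
        return {"host": conn.host, "login": conn.login, "password": conn.password}
```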





[GitHub] [airflow] o-nikolas commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
o-nikolas commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046515400


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.

Review Comment:
   > But I want to first document what we have rather than go into the rabbit hole of changing 100s of classes and methods to contain underscore.
   
   Sorry for the confusion: I was **not** saying that we should update those classes in this PR. I was just saying that we should add concrete wording here, in this documentation, stating that any class/method/function that is "private" is not included in the public API.
   It's an easy win and a useful tool we can use in the future (when we do start marking more things as private).
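
In Python terms that wording maps onto the usual underscore convention. A tiny, purely hypothetical illustration (neither the class nor the methods below are real Airflow objects):

```python
class SomeAirflowComponent:
    """Hypothetical class used only to illustrate the naming convention."""

    def run(self):
        # No leading underscore: a candidate for the public interface.
        return self._plan()

    def _plan(self):
        # Leading underscore: internal detail, may change without notice.
        return []
```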





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1050150795


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are decoupled and Airflow does not rely on built-in set of executors.

Review Comment:
   I updated the note now:
   
   >    There are a number of different executor implementations built into Airflow, each with its own unique
   characteristics and capabilities. The executor interface was available in earlier versions of Airflow, but
   only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on the built-in set of executors.
   You could implement custom executors before Airflow 2.6, and a number of people succeeded in doing so,
   but there were some hard-coded behaviours that favoured the built-in executors, and custom executors
   could not provide the full functionality that the built-in executors had.
   
   I hope it is much closer to the "truth" now - it mentions that it was possible to implement executors before and succeed with it (truth 1), that you could have hit limits because of some hard-coded behaviours (truth 2), and that in 2.6 you can rely on the public executor interface to be on par with the built-in executors (truth 3).
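
The hunk quoted in this thread also lists ``AirflowPlugin`` as the public entry point for UI pages, macros, timetables and listeners. A minimal sketch that registers a single custom macro; the plugin name and the macro itself are placeholders:

```python
from datetime import date

from airflow.plugins_manager import AirflowPlugin


def days_until(target: str) -> int:
    """Toy macro: days from today until an ISO-formatted date string."""
    return (date.fromisoformat(target) - date.today()).days


class MyCompanyPlugin(AirflowPlugin):
    # Airflow discovers this class from the plugins folder (or an entry point)
    # and exposes the macro in templates under the plugin's namespace
    # (e.g. ``macros.my_company_plugin.days_until``).
    name = "my_company_plugin"
    macros = [days_until]
```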
   





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1051382595


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.

Review Comment:
   Yes. Almost last remnant of the GPTChat.





[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1367581726

   Ok. I have brought it to a state where - I **hope** - we can start discussing what is missing. I rewrote the headers to reflect my current thinking on how our users could find this page useful, and also to make this page a really comprehensive and complete description of what users might expect from Airflow as a "library".
   
   This is the current proposal for the "structure" of the page:
   
   <img width="309" alt="Screenshot 2022-12-29 at 21 52 13" src="https://user-images.githubusercontent.com/595491/210010256-6c8a1f35-ad24-4004-bebe-68afa5adfee5.png">
   
   And I like that we can get rid of other pages that were "almost, but not quite entirely unlike what I wanted to achieve" (quoting my favourite writer).




[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1377861970

   Hey @BasPH - just to repeat it - is your "change requested" still valid?
   
   Is the current approach good enough for you, or do you think there is still something not OK with the approach I finally came up with after listening to the feedback and applying multiple ideas on how to address the issue, which had already been discussed on the devlist?

   I think we need to start openly communicating with our users about what they should expect and can rely on, and I think this is the first step to do it.

   Is there anything else that needs to be done here, do you think? I would really love to merge this to move forward, and it seems it has been stalled on your review.




[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1047336269


##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,48 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are several ways to extend the capabilities of Apache Airflow:
+
+* Writing `Custom Operators <howto/custom-operator>`_ : Operators are the building blocks of DAGs in
+  Airflow, and can be used to perform specific tasks within a DAG. By writing custom operators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Writing `Custom Decorators <howto/create-custom-decorator>`_ : Custom decorators are a way to
+  extend the functionality of the Airflow ``@task`` decorator. By writing custom decorators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Creating custom :doc:`plugins`: Airflow plugins are extensions to the core system that provide additional
+  features and functionality. By creating custom plugins, you can add new capabilities to Airflow,
+  such as adding custom UI components, creating custom macros, creating your own TimeTables,
+  adding Operator Extra Links, Listeners.
+
+* Writing custom `provider packages <apache-airflow-providers:index>`_: Providers are the components
+  of Airflow that are responsible for connecting to and interacting with external services. By writing
+  custom providers, you can add support for new types of external services, such as databases or
+  cloud services, to Airflow.

Review Comment:
   Yep. Good idea.





[GitHub] [airflow] BasPH commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
BasPH commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1048225125


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:

Review Comment:
   ```suggestion
   Everything not mentioned in this document should be considered as non-Public Interface. This includes:
   ```



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems

Review Comment:
   ```suggestion
     connections to external systems.
   ```
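
   To make the connection-management bullet concrete for readers, something like the following sketch could accompany it (the ``my_postgres`` connection id and the ``environment`` Variable are illustrative only, not defined anywhere in this PR):

   ```python
   # Hypothetical snippet - the conn_id and Variable key are examples, not part of this PR.
   from airflow.hooks.base import BaseHook
   from airflow.models import Variable

   # Resolves the connection via the configured secrets backends / metadata DB.
   conn = BaseHook.get_connection("my_postgres")
   print(conn.host, conn.login)

   # Variables work the same way and support a default value.
   environment = Variable.get("environment", default_var="dev")
   ```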



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.

Review Comment:
   ```suggestion
     `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.
   ```
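
   As an aside, the Decorators/TaskFlow item above is probably the easiest one to illustrate; a minimal sketch (Airflow 2.4+ style ``schedule`` argument, all names made up) could look like:

   ```python
   # Minimal TaskFlow sketch; the dag id (function name) and schedule are illustrative only.
   import datetime
   from airflow.decorators import dag, task

   @dag(start_date=datetime.datetime(2023, 1, 1), schedule=None)
   def example_taskflow():
       @task
       def add(x: int, y: int) -> int:
           # Return values travel between tasks via XCom.
           return x + y

       add(1, 2)

   example_taskflow()
   ```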



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below) is considered an
+  internal implementation detail and you should not assume it will be maintained
+  in a backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``async.io``).

Review Comment:
   ```suggestion
   * The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``asyncio``).
   ```
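
   A tiny illustration of what implementing that ``BaseTrigger`` contract looks like might also help here; this is only a sketch, and the dotted classpath in ``serialize()`` is a made-up example:

   ```python
   # Hypothetical trigger sketch against the public BaseTrigger interface (Airflow 2.2+).
   import asyncio
   from airflow.triggers.base import BaseTrigger, TriggerEvent

   class WaitSecondsTrigger(BaseTrigger):
       def __init__(self, seconds: float):
           super().__init__()
           self.seconds = seconds

       def serialize(self):
           # Triggers must be re-creatable by the triggerer process from this tuple.
           return ("my_package.triggers.WaitSecondsTrigger", {"seconds": self.seconds})

       async def run(self):
           await asyncio.sleep(self.seconds)
           yield TriggerEvent({"status": "done"})
   ```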



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below) is considered an
+  internal implementation detail and you should not assume it will be maintained
+  in a backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``async.io``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are decoupled and Airflow does not rely on built-in set of executors.

Review Comment:
   I'm not sure what the key message is here?
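
   If the intent is just "the ``BaseExecutor`` contract is public and you can implement your own", a rough sketch could make that clearer (hypothetical and deliberately simplified - a real executor would not block in ``execute_async``):

   ```python
   # Hypothetical executor sketch against the public BaseExecutor contract.
   import subprocess
   from airflow.executors.base_executor import BaseExecutor

   class BlockingSubprocessExecutor(BaseExecutor):
       def execute_async(self, key, command, queue=None, executor_config=None):
           # `command` is the fully rendered "airflow tasks run ..." command line.
           returncode = subprocess.call(command, close_fds=True)
           (self.success if returncode == 0 else self.fail)(key)

       def sync(self):
           # Nothing to poll; execute_async already waited for the task to finish.
           pass

       def end(self):
           self.heartbeat()
   ```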



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below) is considered an
+  internal implementation detail and you should not assume it will be maintained
+  in a backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``async.io``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events

Review Comment:
   ```suggestion
   * The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
   ```
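
   For illustration, a listener module written against the 2.5 listener API is quite small; this is a hedged sketch - the hook name and signature should be double-checked against the listeners docs:

   ```python
   # Hypothetical listener module for the Airflow 2.5 listener API.
   from airflow.listeners import hookimpl

   @hookimpl
   def on_task_instance_success(previous_state, task_instance, session):
       # React to a task instance reaching SUCCESS, e.g. emit a metric or notification.
       print(f"{task_instance.dag_id}.{task_instance.task_id} succeeded")
   ```

   Such a module is then registered through a plugin's ``listeners`` attribute.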





[GitHub] [airflow] BasPH commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
BasPH commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1066710063


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - the basic building
+  blocks of DAGs. This can be done by DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.

Review Comment:
   ```suggestion
     provide a reusable set of functionality for external services or applications Airflow integrates with.
   ```
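
   Since this bullet is about bundling Operators/Hooks, a tiny hook sketch might help readers see what gets bundled (the connection id and the notion of a "client" are assumptions for illustration, not part of this PR):

   ```python
   # Hypothetical hook sketch built on the public BaseHook class.
   from airflow.hooks.base import BaseHook

   class MyServiceHook(BaseHook):
       def __init__(self, my_conn_id: str = "my_service_default"):
           super().__init__()
           self.my_conn_id = my_conn_id

       def get_conn(self):
           conn = self.get_connection(self.my_conn_id)
           # Build whatever client the external service needs from conn.host / conn.login / conn.password.
           return conn
   ```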



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,

Review Comment:
   ```suggestion
   creating and managing DAGs (Directed Acyclic Graphs), managing tasks and their dependencies,
   ```



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - the basic building
+  blocks of DAGs. This can be done by DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied on to keep backwards-compatible signatures and behaviours within
+a MAJOR version of Airflow. On the other hand, classes and methods starting with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change at any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_, though its behaviour might change
+in details (such as output format and available flags), so if you want to rely on it in a programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS

Review Comment:
   ```suggestion
   DAGs
   ```
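
   A minimal example right under this heading would also help; something along these lines (dag_id and schedule are illustrative, Airflow 2.4+ style ``schedule`` argument):

   ```python
   # Minimal DAG sketch built on the public DAG class.
   import datetime
   from airflow.models.dag import DAG
   from airflow.operators.empty import EmptyOperator

   with DAG(
       dag_id="example_public_interface",
       start_date=datetime.datetime(2023, 1, 1),
       schedule="@daily",
   ) as dag:
       EmptyOperator(task_id="noop")
   ```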



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - the basic building
+  blocks of DAGs. This can be done by DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied on to keep backwards-compatible signatures and behaviours within
+a MAJOR version of Airflow. On the other hand, classes and methods starting with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change at any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_, though its behaviour might change
+in details (such as output format and available flags), so if you want to rely on it in a programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================

Review Comment:
   ```suggestion
   Using the Public Interface for DAG Authors
   ==========================================
   ```
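
   Under this section, a short sketch of a custom operator built on the public ``BaseOperator`` could anchor the discussion (the operator name and behaviour are made up):

   ```python
   # Hypothetical custom operator built on the public BaseOperator class.
   from airflow.models.baseoperator import BaseOperator

   class HelloOperator(BaseOperator):
       def __init__(self, name: str, **kwargs):
           super().__init__(**kwargs)
           self.name = name

       def execute(self, context):
           # The return value is pushed to XCom automatically.
           return f"Hello, {self.name}!"
   ```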



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - the basic building
+  blocks of DAGs. This can be done by DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied on to keep backwards-compatible signatures and behaviours within
+a MAJOR version of Airflow. On the other hand, classes and methods starting with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change at any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_, though its behaviour might change
+in details (such as output format and available flags), so if you want to rely on it in a programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs.
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that perform an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables`, :doc:`core-concepts/xcoms`
+
+Public Exceptions
+-----------------
+
+When writing the custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses Plugin mechanism as a way to extend Airflow platform capabilities. They allow to extend
+Airflow UI but also they are the way to expose the below customizations (Triggers, Timetables, Listeners).

Review Comment:
   ```suggestion
   Airflow UI but also they are the way to expose the below customizations (Triggers, Timetables, Listeners, etc.).
   ```
   
   Adding "etc." because there's more



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - the basic building
+  blocks of DAGs. This can be done by DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied on to keep backwards-compatible signatures and behaviours within
+a MAJOR version of Airflow. On the other hand, classes and methods starting with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change at any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_, though its behaviour might change
+in details (such as output format and available flags), so if you want to rely on it in a programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs.
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that perform an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables`, :doc:`core-concepts/xcoms`
+
+Public Exceptions
+-----------------
+
+When writing the custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses Plugin mechanism as a way to extend Airflow platform capabilities. They allow to extend
+Airflow UI but also they are the way to expose the below customizations (Triggers, Timetables, Listeners).
+Providers can also implement plugin endpoints and customize Airflow UI and the customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
+
+Here are the ways how Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow might be configured to use Triggers in order to implement ``asyncio`` compatible Deferrable Operators.
+All Triggers derive from :class:`~airflow.triggers.base.BaseTrigger`.
+
+Airflow has a set of Triggers that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/triggers/index
+
+You can read more about Triggers in :doc:`authoring-and-scheduling/deferring`.
+
+Timetables
+----------
+
+Custom timetable implementations provide Airflow's scheduler additional logic to
+schedule DAG runs in ways not possible with built-in schedule expressions.
+All Timetables derive from :class:`~airflow.timetables.base.Timetable`.
+
+Airflow has a set of Timetables that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :maxdepth: 1
+
+  _api/airflow/timetables/index
+
+You can read more about Timetables in :doc:`howto/timetable`.
+
+Listeners
+---------
+
+Listeners are the way that the Airflow platform as a whole can be extended to respond to DAG/Task lifecycle events.
+
+This is implemented via :class:`~airflow.listeners.listener.ListenerManager` class that provides hooks that
+can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+You can read more about Listeners in :doc:`administration-and-deployment/listeners`.
+
+Extra Links
+-----------
+
+Extra links are dynamic links that could be added to Airflow independently from custom Operators. Normally
+they can be defined by the Operators, but plugins allow to override the links on aa global level.

Review Comment:
   ```suggestion
   they can be defined by the Operators, but plugins allow you to override the links on a global level.
   ```
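
   A minimal extra-link sketch could illustrate the global override; this assumes the Airflow 2.3+ ``get_link`` signature, and the URL is an example only:

   ```python
   # Hypothetical extra link sketch against the public BaseOperatorLink class.
   from airflow.models.baseoperator import BaseOperatorLink

   class DocsLink(BaseOperatorLink):
       name = "Airflow Docs"

       def get_link(self, operator, *, ti_key):
           return "https://airflow.apache.org/docs/"
   ```

   Registered globally, such a link would go into a plugin's ``global_operator_extra_links`` list.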



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though its behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
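
For illustration, a minimal DAG definition might look like the following sketch (assuming Airflow 2.4+ for the ``schedule`` argument; the ids used here are made up):

```python
from datetime import datetime

from airflow.models.dag import DAG
from airflow.operators.empty import EmptyOperator

# A DAG is created by instantiating the DAG class, typically as a context manager.
with DAG(
    dag_id="example_minimal",           # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule=None,                      # run only when triggered manually
) as dag:
    # Tasks are created by instantiating Operators inside the DAG context.
    EmptyOperator(task_id="noop")
```
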
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that perform an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
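
As a rough sketch, a custom action operator only needs an ``execute`` method (the class and argument names here are made up):

```python
from airflow.models.baseoperator import BaseOperator


class HelloOperator(BaseOperator):
    """Hypothetical operator that logs and returns a greeting."""

    def __init__(self, name: str, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def execute(self, context):
        # execute() is called when the task instance runs.
        message = f"Hello {self.name}!"
        self.log.info(message)
        return message  # the returned value is pushed to XCom by default
```
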
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
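
A custom hook usually wraps a client for one external service and reads its credentials from an Airflow Connection, roughly like this (a sketch; the service and connection id are fictional):

```python
from airflow.hooks.base import BaseHook


class MyServiceHook(BaseHook):
    """Hypothetical hook for a fictional HTTP service."""

    def __init__(self, my_service_conn_id: str = "my_service_default"):
        super().__init__()
        self.my_service_conn_id = my_service_conn_id

    def get_conn(self):
        # Host and credentials come from the Connection configured in Airflow.
        conn = self.get_connection(self.my_service_conn_id)
        return {"base_url": f"https://{conn.host}", "token": conn.password}
```
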
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables`, :doc:`core-concepts/xcoms`
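
For example (a sketch; the variable and connection names are made up):

```python
from airflow.hooks.base import BaseHook
from airflow.models.variable import Variable

# Variables hold small pieces of configuration managed in Airflow.
api_version = Variable.get("my_api_version", default_var="v1")

# Connections hold credentials and configuration for external systems.
conn = BaseHook.get_connection("my_service_default")

# XComs are usually read inside a running task via the task instance, e.g.:
#   context["ti"].xcom_pull(task_ids="upstream_task", key="return_value")
```
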
+
+Public Exceptions
+-----------------
+
+When writing the custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
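
For example, a task might skip itself or fail with an explicit message (a sketch):

```python
from airflow.exceptions import AirflowException, AirflowSkipException


def process(records):
    if not records:
        # Mark the task instance as skipped rather than failed.
        raise AirflowSkipException("No records to process")
    if any(r.get("corrupt") for r in records):
        # Fail the task instance with a clear error.
        raise AirflowException("Corrupt records found")
```
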
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses Plugin mechanism as a way to extend Airflow platform capabilities. They allow to extend
+Airflow UI but also they are the way to expose the below customizations (Triggers, Timetables, Listeners).
+Providers can also implement plugin endpoints and customize Airflow UI and the customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
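
A plugin is a subclass of ``AirflowPlugin`` declaring the components it contributes, roughly as follows (a sketch; the name is made up and the lists would reference real classes or modules):

```python
from airflow.plugins_manager import AirflowPlugin


class MyCompanyPlugin(AirflowPlugin):
    name = "my_company_plugin"       # hypothetical plugin name

    # Each attribute registers one kind of extension (all optional):
    operator_extra_links = []        # instances of BaseOperatorLink subclasses
    timetables = []                  # Timetable subclasses
    listeners = []                   # modules (or objects) with listener hook implementations
```
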
+
+Here are the ways how Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow might be configured to use Triggers in order to implement ``asyncio``-compatible Deferrable Operators.
+All Triggers derive from :class:`~airflow.triggers.base.BaseTrigger`.
+
+Airflow has a set of Triggers that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/triggers/index
+
+You can read more about Triggers in :doc:`authoring-and-scheduling/deferring`.
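
A custom trigger implements ``serialize`` and an async ``run`` that yields events, for example (a sketch; the module path used in ``serialize`` is hypothetical):

```python
import asyncio

from airflow.triggers.base import BaseTrigger, TriggerEvent


class WaitSecondsTrigger(BaseTrigger):
    """Hypothetical trigger that fires once after a fixed number of seconds."""

    def __init__(self, seconds: int):
        super().__init__()
        self.seconds = seconds

    def serialize(self):
        # (classpath, kwargs) used to recreate the trigger in the triggerer process.
        return ("my_plugin.triggers.WaitSecondsTrigger", {"seconds": self.seconds})

    async def run(self):
        await asyncio.sleep(self.seconds)
        yield TriggerEvent({"status": "done"})
```
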
+
+Timetables
+----------
+
+Custom timetable implementations provide Airflow's scheduler additional logic to
+schedule DAG runs in ways not possible with built-in schedule expressions.
+All Timetables derive from :class:`~airflow.timetables.base.Timetable`.
+
+Airflow has a set of Timetables that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :maxdepth: 1
+
+  _api/airflow/timetables/index
+
+You can read more about Timetables in :doc:`howto/timetable`.
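
A minimal custom timetable implements ``infer_manual_data_interval`` and ``next_dagrun_info``, roughly as below (a simplified sketch; the fixed three-hour interval is arbitrary and catchup handling is omitted):

```python
from datetime import timedelta

from airflow.timetables.base import DagRunInfo, DataInterval, Timetable


class EveryThreeHoursTimetable(Timetable):
    """Hypothetical timetable scheduling one run every three hours."""

    def infer_manual_data_interval(self, *, run_after):
        # Data interval to use when a run is triggered manually.
        return DataInterval(start=run_after - timedelta(hours=3), end=run_after)

    def next_dagrun_info(self, *, last_automated_data_interval, restriction):
        if last_automated_data_interval is not None:
            start = last_automated_data_interval.end
        elif restriction.earliest is None:
            return None  # no start_date, never schedule automatically
        else:
            start = restriction.earliest
        if restriction.latest is not None and start > restriction.latest:
            return None  # past the DAG's end_date
        return DagRunInfo.interval(start=start, end=start + timedelta(hours=3))
```
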
+
+Listeners
+---------
+
+Listeners are the way that airflow platform as a whole can be extended to respond to DAG/Task lifecycle events.
+
+This is implemented via :class:`~airflow.listeners.listener.ListenerManager` class that provides hooks that
+can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+You can read more about Listeners in :doc:`administration-and-deployment/listeners`.
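
A listener is a module (or object) with ``@hookimpl``-decorated functions matching the listener hooks, registered via a plugin's ``listeners`` list, e.g. (a sketch assuming Airflow 2.5+; exact hook signatures may differ between versions):

```python
# my_listener.py - registered through an AirflowPlugin's ``listeners`` attribute
from airflow.listeners import hookimpl


@hookimpl
def on_task_instance_failed(previous_state, task_instance, session):
    # Called by the ListenerManager whenever a task instance fails.
    print(f"Task {task_instance.task_id} in DAG {task_instance.dag_id} failed")
```
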
+
+Extra Links
+-----------
+
+Extra links are dynamic links that could be added to Airflow independently from custom Operators. Normally
+they can be defined by the Operators, but plugins allow to override the links on aa global level.
+
+You can read more about the Extra Links in :doc:`/howto/define-extra-link`.
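
An extra link is a ``BaseOperatorLink`` subclass registered through a plugin, e.g. (a sketch; the target operator and URL are placeholders):

```python
from airflow.models.baseoperator import BaseOperatorLink
from airflow.operators.empty import EmptyOperator
from airflow.plugins_manager import AirflowPlugin


class DocsLink(BaseOperatorLink):
    name = "Documentation"        # label of the button shown in the UI
    operators = [EmptyOperator]   # operators this link is attached to

    def get_link(self, operator, *, ti_key):
        return "https://airflow.apache.org/docs/"


class ExtraLinkPlugin(AirflowPlugin):
    name = "extra_link_plugin"
    operator_extra_links = [DocsLink()]
```
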
+
+Using Public Interface to integrate with external services and applications
+===========================================================================
+
+In order to integrate Airflow with external services and applications you can develop Hooks and Operators,
+similarly to DAG Authors, but also you can also extend the core functionality of Airflow by integrating
+them with those external services. This can be done by just adding the extension classes to PYTHONPATH
+on your system,and configuring Airflow to use them (there is no need to use plugin mechanism). However,
+you can also package and release the Hooks, Operators and core extensions in the form of
+:doc:`provider packages <apache-airflow-providers:index>`. You can read more about core extensions
+delivered by providers via :doc:`provider packages <apache-airflow-providers:core-extensions/index>`.
+
+Executors
+---------
+
+Executors are the mechanism by which task instances get run. All executors are
+derived from :class:`~airflow.executors.base_executor.BaseExecutor`. There are a number of different
+executor implementations built-in Airflow, each with its own unique characteristics and capabilities.
+
+Airflow has a set of Executors that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/executors/index
+
+You can read more about executors in :doc:`core-concepts/executor/index`.
+
+.. versionadded:: 2.6
+
+  Executor interface was available in earlier version of Airflow but only as of version 2.6 executors are
+  fully decoupled and Airflow does not rely on built-in set of executors.
+  You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
+  of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
+  executors, and custom executors could not provide full functionality that built-in executors had.
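
Very roughly, a custom executor subclasses ``BaseExecutor`` and decides how queued task commands are run (a simplified, synchronous sketch; a real executor would submit work to a backend in ``execute_async`` and report task state in ``sync``):

```python
import subprocess

from airflow.executors.base_executor import BaseExecutor


class InlineExecutor(BaseExecutor):
    """Hypothetical executor running each task command in a local subprocess."""

    def execute_async(self, key, command, queue=None, executor_config=None):
        # 'command' is the CLI command Airflow built for this task instance.
        returncode = subprocess.call(command, close_fds=True)
        if returncode == 0:
            self.success(key)
        else:
            self.fail(key)

    def sync(self):
        # Nothing to poll; tasks complete inside execute_async.
        pass

    def end(self):
        # Flush any remaining state before the scheduler shuts the executor down.
        self.heartbeat()

    def terminate(self):
        pass
```
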
+
+Secrets Backends
+----------------
+
+Airflow might be configured to rely on secrets backends to retrieve
+:class:`~airflow.models.connection.Connection` and :class:`~airflow.models.Variables`.
+All secrets backends derive from :class:`~airflow.secrets.BaseSecretsBackend`.
+
+Airflow has a set of Secret Backends that are considered public. You are free to extend their functionality
+by extending them:

Review Comment:
   ```suggestion
   All Secrets Backend implementations are public. You can extend their functionality:
   ```
   
   This suggests there's also a set that's not considered public. Would reword for clarity. Also removing text duplication for clarity.



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though its behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that perform an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables`, :doc:`core-concepts/xcoms`
+
+Public Exceptions
+-----------------
+
+When writing the custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses Plugin mechanism as a way to extend Airflow platform capabilities. They allow to extend
+Airflow UI but also they are the way to expose the below customizations (Triggers, Timetables, Listeners).
+Providers can also implement plugin endpoints and customize Airflow UI and the customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
+
+Here are the ways how Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow might be configured to use Triggers in order to implement ``asyncio``-compatible Deferrable Operators.
+All Triggers derive from :class:`~airflow.triggers.base.BaseTrigger`.
+
+Airflow has a set of Triggers that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/triggers/index
+
+You can read more about Triggers in :doc:`authoring-and-scheduling/deferring`.
+
+Timetables
+----------
+
+Custom timetable implementations provide Airflow's scheduler additional logic to
+schedule DAG runs in ways not possible with built-in schedule expressions.
+All Timetables derive from :class:`~airflow.timetables.base.Timetable`.
+
+Airflow has a set of Timetables that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :maxdepth: 1
+
+  _api/airflow/timetables/index
+
+You can read more about Timetables in :doc:`howto/timetable`.
+
+Listeners
+---------
+
+Listeners are the way that airflow platform as a whole can be extended to respond to DAG/Task lifecycle events.
+
+This is implemented via :class:`~airflow.listeners.listener.ListenerManager` class that provides hooks that
+can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+You can read more about Listeners in :doc:`administration-and-deployment/listeners`.
+
+Extra Links
+-----------
+
+Extra links are dynamic links that could be added to Airflow independently from custom Operators. Normally
+they can be defined by the Operators, but plugins allow to override the links on aa global level.
+
+You can read more about the Extra Links in :doc:`/howto/define-extra-link`.
+
+Using Public Interface to integrate with external services and applications
+===========================================================================
+
+In order to integrate Airflow with external services and applications you can develop Hooks and Operators,
+similarly to DAG Authors, but also you can also extend the core functionality of Airflow by integrating
+them with those external services. This can be done by just adding the extension classes to PYTHONPATH
+on your system,and configuring Airflow to use them (there is no need to use plugin mechanism). However,
+you can also package and release the Hooks, Operators and core extensions in the form of
+:doc:`provider packages <apache-airflow-providers:index>`. You can read more about core extensions
+delivered by providers via :doc:`provider packages <apache-airflow-providers:core-extensions/index>`.

Review Comment:
   ```suggestion
   
   Tasks in Airflow can orchestrate external services via Hooks and Operators. The core functionality of Airflow (such as authentication) can also be extended to leverage external services.
   ```
   
   Struggling to understand this paragraph. It tells me multiple conflicting things:
   
   - "you can develop Hooks and Operators, similarly to DAG Authors, but also"... this reads as if "DAG authors" is a concept in Airflow
   - Do you intend to tell the reader "how to deploy custom code"? The header is "integrate with external services" so why explain that here?
   - Why is there no need to use the plugin mechanism?
   - I'm reading multiple ways to deploy code without explanation, why would a reader choose one over another?
   
   Reading all the text below, I actually think this paragraph intends to explain "you can communicate with external services in your tasks (which is what DAG authors do), but you can also make Airflow's core functionality leverage external services". I made a suggestion for the text, but feel free to correct me if this assumption was wrong.



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.

Review Comment:
   ```suggestion
   Airflow Interface and might change at any time.
   ```



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within

Review Comment:
   ```suggestion
   and functions mentioned below can be relied on to keep backwards-compatible signatures and behaviours within
   ```



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.

Review Comment:
   ```suggestion
   * When you are extending Airflow classes such as Operators and Hooks. This can be done by DAG authors to add missing functionality in their DAGs or by those who write reusable custom operators for other DAG authors.
   ```
   
   In the context of public interfaces, people are always subclassing an Airflow component. So "writing new (or extending existing)" doesn't make much sense here. Would reword.



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though its behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs

Review Comment:
   ```suggestion
   Airflow has a set of example DAGs that you can use to learn how to write DAGs
   ```
   
   "Example" isn't a name so would lowercase.



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known

Review Comment:
   ```suggestion
   MAJOR version of Airflow. On the other hand, classes and methods starting with ``_`` (also known
   ```



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though its behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that perform an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can

Review Comment:
   ```suggestion
   When writing or extending Hooks and Operators, DAG authors and developers can
   ```
   
   Remove text duplication for readability



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though its behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that perform an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables`, :doc:`core-concepts/xcoms`
+
+Public Exceptions
+-----------------
+
+When writing the custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses Plugin mechanism as a way to extend Airflow platform capabilities. They allow to extend

Review Comment:
   ```suggestion
   Airflow uses Plugin mechanism to extend Airflow platform capabilities. They allow to extend
   ```
   
   Remove obsolete text for readability



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though its behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that perform an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
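+A minimal sketch of a custom hook built on ``BaseHook`` (the connection id, class name and the
+returned client dictionary are hypothetical placeholders):
+
+.. code-block:: python
+
+    from airflow.hooks.base import BaseHook
+
+
+    class MyServiceHook(BaseHook):
+        """Toy hook that builds a client-like object from an Airflow connection."""
+
+        def __init__(self, my_conn_id: str = "my_service_default") -> None:
+            super().__init__()
+            self.my_conn_id = my_conn_id
+
+        def get_conn(self):
+            conn = self.get_connection(self.my_conn_id)
+            # A real hook would construct the client required by the external service here.
+            return {"host": conn.host, "login": conn.login, "password": conn.password}
+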
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables` and :doc:`core-concepts/xcoms`.
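+
+A short sketch of how these classes are typically used from inside a task callable (the
+connection id and variable key below are hypothetical):
+
+.. code-block:: python
+
+    from airflow.models.connection import Connection
+    from airflow.models.variable import Variable
+
+
+    def my_task_callable(**context):
+        conn = Connection.get_connection_from_secrets("my_service_default")
+        environment = Variable.get("environment", default_var="dev")
+        # Push a value to XCom so that downstream tasks can pull it.
+        context["ti"].xcom_push(key="target_host", value=conn.host)
+        return environment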
+
+Public Exceptions
+-----------------
+
+When writing custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
+
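+For example, a task can signal that it should be skipped rather than failed (a minimal sketch;
+the validation logic is illustrative only):
+
+.. code-block:: python
+
+    from airflow.exceptions import AirflowException, AirflowSkipException
+
+
+    def validate(records: list) -> None:
+        if not records:
+            # Marks the running task instance as skipped instead of failed.
+            raise AirflowSkipException("No records to process")
+        if any(record.get("status") == "error" for record in records):
+            raise AirflowException("Upstream data contains errors")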
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses the Plugin mechanism as a way to extend the Airflow platform's capabilities. Plugins allow you
+to extend the Airflow UI, and they are also the way to expose the customizations described below (Triggers,
+Timetables, Listeners). Providers can also implement plugin endpoints to customize the Airflow UI and expose
+such customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
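+
+A minimal sketch of a plugin class (the plugin name is hypothetical; real plugins declare
+views, listeners, timetables and similar extensions as class attributes):
+
+.. code-block:: python
+
+    from airflow.plugins_manager import AirflowPlugin
+
+
+    class MyCompanyPlugin(AirflowPlugin):
+        """Skeleton plugin; extension points are added as class attributes."""
+
+        name = "my_company_plugin"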
+
+Here are the ways in which Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow can be configured to use Triggers to implement ``asyncio`` compatible Deferrable Operators.
+All Triggers derive from :class:`~airflow.triggers.base.BaseTrigger`.
+
+Airflow has a set of Triggers that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/triggers/index
+
+You can read more about Triggers in :doc:`authoring-and-scheduling/deferring`.
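+
+A minimal sketch of a custom trigger (the class name and the module path used in
+``serialize`` are hypothetical):
+
+.. code-block:: python
+
+    import asyncio
+
+    from airflow.triggers.base import BaseTrigger, TriggerEvent
+
+
+    class WaitSecondsTrigger(BaseTrigger):
+        """Toy trigger that fires a single event after ``seconds`` have elapsed."""
+
+        def __init__(self, seconds: int):
+            super().__init__()
+            self.seconds = seconds
+
+        def serialize(self):
+            # The first element must be the importable path of this class.
+            return ("my_plugin.triggers.WaitSecondsTrigger", {"seconds": self.seconds})
+
+        async def run(self):
+            await asyncio.sleep(self.seconds)
+            yield TriggerEvent({"status": "done"})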
+
+Timetables
+----------
+
+Custom timetable implementations provide Airflow's scheduler additional logic to
+schedule DAG runs in ways not possible with built-in schedule expressions.
+All Timetables derive from :class:`~airflow.timetables.base.Timetable`.
+
+Airflow has a set of Timetables that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :maxdepth: 1
+
+  _api/airflow/timetables/index
+
+You can read more about Timetables in :doc:`howto/timetable`.
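+
+A minimal sketch of registering a custom timetable through a plugin (``AfterWorkdayTimetable``
+and its import path are hypothetical, e.g. a class written by following the timetable how-to):
+
+.. code-block:: python
+
+    from airflow.plugins_manager import AirflowPlugin
+
+    from my_company.timetables import AfterWorkdayTimetable  # hypothetical module
+
+
+    class WorkdayTimetablePlugin(AirflowPlugin):
+        """Registers the custom timetable so that DAGs can use it."""
+
+        name = "workday_timetable_plugin"
+        timetables = [AfterWorkdayTimetable]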
+
+Listeners
+---------
+
+Listeners are the way that airflow platform as a whole can be extended to respond to DAG/Task lifecycle events.

Review Comment:
   ```suggestion
   Listeners enable you to respond to DAG/Task lifecycle events.
   ```
   
   Vague text. Rewording for clarity.



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though it's behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that performs an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables`, :doc:`core-concepts/xcoms`
+
+Public Exceptions
+-----------------
+
+When writing the custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses Plugin mechanism as a way to extend Airflow platform capabilities. They allow to extend
+Airflow UI but also they are the way to expose the below customizations (Triggers, Timetables, Listeners).
+Providers can also implement plugin endpoints and customize Airflow UI and the customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
+
+Here are the ways how Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow might be configure to use Triggers in order to implement ``asyncio`` compatible Deferrable Operators.

Review Comment:
   ```suggestion
   Airflow uses Triggers to implement ``asyncio`` compatible Deferrable Operators.
   ```
   
   "might be" is very vague. Rewording for clarity.



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though it's behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that performs an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables`, :doc:`core-concepts/xcoms`
+
+Public Exceptions
+-----------------
+
+When writing the custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses Plugin mechanism as a way to extend Airflow platform capabilities. They allow to extend
+Airflow UI but also they are the way to expose the below customizations (Triggers, Timetables, Listeners).
+Providers can also implement plugin endpoints and customize Airflow UI and the customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
+
+Here are the ways how Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow might be configure to use Triggers in order to implement ``asyncio`` compatible Deferrable Operators.
+All Triggers derive from :class:`~airflow.triggers.base.BaseTrigger`.
+
+Airflow has a set of Triggers that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/triggers/index
+
+You can read more about Triggers in :doc:`authoring-and-scheduling/deferring`.
+
+Timetables
+----------
+
+Custom timetable implementations provide Airflow's scheduler additional logic to
+schedule DAG runs in ways not possible with built-in schedule expressions.
+All Timetables derive from :class:`~airflow.timetables.base.Timetable`.
+
+Airflow has a set of Timetables that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :maxdepth: 1
+
+  _api/airflow/timetables/index
+
+You can read more about Timetables in :doc:`howto/timetable`.
+
+Listeners
+---------
+
+Listeners are the way that airflow platform as a whole can be extended to respond to DAG/Task lifecycle events.
+
+This is implemented via :class:`~airflow.listeners.listener.ListenerManager` class that provides hooks that
+can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+You can read more about Listeners in :doc:`administration-and-deployment/listeners`.
+
+Extra Links
+-----------
+
+Extra links are dynamic links that could be added to Airflow independently from custom Operators. Normally
+they can be defined by the Operators, but plugins allow to override the links on aa global level.
+
+You can read more about the Extra Links in :doc:`/howto/define-extra-link`.
+
+Using Public Interface to integrate with external services and applications
+===========================================================================
+
+In order to integrate Airflow with external services and applications you can develop Hooks and Operators,
+similarly to DAG Authors, but also you can also extend the core functionality of Airflow by integrating
+them with those external services. This can be done by just adding the extension classes to PYTHONPATH
+on your system,and configuring Airflow to use them (there is no need to use plugin mechanism). However,
+you can also package and release the Hooks, Operators and core extensions in the form of
+:doc:`provider packages <apache-airflow-providers:index>`. You can read more about core extensions
+delivered by providers via :doc:`provider packages <apache-airflow-providers:core-extensions/index>`.
+
+Executors
+---------
+
+Executors are the mechanism by which task instances get run. All executors are
+derived from :class:`~airflow.executors.base_executor.BaseExecutor`. There are a number of different

Review Comment:
   ```suggestion
   derived from :class:`~airflow.executors.base_executor.BaseExecutor`. There are several
   ```
   
   "A number of" generally refers to a large number. Would use "several" since there's only a small number of executor implementations.



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though it's behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that performs an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables`, :doc:`core-concepts/xcoms`
+
+Public Exceptions
+-----------------
+
+When writing the custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses Plugin mechanism as a way to extend Airflow platform capabilities. They allow to extend
+Airflow UI but also they are the way to expose the below customizations (Triggers, Timetables, Listeners).
+Providers can also implement plugin endpoints and customize Airflow UI and the customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
+
+Here are the ways how Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow might be configure to use Triggers in order to implement ``asyncio`` compatible Deferrable Operators.
+All Triggers derive from :class:`~airflow.triggers.base.BaseTrigger`.
+
+Airflow has a set of Triggers that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/triggers/index
+
+You can read more about Triggers in :doc:`authoring-and-scheduling/deferring`.
+
+Timetables
+----------
+
+Custom timetable implementations provide Airflow's scheduler additional logic to
+schedule DAG runs in ways not possible with built-in schedule expressions.
+All Timetables derive from :class:`~airflow.timetables.base.Timetable`.
+
+Airflow has a set of Timetables that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :maxdepth: 1
+
+  _api/airflow/timetables/index
+
+You can read more about Timetables in :doc:`howto/timetable`.
+
+Listeners
+---------
+
+Listeners are the way that airflow platform as a whole can be extended to respond to DAG/Task lifecycle events.
+
+This is implemented via :class:`~airflow.listeners.listener.ListenerManager` class that provides hooks that
+can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+You can read more about Listeners in :doc:`administration-and-deployment/listeners`.
+
+Extra Links
+-----------
+
+Extra links are dynamic links that could be added to Airflow independently from custom Operators. Normally
+they can be defined by the Operators, but plugins allow to override the links on aa global level.
+
+You can read more about the Extra Links in :doc:`/howto/define-extra-link`.
+
+Using Public Interface to integrate with external services and applications
+===========================================================================
+
+In order to integrate Airflow with external services and applications you can develop Hooks and Operators,
+similarly to DAG Authors, but also you can also extend the core functionality of Airflow by integrating
+them with those external services. This can be done by just adding the extension classes to PYTHONPATH
+on your system,and configuring Airflow to use them (there is no need to use plugin mechanism). However,
+you can also package and release the Hooks, Operators and core extensions in the form of
+:doc:`provider packages <apache-airflow-providers:index>`. You can read more about core extensions
+delivered by providers via :doc:`provider packages <apache-airflow-providers:core-extensions/index>`.
+
+Executors
+---------
+
+Executors are the mechanism by which task instances get run. All executors are
+derived from :class:`~airflow.executors.base_executor.BaseExecutor`. There are a number of different
+executor implementations built-in Airflow, each with its own unique characteristics and capabilities.
+
+Airflow has a set of Executors that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/executors/index
+
+You can read more about executors in :doc:`core-concepts/executor/index`.
+
+.. versionadded:: 2.6
+
+  Executor interface was available in earlier version of Airflow but only as of version 2.6 executors are
+  fully decoupled and Airflow does not rely on built-in set of executors.
+  You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
+  of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
+  executors, and custom executors could not provide full functionality that built-in executors had.
+
+Secrets Backends
+----------------
+
+Airflow might be configured to rely on secrets backends to retrieve

Review Comment:
   ```suggestion
   Airflow can be configured to rely on secrets backends to retrieve
   ```
   
   "might" is asking for permission. Would use "can" to express to the capability of implementing a secrets backend.



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though it's behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that performs an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables`, :doc:`core-concepts/xcoms`
+
+Public Exceptions
+-----------------
+
+When writing the custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses Plugin mechanism as a way to extend Airflow platform capabilities. They allow to extend
+Airflow UI but also they are the way to expose the below customizations (Triggers, Timetables, Listeners).
+Providers can also implement plugin endpoints and customize Airflow UI and the customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
+
+Here are the ways how Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow might be configure to use Triggers in order to implement ``asyncio`` compatible Deferrable Operators.
+All Triggers derive from :class:`~airflow.triggers.base.BaseTrigger`.
+
+Airflow has a set of Triggers that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/triggers/index
+
+You can read more about Triggers in :doc:`authoring-and-scheduling/deferring`.
+
+Timetables
+----------
+
+Custom timetable implementations provide Airflow's scheduler additional logic to
+schedule DAG runs in ways not possible with built-in schedule expressions.
+All Timetables derive from :class:`~airflow.timetables.base.Timetable`.
+
+Airflow has a set of Timetables that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :maxdepth: 1
+
+  _api/airflow/timetables/index
+
+You can read more about Timetables in :doc:`howto/timetable`.
+
+Listeners
+---------
+
+Listeners are the way that airflow platform as a whole can be extended to respond to DAG/Task lifecycle events.
+
+This is implemented via :class:`~airflow.listeners.listener.ListenerManager` class that provides hooks that
+can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+You can read more about Listeners in :doc:`administration-and-deployment/listeners`.
+
+Extra Links
+-----------
+
+Extra links are dynamic links that could be added to Airflow independently from custom Operators. Normally
+they can be defined by the Operators, but plugins allow to override the links on aa global level.
+
+You can read more about the Extra Links in :doc:`/howto/define-extra-link`.
+
+Using Public Interface to integrate with external services and applications
+===========================================================================
+
+In order to integrate Airflow with external services and applications you can develop Hooks and Operators,
+similarly to DAG Authors, but also you can also extend the core functionality of Airflow by integrating
+them with those external services. This can be done by just adding the extension classes to PYTHONPATH
+on your system,and configuring Airflow to use them (there is no need to use plugin mechanism). However,
+you can also package and release the Hooks, Operators and core extensions in the form of
+:doc:`provider packages <apache-airflow-providers:index>`. You can read more about core extensions
+delivered by providers via :doc:`provider packages <apache-airflow-providers:core-extensions/index>`.
+
+Executors
+---------
+
+Executors are the mechanism by which task instances get run. All executors are
+derived from :class:`~airflow.executors.base_executor.BaseExecutor`. There are a number of different
+executor implementations built-in Airflow, each with its own unique characteristics and capabilities.
+
+Airflow has a set of Executors that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/executors/index
+
+You can read more about executors in :doc:`core-concepts/executor/index`.
+
+.. versionadded:: 2.6
+
+  Executor interface was available in earlier version of Airflow but only as of version 2.6 executors are
+  fully decoupled and Airflow does not rely on built-in set of executors.
+  You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
+  of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
+  executors, and custom executors could not provide full functionality that built-in executors had.
+
+Secrets Backends
+----------------
+
+Airflow might be configured to rely on secrets backends to retrieve
+:class:`~airflow.models.connection.Connection` and :class:`~airflow.models.Variables`.
+All secrets backends derive from :class:`~airflow.secrets.BaseSecretsBackend`.
+
+Airflow has a set of Secret Backends that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/secrets/index
+
+You can read more about Secret Backends in :doc:`administration-and-deployment/security/secrets/secrets-backend/index`.
+You can also find all the available Secrets Backends implemented in community providers
+in :doc:`apache-airflow-providers:core-extensions/secrets-backends`.
+
+Authentication Backends
+-----------------------
+
+Authentication backends can extend the way how Airflow authentication mechanism works. You can find out more
+about authentication in :doc:`apache-airflow-providers:core-extensions/auth-backends` that also shows available
+Authentication backends implemented in the community providers.
+
+Connections
+-----------
+
+When crating Hooks, you can add custom Connections. You can read more

Review Comment:
   ```suggestion
   When creating Hooks, you can add custom Connections. You can read more
   ```



##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using Airflow Public Interfaces is needed when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - basic building
+  blocks of DAGs and can be done DAG Authors to add missing functionality in their DAGs or by those who write
+  reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide reusable set of functionality for external service or application Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied to keep backwards-compatible signatures and behaviours within
+MAJOR version of Airflow. On the other hand, classes and method started with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_ though it's behaviour might change
+in details (such as output format and available flags) so if you want to rely on those in programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that performs an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about the operators in :doc:`core-concepts/operators`, :doc:`core-concepts/sensors`.
+Also you can learn how to write a custom operator in :doc:`howto/custom-operator`.
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables`, :doc:`core-concepts/xcoms`
+
+Public Exceptions
+-----------------
+
+When writing the custom Operators and Hooks, you can handle and raise public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses Plugin mechanism as a way to extend Airflow platform capabilities. They allow to extend
+Airflow UI but also they are the way to expose the below customizations (Triggers, Timetables, Listeners).
+Providers can also implement plugin endpoints and customize Airflow UI and the customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
+
+Here are the ways how Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow might be configure to use Triggers in order to implement ``asyncio`` compatible Deferrable Operators.
+All Triggers derive from :class:`~airflow.triggers.base.BaseTrigger`.
+
+Airflow has a set of Triggers that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/triggers/index
+
+You can read more about Triggers in :doc:`authoring-and-scheduling/deferring`.
+
+Timetables
+----------
+
+Custom timetable implementations provide Airflow's scheduler additional logic to
+schedule DAG runs in ways not possible with built-in schedule expressions.
+All Timetables derive from :class:`~airflow.timetables.base.Timetable`.
+
+Airflow has a set of Timetables that are considered public. You are free to extend their functionality
+by extending them:
+
+.. toctree::
+  :includehidden:
+  :maxdepth: 1
+
+  _api/airflow/timetables/index
+
+You can read more about Timetables in :doc:`howto/timetable`.
+
+Listeners
+---------
+
+Listeners are the way that airflow platform as a whole can be extended to respond to DAG/Task lifecycle events.
+
+This is implemented via :class:`~airflow.listeners.listener.ListenerManager` class that provides hooks that
+can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+You can read more about Listeners in :doc:`administration-and-deployment/listeners`.
+
+Extra Links
+-----------
+
+Extra links are dynamic links that could be added to Airflow independently from custom Operators. Normally
+they can be defined by the Operators, but plugins allow to override the links on aa global level.
+
+You can read more about the Extra Links in :doc:`/howto/define-extra-link`.
+
+Using Public Interface to integrate with external services and applications
+===========================================================================
+
+In order to integrate Airflow with external services and applications you can develop Hooks and Operators,
+similarly to DAG Authors, but you can also extend the core functionality of Airflow by integrating
+it with those external services. This can be done by simply adding the extension classes to PYTHONPATH
+on your system and configuring Airflow to use them (there is no need to use the plugin mechanism). However,
+you can also package and release the Hooks, Operators and core extensions in the form of
+:doc:`provider packages <apache-airflow-providers:index>`. You can read more about core extensions
+delivered by providers via :doc:`provider packages <apache-airflow-providers:core-extensions/index>`.
+
+Executors
+---------
+
+Executors are the mechanism by which task instances get run. All executors are
+derived from :class:`~airflow.executors.base_executor.BaseExecutor`. There are a number of different
+executor implementations built into Airflow, each with its own unique characteristics and capabilities.
+
+Airflow has a set of Executors that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/executors/index
+
+You can read more about executors in :doc:`core-concepts/executor/index`.
+
+.. versionadded:: 2.6
+
+  The Executor interface was available in earlier versions of Airflow, but only as of version 2.6 are
+  executors fully decoupled, so Airflow no longer relies on a built-in set of executors.
+  You could implement Executors before Airflow 2.6, and a number of people succeeded in doing so,
+  but there were some hard-coded behaviours that favoured the built-in executors, and custom executors
+  could not provide the full functionality that built-in executors had.
+
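+To give a rough idea only (this is a toy sketch, not a production-ready executor, and the class name is
+made up), a custom executor mainly implements ``execute_async``, ``sync`` and ``end``:
+
+.. code-block:: python
+
+    from __future__ import annotations
+
+    import subprocess
+
+    from airflow.executors.base_executor import BaseExecutor
+
+
+    class LocalSubprocessExecutor(BaseExecutor):
+        """Runs every queued task command in a local subprocess."""
+
+        def __init__(self, parallelism: int = 4):
+            super().__init__(parallelism=parallelism)
+            self._processes = {}
+
+        def execute_async(self, key, command, queue=None, executor_config=None):
+            # ``command`` is the ``airflow tasks run ...`` argument list built by the scheduler.
+            self._processes[key] = subprocess.Popen(command)
+
+        def sync(self):
+            # Called on every heartbeat; report finished tasks back to Airflow.
+            for key, proc in list(self._processes.items()):
+                return_code = proc.poll()
+                if return_code is None:
+                    continue
+                self.success(key) if return_code == 0 else self.fail(key)
+                del self._processes[key]
+
+        def end(self):
+            # Wait for outstanding processes before shutting down.
+            for proc in self._processes.values():
+                proc.wait()
+            self.sync()
+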
+Secrets Backends
+----------------
+
+Airflow can be configured to rely on secrets backends to retrieve
+:class:`~airflow.models.connection.Connection` and :class:`~airflow.models.variable.Variable` values.
+All secrets backends derive from :class:`~airflow.secrets.BaseSecretsBackend`.
+
+Airflow has a set of Secret Backends that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/secrets/index
+
+You can read more about Secret Backends in :doc:`administration-and-deployment/security/secrets/secrets-backend/index`.
+You can also find all the available Secrets Backends implemented in community providers
+in :doc:`apache-airflow-providers:core-extensions/secrets-backends`.
+
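+As a purely illustrative sketch (the in-memory "store" below stands in for a real secrets service),
+a custom secrets backend overrides the lookup methods of :class:`~airflow.secrets.BaseSecretsBackend`:
+
+.. code-block:: python
+
+    from __future__ import annotations
+
+    from airflow.secrets import BaseSecretsBackend
+
+    # Fake data standing in for an external secrets service.
+    _FAKE_STORE = {
+        "connections": {"my_postgres": "postgresql://user:pass@host:5432/mydb"},
+        "variables": {"environment": "staging"},
+    }
+
+
+    class InMemorySecretsBackend(BaseSecretsBackend):
+        """Sketch of a secrets backend; replace the lookups with real service calls."""
+
+        def get_conn_value(self, conn_id: str) -> str | None:
+            # Return the connection representation (URI or JSON), or None to fall
+            # through to the next configured secrets source.
+            return _FAKE_STORE["connections"].get(conn_id)
+
+        def get_variable(self, key: str) -> str | None:
+            return _FAKE_STORE["variables"].get(key)
+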
+Authentication Backends
+-----------------------
+
+Authentication backends can extend the way the Airflow authentication mechanism works. You can find out more
+about authentication in :doc:`apache-airflow-providers:core-extensions/auth-backends`, which also lists the
+available Authentication backends implemented in the community providers.
+
+Connections
+-----------
+
+When creating Hooks, you can add custom Connections. You can read more
+about connections in :doc:`apache-airflow-providers:core-extensions/connections`, which lists the
+Connection types implemented in the community providers.
+
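+For illustration (the connection type and hook name are hypothetical), a custom Hook can declare its own
+connection type and customize how the connection form is displayed:
+
+.. code-block:: python
+
+    from airflow.hooks.base import BaseHook
+
+
+    class MyServiceHook(BaseHook):
+        """Sketch of a hook exposing a custom ``my_service`` connection type."""
+
+        conn_name_attr = "my_service_conn_id"
+        default_conn_name = "my_service_default"
+        conn_type = "my_service"
+        hook_name = "My Service"
+
+        def __init__(self, my_service_conn_id: str = default_conn_name):
+            super().__init__()
+            self.conn_id = my_service_conn_id
+
+        @classmethod
+        def get_ui_field_behaviour(cls) -> dict:
+            # Hide fields that do not apply and relabel the ones that do.
+            return {
+                "hidden_fields": ["schema", "extra"],
+                "relabeling": {"host": "Service URL"},
+            }
+
+        def get_conn(self):
+            conn = self.get_connection(self.conn_id)
+            # Build and return a client for the external service here.
+            return conn
+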
+Extra Links
+-----------
+
+When creating Hooks, you can add custom Extra Links that are displayed when the tasks are run.
+You can find out more about extra links in :doc:`apache-airflow-providers:core-extensions/extra-links`
+that also shows available Authentication backends implemented in the community providers.

Review Comment:
   Mentioning "Authentication backends" here doesn't seem correct.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046532326


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers,
+and are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.

Review Comment:
   For me, one thing I've always believed when writing documents like this is to state explicitly what is NOT included. And for me this is always one of the most important parts of such a document.
   
   This cuts down ALL discussions about the subject. And this saves time and explanation when you are sending people to read such a document.
   
   While I agree this one is pretty vague (written by GPT Chat mostly), I would argue that we MUST have a chapter where we explicitly write "DATABASE IS NOT PUBLIC". If you do not write it here, people might assume it is - and even if this is not logical, they will argue it - sometimes vehemently: "Well yeah, that DB thing is not included here, but we've always been using it and it's pretty obvious that we can add new columns and fields etc. etc.". I had such a discussion exactly today on Slack. The user there did not ask "can I extend the DB of Airflow" - but "What's the recommended way to do it", assuming they can and should do it if they want: https://apache-airflow.slack.com/archives/CSS36QQS1/p1670836953319909
   
   I want to be very clear about the DB and a few other cases that people are likely to assume can be "public interface".
   
   The thing is that if you explicitly mention that something is *not included*, there is no discussion. Whatsoever. Full stop. Don't even mention it. It's explicitly written that it is not included and it is not. Discussion is ended at the moment it started.
   
   There is a very good reason why in legal docs you write "all the other things not covered by this, including but not limited to X and Y" ... 
   
   I will change it to be more specific, but I want to say explicitly that some things people are likely to assume to be part of the public interface are NOT.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046546705


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers,
+and are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.
+
+Which classes and packages are a public API of Apache Airflow?
+==============================================================
+
+The specific classes and packages that are considered to be part of the public API of Apache
+Airflow can vary depending on the version of Airflow that you are using. In general, however, the
+public API of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+Some examples of the classes and packages that may be considered as the public API of Apache Airflow include:
+
+* The :class:`~airflow.models.dag.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* :class:`~airflow.models.connection.Connection`, :class:`~airflow.models.variable.Variable` and
+  :class:`~airflow.models.xcom.XCom` classes that are used to access and manage the execution environment of
+  the tasks and to perform inter-task communication.
+* :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of
+  Airflow that are responsible for executing tasks. There are a number of different executor implementations
+  available in Airflow, each with its own unique characteristics and capabilities. You can develop your own
+  executors in the way that suits your needs.
+
+.. note::
+
+    Originally Executors had a number of internal details and coupling with internal Airflow core logic
+    that made it difficult to write your own executors. But as of
+    Airflow 2.6.0 and `AIP-51 <https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-51+Removing+Executor+Coupling+from+Core+Airflow>`_
+    this is no longer the case. The Executors are now fully decoupled and are a part of the public API.

Review Comment:
   I think it's the equivalent of an "Added in 2.X.0" note.
   
   People seeing this document will extrapolate it to previous versions. And I think they have the right to do it because - despite claiming it works, we miserably failed in fulfilling our promise there. We knew people were using and developing custom executors and we suggested they can do it, yet we did not make it "truly" possible. And if people try to do it for earlier versions they might fail without knowing why.
   
   While I agree the link and note are a bit "over-the-top", I will just leave
   
   ```
   .. versionadded:: 2.6
   ```
   
   Because this is the version where it will actually be possible to get your custom executor to "really" work in a decoupled way.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk merged pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk merged PR #28300:
URL: https://github.com/apache/airflow/pull/28300


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1378555534

   > Did another review. Starting to look nice and clear!
   
   Indeed. I like how this change now consolidates a few places where we tried to communicate what should be considered as "public" into one dedicated place, and I like how concise it is after all those corrections and feedback.
   
   I updated it now:
   
   * applied your corrections Bas (I actually made you co-author @BasPH if you do not mind, as I think there are enormous improvements in clarity from you here :)
   * I re-added the links to providers with just a "read more" - similarly to the other cases - I think this is consistent with all other parts and allows the reader to dig deeper if they want.
   
   I hope we will be able to get it in soon :)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1367257938

   Probably it can still be shortened and cleaned up a bit, but I think this way it is going to be precisely what I wanted to have - one place where we can tell people "If you want to do something with Airflow to extend it - here are all the possible ways you can do it when you want to make sure backwards compatibility is maintained".


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058655785


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The Executor interface was available in earlier versions of Airflow, but
+   only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on a built-in set of executors.
+   You could implement Executors before Airflow 2.6, and a number of people succeeded in doing so,
+   but there were some hard-coded behaviours that favoured the built-in executors, and custom executors
+   could not provide the full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   > It's also important to clarify that some methods are intentionally "private". E.g. in the operator, it might have helper methods. But these methods are more implementation detail right? If users subclass, though, they may use and depend on these. So what do we do? Ideally, from maintainer perspective, we want the overall behavior of the operator to be subject to backcompat but not any of its methods. Currently the convention is, i believe, all methods are assumed public unless prefix with underscore / marked meta private. WDYT?
   
   For providers it's far easier because there is no "internal" sharing between multiple modules. Each of them is really either "standalone" or uses "public" classes from other providers. And with common.sql we even introduced stubgen to generate the actual "public interface" that MyPy can surface as errors. And we even added friction (the necessity of regenerating the stubs) when someone changes the API by accident.
   
   > One problem with the public / private distinction is... from tool perspective ... even using _my_func across modules is sorta forbidden. But there is a difference between the internal "private" and the user-facing "private". We need to be able to write helper code that can be used across modules (internal public), but not have it be user-facing public.
   
   This is a very accurate assessment.
   
   That's why I proposed to allowlist and be explicit about everything that is public. And from the tool perspective, one thing is important (what we did for common.sql) - to keep the list of all public methods, and whenever something there is *removed* or *updated*, fail the build and make it an explicit action (friction) for the one who modifies it accidentally. We can also add a tool for whoever uses Airflow as a library to check if they are not using something not "intentionally" public (we've done that for common.sql - via MyPy stubs).
   
   But we cannot  (yet) do any automation in Airflow, because first we need to agree and document what is and what is not public. We need a good list of things that should be "public" - so that we can (eventually) add friction to removing or updating those and possibly eventually add a tool for others to verify if they are not using something they should not. 
   
   This doc here is mostly to start the discussion and work-out a good set of classes/packages that we should consider "really public". For now in a document, eventually "guarded automatically".
   
   BTW. As I mentioned in many places, we are not able to get it 100% correct. Never. [Hyrum's law](https://www.hyrumslaw.com/) is very clear and IMHO very true about this. 
   
   But we can state our intentions, and make an effort to keep those "explicitly intended" APIs controlled. 
   
   And first ... we need to agree on what our intentions are. If we don't document them and don't agree what is and what is not our intention for the public API - everyone will have their own understanding of it. And often those understandings will be different ones (for example many users still use the DB of Airflow and rely on its structure because they assume they can rely on it). You wrote yourself that you prefer to `minimize backcompat surface` and "everything that's not an operator / sensor / hook should be private". But this is only your understanding - others might say "everything in utils can be used by DAG authors and provider writers". Both statements are possible - both orthogonally different. Either might be a deliberate decision we could make as a community. But we need to make that decision and be very explicit about it. This is a starting point that is missing now IMHO.
   
   I just want to have a page where we all agree what we intentionally make "public", and have clear rules that all the rest is not.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046502612


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers,
+and are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.
+
+Which classes and packages are a public API of Apache Airflow?
+==============================================================
+
+The specific classes and packages that are considered to be part of the public API of Apache
+Airflow can vary depending on the version of Airflow that you are using. In general, however, the
+public API of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+Some examples of the classes and packages that may be considered as the public API of Apache Airflow include:
+
+* The :class:`~airflow.models.dag.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* :class:`~airflow.models.connection.Connection`, :class:`~airflow.models.variable.Variable` and
+  :class:`~airflow.models.xcom.XCom` classes that are used to access and manage the execution environment of
+  the tasks and to perform inter-task communication.
+* :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of
+  Airflow that are responsible for executing tasks. There are a number of different executor implementations
+  available in Airflow, each with its own unique characteristics and capabilities. You can develop your own
+  executors in the way that suits your needs.

Review Comment:
   This one assumes that the discussion is completed.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1351052753

   BTW. That was a good exercise (and learning) for GPT Chat after the reviews. I don't think it is good at writing docs like that - it's far too chatty (it's a chat bot, after all). Almost none of the original text from the GPT chat is left, and the number of omissions (which were added during the review) and mistakes it made was quite big.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1051382483


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:

Review Comment:
   This whole section went away with consolidating the two packages.  



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1050150795


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* The `Python API <python-api-ref>`_ (except the explicitly mentioned classes below) is considered an
+  internal implementation detail and you should not assume it will be maintained
+  in a backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are decoupled and Airflow does not rely on built-in set of executors.

Review Comment:
   I updated the note now:
   
   >    There are a number of different executor implementations built-in Airflow, each with its own unique
   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
   only as of version 2.6 executors are fully decoupled and Airflow does not rely on built-in set of executors.
   You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
   of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
   executors, and custom executors could not provide full functionality that built-in executors had.
   
   I hope it is much closer to "truth" now - it mentions that it was possible to implement executors before and people succeeded with it (truth 1), but that you could have hit the limits because of some hard-coded behaviours (truth 2), and that in 2.6 you can rely on the public API of the executor to be on-par with the built-in executors (truth 3).
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] dstandish commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
dstandish commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058631912


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow, but
+   only as of version 2.6 are executors fully decoupled and Airflow does not rely on a built-in set of executors.
+   It was possible to implement Executors before Airflow 2.6, and a number of people succeeded in doing so,
+   but there were some hard-coded behaviours that preferred built-in executors, and custom executors could
+   not provide the full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   Sorry... so ... specifically in the provider package... is everything private, public, or some private some public?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] dstandish commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
dstandish commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058640908


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow, but
+   only as of version 2.6 are executors fully decoupled and Airflow does not rely on a built-in set of executors.
+   It was possible to implement Executors before Airflow 2.6, and a number of people succeeded in doing so,
+   but there were some hard-coded behaviours that preferred built-in executors, and custom executors could
+   not provide the full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   And if it's both then how do we distinguish?  I think the current convention is underscores OR meta private tag.
   
   But interested in your thoughts. We should include that in this proposal, I think.
   
   Personally I think it is an important goal to minimize backcompat surface area, so I would suggest that everything that's not an operator / sensor / hook should be private (i.e. all utils / helper code).



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058655785


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow, but
+   only as of version 2.6 are executors fully decoupled and Airflow does not rely on a built-in set of executors.
+   It was possible to implement Executors before Airflow 2.6, and a number of people succeeded in doing so,
+   but there were some hard-coded behaviours that preferred built-in executors, and custom executors could
+   not provide the full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   > It's also important to clarify that some methods are intentionally "private". E.g. in the operator, it might have helper methods. But these methods are more implementation detail right? If users subclass, though, they may use and depend on these. So what do we do? Ideally, from maintainer perspective, we want the overall behavior of the operator to be subject to backcompat but not any of its methods. Currently the convention is, i believe, all methods are assumed public unless prefix with underscore / marked meta private. WDYT?
   
   For providers it's far easier because there is no "internal" sharing between multiple modules. Each of them is really either "standalone" or uses "public" classes from other providers. And with common.sql we even introduced stubgen to generate the actual "public interface" so that MyPy can surface violations as errors. And we even added friction (the necessity of regenerating the stubs) when someone changes the API by accident.
   
   > One problem with the public / private distinction is... from tool perspective ... even using _my_func across modules is sorta forbidden. But there is a difference between the internal "private" and the user-facing "private". We need to be able to write helper code that can be used across modules (internal public), but not have it be user-facing public.
   
   This is a very accurate assessment.
   
   That's why I proposed to allowlist and be explicit about everything that is public. And from the tool perspective, one thing is important (what we did for common.sql) - to keep the list of all public methods, and whenever something there is *removed* or *updated*, fail the build and make it an explicit action (friction) for the one who modifies it accidentally. We can also add a tool for whoever builds on Airflow to check that they are not using something that is not "intentionally" public (we've done that for common.sql - via MyPy stubs).
   
   But we cannot (yet) do any automation in Airflow, because first we need to agree on and document what is and what is not public. We need a good list of things that should be "public" - so that we can (eventually) add friction to removing or updating those, and possibly eventually add a tool for others to verify that they are not using something they should not.
   
   This doc here is mostly to start the discussion and work out a good set of classes/packages that we should consider "really public". For now in a document, eventually "guarded automatically".
   
   BTW. As I mentioned in many places, we are not able to get it 100% correct. Never. [Hyrum's law](https://www.hyrumslaw.com/) is very clear and IMHO very true about this. 
   
   But we can state our intentions, and make an effort to keep those "explicitly intended" APIs controlled. 
   
   And first ... we need to agree on what our intentions are. If we don't document them and don't agree on what is and what is not our intention for the public API - everyone will have their own understanding of it. And often those understandings will be different ones (for example, many users still use the Airflow DB and rely on its structure because they assume they can rely on it). You wrote yourself that you prefer to `minimize backcompat surface` and "everything that's not an operator / sensor / hook should be private". But this is only your understanding - others might say "everything in utils can be used by DAG authors and provider writers". Both statements are possible - and orthogonally different. Either might be a deliberate decision we should make as a community. But we need to make that decision and be very explicit about it. This is a starting point that is missing now, IMHO.
   
   I just want to have a page where we all agree what we intentionally make "public", and have clear rules that all the rest is not.
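   To make the stub idea concrete, a provider's intended public surface could be pinned with a ``.pyi`` file roughly along these lines (a purely illustrative sketch with hypothetical names, not the actual ``common.sql`` stubs):

       # my_provider/hooks/my_service.pyi - hypothetical stub describing the intended public surface
       from airflow.hooks.base import BaseHook

       class MyServiceHook(BaseHook):
           conn_id: str
           def __init__(self, conn_id: str = ...) -> None: ...
           # Only what is listed here is "intentionally public"; anything else in the
           # corresponding .py module is an implementation detail and may change.
           def get_records(self, sql: str) -> list: ...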



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058656390


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow, but
+   only as of version 2.6 are executors fully decoupled and Airflow does not rely on a built-in set of executors.
+   It was possible to implement Executors before Airflow 2.6, and a number of people succeeded in doing so,
+   but there were some hard-coded behaviours that preferred built-in executors, and custom executors could
+   not provide the full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   BTW. Eventually we might be able to produce an Airflow typeshed https://github.com/python/typeshed with stubs for only "externally public" APIs. This is what the "ultimate" goal should be, I think.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] dstandish commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
dstandish commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058722499


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow, but
+   only as of version 2.6 are executors fully decoupled and Airflow does not rely on a built-in set of executors.
+   It was possible to implement Executors before Airflow 2.6, and a number of people succeeded in doing so,
+   but there were some hard-coded behaviours that preferred built-in executors, and custom executors could
+   not provide the full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   > you wrote yourself that you prefer to minimize backcompat surface and "everything that's not an operator / sensor / hook should be private".
   
   Just want to clarify that, specifically with that example about "only hook / sensor / operator are public", I was not quite saying that's what I think it should be, but speaking to the ideal from the perspective of a maintainer. Strictly with a maintainer hat on, your job is easier the less backcompat you have to worry about.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046562614


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.
+
+Which classes and packages are a public API of Apache Airflow?
+==============================================================
+
+The specific classes and packages that are considered to be part of the public API of Apache
+Airflow can vary depending on the version of Airflow that you are using. In general, however, the
+public API of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+Some examples of the classes and packages that may be considered as the public API of Apache Airflow include:

Review Comment:
   Yes. I removed the examples and made it more complete (looking also for more classes that should be added here).
   
   Eventually we might want to automate control over it, but we need to start with a documented list.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046551405


##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,49 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are a number of ways to extend the capabilities of Apache Airflow. Some common methods
+for extending Airflow include:
+
+* Writing :doc:`Custom Operators <howto/custom-operator>`_ : Operators are the building blocks of DAGs in
+  Airflow, and can be used to perform specific tasks within a DAG. By writing custom operators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Writing :doc:`Custom Decorators <howto/create-custom-decorator>`_ : Custom decorators are a way to
+  extend the functionality of the Airflow ``@task`` decorator. By writing custom decorators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Creating custom :doc:`plugins`: Airflow plugins are extensions to the core system that provide additional
+  features and functionality. By creating custom plugins, you can add new capabilities to Airflow,
+  such as adding custom UI components, creating custom macros, creating your own TimeTables,
+  adding Operator Extra Links, Listeners.
+
+* Writing custom :doc:`provider packages <apache-airflow-providers:index>`: Providers are the components
+  of Airflow that are responsible for connecting to and interacting with external services. By writing
+  custom providers, you can add support for new types of external services, such as databases or
+  cloud services, to Airflow.
+
+With all the above methods, you can use the :doc:`public-airflow-api` to write your
+custom Python code to interact with Airflow.
+
+However, you can also extend Airflow without directly integrating your Python code - via the
+:doc:`stable-rest-api-ref` and the :doc:`stable-cli-ref`. These methods allow you to interact with
+Airflow as an external system:

Review Comment:
   Yes. Done that.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046496887


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.

Review Comment:
   This is done by me.



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.

Review Comment:
   This is done by GPT chat.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1347632747

   (and docs building is fixed here).


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1066811953


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+You need to use the Airflow Public Interfaces whenever you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - the basic building
+  blocks of DAGs. This can be done by DAG authors to add missing functionality in their DAGs, or by those who
+  write reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks and Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide a reusable set of functionality for an external service or application that Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied on to keep backwards-compatible signatures and behaviours within
+a MAJOR version of Airflow. On the other hand, classes and methods starting with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change at any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_, though its behaviour might change
+in details (such as output format and available flags), so if you want to rely on it in a programmatic
+way, the Stable REST API is recommended.
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGS
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
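+
+For illustration, a minimal DAG definition might look like the sketch below (the DAG id, schedule and the
+single no-op task are hypothetical examples, not something shipped with Airflow):
+
+.. code-block:: python
+
+    from datetime import datetime
+
+    from airflow.models.dag import DAG
+    from airflow.operators.empty import EmptyOperator
+
+    # A hypothetical, minimal DAG with a single no-op task.
+    with DAG(
+        dag_id="example_minimal_dag",
+        start_date=datetime(2022, 1, 1),
+        schedule="@daily",
+        catchup=False,
+    ) as dag:
+        EmptyOperator(task_id="do_nothing")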
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that perform an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about operators in :doc:`core-concepts/operators` and :doc:`core-concepts/sensors`.
+You can also learn how to write a custom operator in :doc:`howto/custom-operator`.
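+
+As an illustration, a custom sensor derived from ``BaseSensorOperator`` might look like the following
+sketch (the class name and the local file path it checks are hypothetical):
+
+.. code-block:: python
+
+    import os
+
+    from airflow.sensors.base import BaseSensorOperator
+
+
+    class LocalFileSensor(BaseSensorOperator):
+        """Hypothetical sensor that waits until a local file exists."""
+
+        def __init__(self, *, filepath: str, **kwargs):
+            super().__init__(**kwargs)
+            self.filepath = filepath
+
+        def poke(self, context) -> bool:
+            # Called every ``poke_interval`` seconds until it returns True.
+            return os.path.exists(self.filepath)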
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by subclassing them (see the sketch after the list below):
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
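+
+For illustration, a minimal custom hook might look like the sketch below (the class name, connection id and
+the simplified "client" it returns are hypothetical):
+
+.. code-block:: python
+
+    from airflow.hooks.base import BaseHook
+
+
+    class MyServiceHook(BaseHook):
+        """Hypothetical hook that builds a client for an external service."""
+
+        def __init__(self, my_conn_id: str = "my_service_default"):
+            super().__init__()
+            self.my_conn_id = my_conn_id
+
+        def get_conn(self):
+            # Look up credentials stored in an Airflow Connection.
+            conn = self.get_connection(self.my_conn_id)
+            # Build and return whatever client object the service needs (simplified here).
+            return {"host": conn.host, "login": conn.login, "password": conn.password}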
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables` and :doc:`core-concepts/xcoms`.
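+
+For illustration, the sketch below shows how these utilities are typically used from inside task callables
+(the variable key, connection id and task id are hypothetical):
+
+.. code-block:: python
+
+    from airflow.hooks.base import BaseHook
+    from airflow.models import Variable
+
+
+    def produce(**context):
+        # Read an Airflow Variable and a Connection defined in the metastore or a secrets backend.
+        bucket = Variable.get("my_bucket_name", default_var="example-bucket")
+        conn = BaseHook.get_connection("my_service_default")
+        # The return value of a PythonOperator/TaskFlow callable is pushed to XCom automatically.
+        return f"{conn.host}/{bucket}"
+
+
+    def consume(**context):
+        # Pull the value pushed by the upstream task via XCom.
+        uri = context["ti"].xcom_pull(task_ids="produce_task")
+        print(uri)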
+
+Public Exceptions
+-----------------
+
+When writing custom Operators and Hooks, you can handle and raise the public Exceptions that Airflow
+exposes (see the sketch after the list below):
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
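+
+For illustration, an operator's ``execute`` method might use these exceptions as in the sketch below (the
+operator and the payload check are hypothetical):
+
+.. code-block:: python
+
+    from __future__ import annotations
+
+    from airflow.exceptions import AirflowException, AirflowSkipException
+    from airflow.models.baseoperator import BaseOperator
+
+
+    class CheckPayloadOperator(BaseOperator):
+        """Hypothetical operator showing how public exceptions are typically raised."""
+
+        def __init__(self, *, payload: dict | None = None, **kwargs):
+            super().__init__(**kwargs)
+            self.payload = payload
+
+        def execute(self, context):
+            if self.payload is None:
+                # Mark the task as skipped rather than failed.
+                raise AirflowSkipException("Nothing to do - no payload provided.")
+            if "id" not in self.payload:
+                # Fail the task with a clear error message.
+                raise AirflowException("Payload is missing the required 'id' field.")
+            return self.payload["id"]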
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses the Plugin mechanism to extend the capabilities of the Airflow platform. Plugins allow you to
+extend the Airflow UI, and they are also the way to expose the customizations below (Triggers, Timetables,
+Listeners). Providers can also implement plugin endpoints and customize the Airflow UI and those customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
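+
+As an illustration, a plugin that registers custom components might look like the sketch below (the module
+paths and class names are hypothetical; each listed component would be defined elsewhere in your own code):
+
+.. code-block:: python
+
+    from airflow.plugins_manager import AirflowPlugin
+
+    # Hypothetical custom components defined in your own package.
+    from my_company.airflow.extra_links import MyServiceLink
+    from my_company.airflow.listeners import my_listener_module
+    from my_company.airflow.timetables import AfterOfficeHoursTimetable
+
+
+    class MyCompanyPlugin(AirflowPlugin):
+        name = "my_company_plugin"
+        timetables = [AfterOfficeHoursTimetable]
+        global_operator_extra_links = [MyServiceLink()]
+        listeners = [my_listener_module]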
+
+Here are the ways in which Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow can be configured to use Triggers in order to implement ``asyncio``-compatible Deferrable Operators.
+All Triggers derive from :class:`~airflow.triggers.base.BaseTrigger`.
+
+Airflow has a set of Triggers that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/triggers/index
+
+You can read more about Triggers in :doc:`authoring-and-scheduling/deferring`.
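+
+For illustration, a minimal custom trigger might look like the sketch below (the class name, the import
+path returned by ``serialize`` and the "wait a number of seconds" logic are hypothetical and deliberately
+trivial):
+
+.. code-block:: python
+
+    import asyncio
+
+    from airflow.triggers.base import BaseTrigger, TriggerEvent
+
+
+    class WaitSecondsTrigger(BaseTrigger):
+        """Hypothetical trigger that fires after a given number of seconds."""
+
+        def __init__(self, seconds: int):
+            super().__init__()
+            self.seconds = seconds
+
+        def serialize(self):
+            # Tell Airflow how to recreate this trigger in the triggerer process.
+            return ("my_company.triggers.WaitSecondsTrigger", {"seconds": self.seconds})
+
+        async def run(self):
+            # Runs in the triggerer's asyncio event loop; must not block.
+            await asyncio.sleep(self.seconds)
+            yield TriggerEvent({"status": "done"})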
+
+Timetables
+----------
+
+Custom timetable implementations provide Airflow's scheduler additional logic to
+schedule DAG runs in ways not possible with built-in schedule expressions.
+All Timetables derive from :class:`~airflow.timetables.base.Timetable`.
+
+Airflow has a set of Timetables that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :maxdepth: 1
+
+  _api/airflow/timetables/index
+
+You can read more about Timetables in :doc:`howto/timetable`.
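+
+For illustration, a very simplified custom timetable (midnight-to-midnight daily intervals, with catchup
+handling deliberately omitted) might look like the sketch below; the class name is hypothetical:
+
+.. code-block:: python
+
+    from pendulum import UTC, DateTime, Time
+
+    from airflow.timetables.base import DagRunInfo, DataInterval, Timetable
+
+
+    class SimpleDailyTimetable(Timetable):
+        """Hypothetical timetable scheduling one midnight-to-midnight run per day."""
+
+        def infer_manual_data_interval(self, *, run_after: DateTime) -> DataInterval:
+            start = DateTime.combine(run_after.date(), Time.min).replace(tzinfo=UTC)
+            return DataInterval(start=start, end=start.add(days=1))
+
+        def next_dagrun_info(self, *, last_automatic_data_interval, restriction):
+            if last_automatic_data_interval is not None:
+                next_start = last_automatic_data_interval.end
+            elif restriction.earliest is None:
+                return None  # No start_date set - do not schedule anything.
+            else:
+                next_start = DateTime.combine(restriction.earliest.date(), Time.min).replace(tzinfo=UTC)
+            if restriction.latest is not None and next_start > restriction.latest:
+                return None  # Past the end_date - stop scheduling.
+            return DagRunInfo.interval(start=next_start, end=next_start.add(days=1))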
+
+Listeners
+---------
+
+Listeners are the way the Airflow platform as a whole can be extended to respond to DAG/Task lifecycle events.
+
+This is implemented via :class:`~airflow.listeners.listener.ListenerManager` class that provides hooks that
+can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+You can read more about Listeners in :doc:`administration-and-deployment/listeners`.
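+
+For illustration, a minimal listener module might look like the sketch below (the logging done in the hook
+is hypothetical; the module would then be registered via the ``listeners`` attribute of a plugin):
+
+.. code-block:: python
+
+    import logging
+
+    from airflow.listeners import hookimpl
+
+    log = logging.getLogger(__name__)
+
+
+    @hookimpl
+    def on_task_instance_success(previous_state, task_instance, session):
+        # Called by Airflow whenever a task instance finishes successfully.
+        log.info("Task %s in DAG %s succeeded", task_instance.task_id, task_instance.dag_id)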
+
+Extra Links
+-----------
+
+Extra links are dynamic links that can be added to Airflow independently of custom Operators. Normally
+they are defined by Operators, but plugins allow you to override the links on a global level.
+
+You can read more about the Extra Links in :doc:`/howto/define-extra-link`.
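+
+For illustration, a global extra link registered through a plugin might look like the sketch below (the
+link name, URL and plugin wiring are hypothetical):
+
+.. code-block:: python
+
+    from airflow.models.baseoperator import BaseOperatorLink
+    from airflow.plugins_manager import AirflowPlugin
+
+
+    class MyMonitoringLink(BaseOperatorLink):
+        """Hypothetical link shown on every task in the UI."""
+
+        name = "My Monitoring"
+
+        def get_link(self, operator, *, ti_key):
+            # Build the URL from the task instance key (dag_id, task_id, run_id).
+            return f"https://monitoring.example.com/{ti_key.dag_id}/{ti_key.task_id}/{ti_key.run_id}"
+
+
+    class MyLinksPlugin(AirflowPlugin):
+        name = "my_links_plugin"
+        global_operator_extra_links = [MyMonitoringLink()]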
+
+Using Public Interface to integrate with external services and applications
+===========================================================================
+
+In order to integrate Airflow with external services and applications you can develop Hooks and Operators,
+similarly to DAG Authors, but you can also extend the core functionality of Airflow by integrating it
+with those external services. This can be done by simply adding the extension classes to PYTHONPATH
+on your system and configuring Airflow to use them (there is no need to use the plugin mechanism). However,
+you can also package and release the Hooks, Operators and core extensions in the form of
+:doc:`provider packages <apache-airflow-providers:index>`. You can read more about core extensions
+delivered by providers via :doc:`provider packages <apache-airflow-providers:core-extensions/index>`.

Review Comment:
   Yes, it is that. But I would also leave the links to the detailed documents of the providers and the "core-extensions" provided by them. I find it really important to inter-link our documentation and refer to more details when we are explaining something, as that is how the reader can easily look for more information.
   
   I will add a fixup after I bulk-merge your corrections.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046551983


##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,49 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are a number of ways to extend the capabilities of Apache Airflow. Some common methods
+for extending Airflow include:
+
+* Writing :doc:`Custom Operators <howto/custom-operator>`_ : Operators are the building blocks of DAGs in
+  Airflow, and can be used to perform specific tasks within a DAG. By writing custom operators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Writing :doc:`Custom Decorators <howto/create-custom-decorator>`_ : Custom decorators are a way to
+  extend the functionality of the Airflow ``@task`` decorator. By writing custom decorators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Creating custom :doc:`plugins`: Airflow plugins are extensions to the core system that provide additional
+  features and functionality. By creating custom plugins, you can add new capabilities to Airflow,
+  such as adding custom UI components, creating custom macros, creating your own TimeTables,
+  adding Operator Extra Links, Listeners.
+
+* Writing custom :doc:`provider packages <apache-airflow-providers:index>`: Providers are the components
+  of Airflow that are responsible for connecting to and interacting with external services. By writing
+  custom providers, you can add support for new types of external services, such as databases or
+  cloud services, to Airflow.
+
+With all the above methods, you can use the :doc:`public-airflow-api` to write your
+custom Python code to interact with Airflow.
+
+However, you can also extend Airflow without directly integrating your Python code - via the
+:doc:`stable-rest-api-ref` and the :doc:`stable-cli-ref`. These methods allow you to interact with
+Airflow as an external system:
+
+Particularly, the :doc:`stable-rest-api-ref` of Apache Airflow provides a number of methods for programmatic

Review Comment:
   Yep. Left it only in headers.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1347617665

   First round review passed - all comments applied @BasPH @o-nikolas 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046499266


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.

Review Comment:
   I think we have (for now) a LOT of classes that should not be used but are not _underscored, simply because the code was never structured in a way that separates them. I am afraid that, for now at least, all we can do is be a bit vague.
   
   Later - when we introduce a formal list of classes - we can change it. But I want to first document what we have rather than go into the rabbit hole of changing 100s of classes and methods to contain an underscore. It's just a pragmatic approach.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1048330340


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:

Review Comment:
   I would like to keep it and explain the context. It is all about setting expectations.
   
   I think if we do not provide that context, users might mistakenly think that this list of "things" that are excluded is "closed" (and there are no others). By explicitly explaining that here we are only showing things that some people might easily take for granted, we provide more context - why we are listing those things here.
   
   We are listing them here because people might (and did) make assumptions that those things are "public interface". This already happened, and it is likely to happen again. So by listing those things here, we are addressing the likely case that people will attempt to rely on them. This is purely based on past discussions and assumptions people have already made, so it is not "guesswork". It is based on real cases.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] ashb commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
ashb commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046942317


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,110 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_ - all of which are considered Public Interfaces.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in a
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in a backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below) are considered
+  internal implementation details and you should not assume they will be maintained
+  in a backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.models.dag.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` classes that are used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` classes that are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` class that is used to define custom plugins.
+
+.. versionadded:: 2.5
+
+* The :class:`~airflow.listeners.listener` classes that are used to define your own listeners for DAG/Task events
+
+.. versionadded:: 2.6

Review Comment:
   These version added don't line up with anything. 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1047362049


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,110 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_ - all of which are considered Public Interfaces.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in a
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in a backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below) are considered
+  internal implementation details and you should not assume they will be maintained
+  in a backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.models.dag.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` classes that are used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` classes that are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` class that is used to define custom plugins.
+
+.. versionadded:: 2.5
+
+* The :class:`~airflow.listeners.listener` classes that are used to define your own listeners for DAG/Task events
+
+.. versionadded:: 2.6

Review Comment:
   Restructured the docs a  bit to let them contain content.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1050144300


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in a
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in a backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below) are considered
+  internal implementation details and you should not assume they will be maintained
+  in a backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are decoupled and Airflow does not rely on built-in set of executors.

Review Comment:
   > This is simple not true though -- I know of multiple people (Astro and others) who have successfully written custom executors against 2.1 and above.
   
   It's a very strong statement to claim something is not true. But I think we are mixing two things here. I know there are people who wrote some implementations of custom executors that worked in some cases. This is true. I agree with that statement and I am not arguing with it at all. In fact I wholeheartedly agree with your statement @ashb. And I believe it does not invalidate what I wrote (or at least what my intention was). What I wrote was that earlier attempts to write an executor could have failed in a number of non-obvious cases, because there were a number of hard-coded behaviours that made the executor not a "well defined API".
   
   I do not want to attempt to describe what things you should avoid and what problems you would hit when trying to implement Executors before. I think it's a topic that requires some intrinsic analysis of where the "hard-coded" behaviours could cause problems. And if someone would like to attempt it - I am all ears.
   
   But - let me rephrase it in a way that makes it clear that there were successful attempts before, but that you could have hit difficult-to-diagnose problems because the built-in executors were hard-coded (and I will mention that there were cases of success in writing limited implementations of executors). I hope this will be acceptable.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1050166158


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in a
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in a backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below) are considered
+  internal implementation details and you should not assume they will be maintained
+  in a backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are decoupled and Airflow does not rely on built-in set of executors.

Review Comment:
   BTW. I can also link to AIP-51 for details of what "could not be done" before. I think it describes it pretty well (and completely). But we rarely link to AIPs in our docs (and it was criticised before when I added it), so I think just leaving a generic note is better.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058656390


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The Executor interface was available in earlier versions of Airflow,
+   but only as of version 2.6 are executors fully decoupled and Airflow no longer relies on a built-in
+   set of executors. You could implement Executors before Airflow 2.6, and a number of people succeeded
+   in doing so, but there were some hard-coded behaviours that preferred the built-in executors, and
+   custom executors could not provide the full functionality that the built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   BTW. Eventually we might be able to produce an airflow typeshed-like package: https://github.com/python/typeshed with stubs for only "externally public" APIs. This is what should be the "ultimate" goal, I think. 
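   To make that concrete, such a package would ship ``.pyi`` stub files describing only the supported surface. A hypothetical fragment (illustrative only, not an existing package and not an exhaustive stub) could look like:

   ```python
   # airflow/hooks/base.pyi (hypothetical stub exposing only the public surface)
   from typing import Any

   from airflow.models.connection import Connection

   class BaseHook:
       # Only the members that are part of the public surface get stubbed;
       # internal helpers are simply left out of the stub file.
       @classmethod
       def get_connection(cls, conn_id: str) -> Connection: ...
       def get_conn(self) -> Any: ...
   ```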



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] dstandish commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
dstandish commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058642484


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The Executor interface was available in earlier versions of Airflow,
+   but only as of version 2.6 are executors fully decoupled and Airflow no longer relies on a built-in
+   set of executors. You could implement Executors before Airflow 2.6, and a number of people succeeded
+   in doing so, but there were some hard-coded behaviours that preferred the built-in executors, and
+   custom executors could not provide the full functionality that the built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   It's also important to clarify that some methods are intentionally "private". E.g. an operator might have helper methods. But these methods are more of an implementation detail, right? If users subclass, though, they may use and depend on these. So what do we do? Ideally, from a maintainer perspective, we want the overall behavior of the operator to be subject to backcompat, but not _any_ of its methods. Currently the convention is, I believe, that all methods are assumed public unless prefixed with an underscore / marked meta-private. WDYT?
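   For illustration, with a hypothetical operator (not one from the codebase), that convention would look roughly like this:

   ```python
   from airflow.models.baseoperator import BaseOperator


   class MyTransferOperator(BaseOperator):
       """Hypothetical operator, shown only to illustrate the naming convention."""

       def execute(self, context):
           # execute() is the operator's public behaviour and is what
           # backwards-compatibility expectations would apply to.
           return self._build_payload(context)

       def _build_payload(self, context):
           # The leading underscore marks an implementation detail: subclasses
           # that call or override it should expect it to change without notice.
           return {"logical_date": context["logical_date"]}
   ```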



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1347532706

   > Besides several structural changes, my main concern is the naming. To me "public APIs" 99% of the time relates to a REST API. I think "public interfaces" is better suited here. WDYT?
   
   Yes. That might avoid a lot of confusion.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046543724


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.

Review Comment:
   Yep, this can indeed be moved up.
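   For completeness, "use the REST API instead of the DB" looks roughly like this from a client's point of view (a minimal sketch assuming the basic-auth API backend is enabled and the webserver listens on ``localhost:8080``; host and credentials are illustrative and need adjusting for a real deployment):

   ```python
   import requests

   # List DAGs through the Stable REST API rather than querying the metadata DB directly.
   response = requests.get(
       "http://localhost:8080/api/v1/dags",
       auth=("admin", "admin"),  # illustrative credentials
       timeout=10,
   )
   response.raise_for_status()
   for dag in response.json()["dags"]:
       print(dag["dag_id"], "paused:", dag["is_paused"])
   ```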



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1348978875

   Next round of reviews completed. I've also added Trigger as it was missing. I think it's coming up nicely with everyone's contributions.
   
   Please take a look and see if there are other things I'm missing; maybe it is close to mergeability.
   
   BTW. The more reviews, the less of the GPT chatbot's text remains :)
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] o-nikolas commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
o-nikolas commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046245918


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.

Review Comment:
   I like the concreteness of this bit :+1: 



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used

Review Comment:
   ```suggestion
   The public API of Apache Airflow is a set of programming interfaces that are designed to be used
   ```



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.
+
+Which classes and packages are a public API of Apache Airflow?

Review Comment:
   So far you've been capitalizing Public API
   ```suggestion
   Which classes and packages are part of the Public API of Apache Airflow?
   ```



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.
+
+Which classes and packages are a public API of Apache Airflow?
+==============================================================
+
+The specific classes and packages that are considered to be part of the public API of Apache
+Airflow can vary depending on the version of Airflow that you are using. In general, however, the
+public API of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+Some examples of the classes and packages that may be considered as the public API of Apache Airflow include:
+
+* The :class:`~airflow.models.dag.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* :class:`~airflow.models.connection.Connection`, :class:`~airflow.models.variable.Variable` and
+  :class:`~airflow.models.xcom.XCom` classes that are used to access and manage the execution environment of
+  the tasks and to perform inter-task communication.
+* :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of
+  Airflow that are responsible for executing tasks. There are a number of different executor implementations
+  available in Airflow, each with its own unique characteristics and capabilities. You can develop your own
+  executors in the way that suits your needs.

Review Comment:
   ```suggestion
     executors in the way that suits your needs by inheriting from the BaseExecutor.
   ```
   
   A relevant ongoing discussion about whether and how we should support external executors that don't inherit from BaseExecutor: https://github.com/apache/airflow/issues/28276#issuecomment-1344899475
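   To make the "inheriting from the BaseExecutor" part concrete, a bare-bones custom executor could be sketched like this (illustrative only; it runs each task command as a local subprocess and skips the bookkeeping a production executor would need):

   ```python
   import subprocess

   from airflow.executors.base_executor import BaseExecutor


   class NaiveSubprocessExecutor(BaseExecutor):
       """Hypothetical executor, shown only to illustrate subclassing BaseExecutor."""

       def __init__(self, *args, **kwargs):
           super().__init__(*args, **kwargs)
           self._running = {}  # TaskInstanceKey -> Popen handle

       def execute_async(self, key, command, queue=None, executor_config=None):
           # `command` is the already-built "airflow tasks run ..." argument list.
           self._running[key] = subprocess.Popen(command)

       def sync(self):
           # Called periodically by the scheduler loop; report finished tasks back.
           for key, proc in list(self._running.items()):
               code = proc.poll()
               if code is None:
                   continue  # still running
               if code == 0:
                   self.success(key)
               else:
                   self.fail(key)
               del self._running[key]

       def end(self):
           # Wait for anything still running when the executor shuts down.
           for proc in self._running.values():
               proc.wait()
   ```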



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?

Review Comment:
   ```suggestion
   Is the Airflow DB part of the Public API of Airflow?
   ```



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.
+
+Which classes and packages are a public API of Apache Airflow?
+==============================================================
+
+The specific classes and packages that are considered to be part of the public API of Apache
+Airflow can vary depending on the version of Airflow that you are using. In general, however, the
+public API of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+Some examples of the classes and packages that may be considered as the public API of Apache Airflow include:
+
+* The :class:`~airflow.models.dag.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* :class:`~airflow.models.connection.Connection`, :class:`~airflow.models.variable.Variable` and
+  :class:`~airflow.models.xcom.XCom` classes that are used to access and manage the execution environment of
+  the tasks and to perform inter-task communication.
+* :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of
+  Airflow that are responsible for executing tasks. There are a number of different executor implementations
+  available in Airflow, each with its own unique characteristics and capabilities. You can develop your own
+  executors in the way that suits your needs.
+
+.. note::
+
+    Originally Executors had a number of internal details and coupling with internal Airflow core logic
+    that made it difficult to write your own executors. But as of
+    Airflow 2.6.0 and `AIP-51 <https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-51+Removing+Executor+Coupling+from+Core+Airflow>`_
+    this is no longer the case. The Executors are now fully decoupled and are a part of the public API.

Review Comment:
   😍
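
For a rough sense of what the decoupling described in the quoted note enables, a bare-bones custom executor built on `BaseExecutor` could look like the sketch below. This is only an illustration - the class is hypothetical and it ignores the parallelism limits, heartbeats and error handling a real executor has to deal with.

```python
import subprocess

from airflow.executors.base_executor import BaseExecutor


class LocalSubprocessExecutor(BaseExecutor):
    """Hypothetical executor that runs each queued task command as a local subprocess."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._running = {}

    def execute_async(self, key, command, queue=None, executor_config=None):
        # A real executor would hand `command` off to an external system
        # (Kubernetes, Celery, ...); here we simply start it locally.
        self._running[key] = subprocess.Popen(command)

    def sync(self):
        # Called on every executor heartbeat: report finished tasks back to Airflow.
        for key, proc in list(self._running.items()):
            return_code = proc.poll()
            if return_code is None:
                continue  # still running
            (self.success if return_code == 0 else self.fail)(key)
            del self._running[key]

    def end(self):
        # Wait for anything still in flight before shutting down.
        for proc in self._running.values():
            proc.wait()
        self.sync()
```

A production executor would additionally respect `self.parallelism` and implement `terminate()`, which is part of why the built-in executors are considerably larger.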



##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.

Review Comment:
   This is very vague, maybe too vague to be functionally useful?
   
   One concrete example that I think would really help in functional decision-making is:
   Any "private" class/method/function ([underscore prefixed](https://docs.python.org/2/tutorial/classes.html#private-variables-and-class-local-references)) in the Airflow code base is not part of the public API. If you're a third party and you're importing such methods and using them in your code or plugins, we **do not** make any guarantees about those methods (how they function or even whether they will exist from release to release). We should then start marking code as private more regularly; there is a tonne of code in Airflow which really is private but not marked as such.
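
To make the distinction concrete, a minimal custom operator that sticks to the documented public pieces might look like the hypothetical sketch below - it touches only the `BaseOperator` constructor, `execute` and `self.log`, and nothing underscore-prefixed:

```python
from airflow.models.baseoperator import BaseOperator


class GreetOperator(BaseOperator):
    """Hypothetical operator relying only on the public BaseOperator contract."""

    def __init__(self, name: str, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def execute(self, context):
        # `execute` is the public entry point Airflow calls at runtime.
        self.log.info("Hello %s", self.name)
        return self.name
```

Code like this can reasonably expect to keep working across releases; anything that reaches into underscore-prefixed internals cannot.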





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046506516


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow

Review Comment:
   Yep. And coupled with Public Interface it makes sense to make it consistent.





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1050144300


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are decoupled and Airflow does not rely on built-in set of executors.

Review Comment:
   > This is simple not true though -- I know of multiple people (Astro and others) who have successfully written custom executors against 2.1 and above.
   
   It's a very strong statement to claim something is not true. But I think we are mixing two things here. I know there are people who wrote some implementations of custom executors that worked in some cases. This is true. I agree with that statement. And I am not arguing with that statement at all. In fact, I wholeheartedly agree with your statement, @ash. And I believe it does not invalidate what I wrote (or at least what my intention was). What I wrote is that earlier attempts to write an executor could have failed in a number of non-obvious cases, because there were a number of hard-coded behaviours that made the executor not a "well-defined API".
   
   I do not want to attempt to describe which things you should avoid and which problems you would hit when trying to implement Executors before. I think it's a topic that requires some intrinsic analysis of where the "hard-coded" behaviours could cause problems. And if someone would like to attempt it - I am all ears.
   
   But - let me rephrase it in a way that makes it clear that there were successful attempts before, but that you could have hit difficult-to-diagnose problems because the built-in executors were hard-coded (and I will mention that there were cases of success in writing limited implementations of executors). I hope this will be acceptable.





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1050151517


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are decoupled and Airflow does not rely on built-in set of executors.

Review Comment:
   I hope there are no more non-truths in this statement, @ashb.





[GitHub] [airflow] dstandish commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
dstandish commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058642484


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are fully decoupled and Airflow does not rely on built-in set of executors.
+   You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
+   of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
+   executors, and custom executors could not provide full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   It's also important to clarify that some methods are intentionally "private". E.g. an operator might have helper methods. But these methods are more of an implementation detail, right? If users subclass, though, they may use and depend on these. So what do we do? Ideally, from a maintainer's perspective, we want the overall behavior of the operator to be subject to backcompat, but not all of its methods. Currently the convention is, I believe, that all methods are assumed public unless prefixed with an underscore / marked meta-private. WDYT?
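
A short sketch of that convention on a hypothetical operator - `execute` is the public contract, while the underscore-prefixed helpers (optionally also marked with the Sphinx `:meta private:` field) stay internal:

```python
from airflow.models.baseoperator import BaseOperator


class TransferOperator(BaseOperator):
    """Hypothetical operator: only its overall behaviour is subject to backcompat."""

    def execute(self, context):
        # Public behaviour: move rows from a source to a destination.
        rows = self._fetch_rows()
        self._load_rows(rows)

    def _fetch_rows(self):
        """Internal helper, no compatibility guarantee for subclasses."""
        return []

    def _load_rows(self, rows):
        """:meta private:"""  # also hidden from the generated API docs
        for row in rows:
            self.log.info("loading %s", row)
```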





[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1358275835

   Any more comments on my explanation?




[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1355754952

   Hey everyone - I spoke to @TohnJhomas and, after we merged his big restructuring in #27235, we decided together to move all the changes for "customizing airflow" and "public-airflow-interface" to "administration-and-deployment".
   
   This is mostly what it is all about - people writing DAGs are rarely concerned with changes that would break the API; it is more the people who install and administer Airflow that are concerned about it. But I think we can still discuss what the best place is, if others think otherwise.
   
   Looking forward to merging this one after the big restructure of the docs is merged.




[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1051380916


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.

Review Comment:
   Yes. It was originally higher level, but it evolved into nearly the same over time. I will consolidate.





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1051383543


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are fully decoupled and Airflow does not rely on built-in set of executors.
+   You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
+   of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
+   executors, and custom executors could not provide full functionality that built-in executors had.

Review Comment:
   I already explained it in a pretty long message here: https://github.com/apache/airflow/pull/28300#discussion_r1048325485 - and there is more context in https://github.com/apache/airflow/pull/28300#issuecomment-1351030386





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1051383543


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The Executor interface was available in earlier versions of Airflow, but
+   only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on a built-in set of executors.
+   You could have implemented Executors before Airflow 2.6, and a number of people succeeded in doing so,
+   but there were some hard-coded behaviours that favoured the built-in
+   executors, and custom executors could not provide the full functionality that built-in executors had.

Review Comment:
   I already explained it in a pretty long message here: https://github.com/apache/airflow/pull/28300#discussion_r1048325485
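
To ground the class list quoted in the diff above, here is a minimal, hypothetical sketch of a custom hook and operator built only on the classes called out there as public - ``BaseOperator``, ``BaseHook`` and ``Connection``. The hook class, connection id and endpoint are made up for illustration, and ``requests`` is assumed to be installed; this is a sketch, not the reviewed code.

```python
from __future__ import annotations

from airflow.hooks.base import BaseHook
from airflow.models.baseoperator import BaseOperator


class HttpStatusHook(BaseHook):
    """Hypothetical hook that checks a URL; the conn_id is illustrative only."""

    def __init__(self, conn_id: str = "http_default") -> None:
        super().__init__()
        self.conn_id = conn_id

    def check(self, path: str) -> int:
        import requests  # assumed to be available in the environment

        conn = self.get_connection(self.conn_id)  # Connection is part of the public interface
        response = requests.get(f"{conn.host}{path}", timeout=10)
        return response.status_code


class HttpStatusOperator(BaseOperator):
    """Hypothetical operator: fails the task unless the endpoint returns HTTP 200."""

    template_fields = ("path",)  # templated like any other operator field

    def __init__(self, *, path: str, conn_id: str = "http_default", **kwargs) -> None:
        super().__init__(**kwargs)
        self.path = path
        self.conn_id = conn_id

    def execute(self, context):
        status = HttpStatusHook(conn_id=self.conn_id).check(self.path)
        if status != 200:
            raise RuntimeError(f"Unexpected status: {status}")
        return status  # the return value is pushed to XCom automatically
```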
   





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046554283


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.

Review Comment:
   I think it's nice actually. We should not assume that the user who comes here has the same assumptions about what the public interface is. Having a short (it's short) and explanatory (it is) preamble like that actually sets the tone and provides the context. It explains to the reader why we explain all of this.

   We already have a random dump of links in various places with no explanation of why they are there. Yes, it's shorter, but I think some form of context that explains what we really want to explain is important here.
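
Since the quoted section lists the DAG, decorator, Variable and XCom interfaces, a minimal TaskFlow sketch may help show how those pieces fit together. The dag id and Variable key below are made up, and it assumes Airflow 2.4+ where the ``schedule`` argument exists.

```python
from __future__ import annotations

import pendulum

from airflow.decorators import dag, task
from airflow.models.variable import Variable


@dag(
    dag_id="public_interface_demo",  # illustrative dag id
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=None,
    catchup=False,
)
def public_interface_demo():
    @task
    def read_config() -> str:
        # Variable is called out as a public interface; the key is made up.
        return Variable.get("demo_greeting", default_var="hello")

    @task
    def greet(greeting: str) -> None:
        print(f"{greeting}, Airflow!")

    # Passing the return value between @task functions goes through XCom.
    greet(read_config())


public_interface_demo()
```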





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046546975


##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,49 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are a number of ways to extend the capabilities of Apache Airflow. Some common methods
+for extending Airflow include:
+
+* Writing :doc:`Custom Operators <howto/custom-operator>`_ : Operators are the building blocks of DAGs in
+  Airflow, and can be used to perform specific tasks within a DAG. By writing custom operators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Writing :doc:`Custom Decorators <howto/create-custom-decorator>`_ : Custom decorators are a way to
+  extend the functionality of the Airflow ``@task`` decorator. By writing custom decorators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Creating custom :doc:`plugins`: Airflow plugins are extensions to the core system that provide additional
+  features and functionality. By creating custom plugins, you can add new capabilities to Airflow,
+  such as adding custom UI components, creating custom macros, creating your own TimeTables,
+  adding Operator Extra Links, Listeners.
+
+* Writing custom :doc:`provider packages <apache-airflow-providers:index>`: Providers are the components
+  of Airflow that are responsible for connecting to and interacting with external services. By writing
+  custom providers, you can add support for new types of external services, such as databases or
+  cloud services, to Airflow.
+
+With all the above methods, you can use the :doc:`public-airflow-api` to write your
+custom Python code to interact with Airflow.
+
+However, you can also extend Airflow without directly integrating your Python code - via the
+:doc:`stable-rest-api-ref` and the :doc:`stable-cli-ref`. These methods allow you to interact with
+Airflow as an external system:
+
+Particularly, the doc:`stable-rest-api-ref` of Apache Airflow provides a number of methods for programmatic
+accessing and manipulating various aspects of the system. By using the public API, you can build
+custom tools and integrations with other systems, and automate certain aspects of the Airflow workflow.
+
+Overall, there are many different ways to extend the capabilities of Apache Airflow, and the specific
+approach that you choose will depend on your specific needs and requirements.
+By combining these different methods, you can create a powerful and flexible workflow management
+system that can help you automate and optimize your data pipelines.

Review Comment:
   Yep. That was ChatGPT repeating itself.





[GitHub] [airflow] ashb commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
ashb commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046943619


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,110 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_ - all of which are considered Public Interfaces.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.models.dag.DAG`, which provides a way to define and manage DAGs in Airflow.

Review Comment:
   I think I'd like the "public" interface to not include the module path where possible, so 
   ```suggestion
   * The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
   ```
   
   (Reasoning: I think we should move towards `airflow.models` being internal by default!)
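
A tiny illustration of the suggested spelling, relying only on the top-level re-export that ``airflow/__init__.py`` already provides:

```python
# Suggested, "public" spelling - import DAG from the top-level package:
from airflow import DAG

# Rather than reaching into the (arguably internal) models package:
# from airflow.models.dag import DAG
```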





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1048325485


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``async.io``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are decoupled and Airflow does not rely on built-in set of executors.

Review Comment:
   The message is that only as of 2.6 are executors truly extendable. Because they weren't before. There are certain modifications and changes in the interface (as explained and being implemented in https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-51+Removing+Executor+Coupling+from+Core+Airlfow ) that allow you to implement a custom executor without hitting some roadblocks.

   We just failed in delivering it before, even if it was somewhat promised. And this is to warn users that they should only target 2.6+. I want to avoid the situation where people mistakenly think they can do it in earlier versions. Of course there is the "version" selector at the top of the documentation, but many people do not realise that the features they see there in the "stable" version are not available in their version, and they lose time and effort trying to make them work. And then they end up with bad feelings about Airflow, because we failed to warn them clearly enough.
   
   Just one example from last week: 
   
    User thinking that breadth-first mapping is implemented in 2.3
   
   https://apache-airflow.slack.com/archives/CCQ7EGB1P/p1670968167747709?thread_ts=1670937467.554809&cid=CCQ7EGB1P
   
   User is at 2.3.4 in Composer and cannot upgrade to 2.5.0:
   
   > User: any possibility to implement an inheritance on TaskGroup and add the "jinja interpretation mechanism" somehow?? even simplistically, just to get a {{ds_nodash}}  interpretation (edited) 
   > Jarek: No
   > User: RIP 10 days of work lmao
   
   I want to make sure our users don't even start working on 2.3, 2.4, 2.5 with custom executors based on the new documentation that's coming in 2.6, where we explicitly admit Executors are extendable. Before, we hinted at it but never admitted to it. Now that we explicitly state that users "can" do it, we should tell them they should only do it in 2.6 and explain why. Otherwise they will start working on it with earlier versions and, when they fail, they will complain. We can prevent it by being explicit about it, and if they complain we can link to that doc and tell them "you had a chance to read it, it's your fault". Otherwise they can say "you have not explained explicitly that it's only 2.6" and they will be right.
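
For readers following along, a rough sketch of the shape of the executor interface being discussed - the method names follow the 2.x ``BaseExecutor`` as currently documented, while the blocking subprocess execution is purely illustrative and not how a production executor should work:

```python
from __future__ import annotations

import subprocess

from airflow.executors.base_executor import BaseExecutor


class NaiveLocalExecutor(BaseExecutor):
    """Illustrative executor: runs each task command as a blocking subprocess."""

    def execute_async(self, key, command, queue=None, executor_config=None):
        # `command` is the ``airflow tasks run ...`` argument list built by the scheduler.
        self.log.info("Executing %s", command)
        result = subprocess.run(command, check=False)
        if result.returncode == 0:
            self.success(key)
        else:
            self.fail(key)

    def sync(self):
        # Nothing to poll here because execution above is blocking;
        # a real executor would reconcile the state of in-flight tasks here.
        pass

    def end(self):
        self.heartbeat()

    def terminate(self):
        pass
```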





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058648489


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are fully decoupled and Airflow does not rely on built-in set of executors.
+   You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
+   of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
+   executors, and custom executors could not provide full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   Maybe this is a misunderstanding - what I wrote above was not about what provider packages "provide" but what they "consume". This document indeed does not specify anything about what providers provide - it's only about what Airflow "produces" and providers "consume" as users of that.

   The Community Providers already have their own pretty well defined interface (with a defined json schema) for the public API they provide (introduced since the beginning in Airflow 2.0): provider.yaml kept internally and a corresponding public `get_provider_info` entrypoint which explains all packages/classes/functionalities that are exposed; the `airflow providers` CLI also shows that information.

   I think (maybe I am wrong) that everything specified there is public - including the automatically generated list of "core extensions" the providers deliver - and the documentation about that is automatically updated based on this meta-data (and things like naming, completeness etc. are automatically verified at PR time with unit tests following AIP-21).

   And yes. In Providers, whenever something is meant to be internal it is indeed defined with `_` and our sphinx docs building excludes it automatically. Actually, all those are things we cross-check against when we decide on making a breaking release of the providers or adding new features. And I think it's very well controlled by the pre-commits/cross-verification and the generally very limited "code structure" we have there - hooks contain hooks, operators contain operators etc.

   But I think this is very much different for Airflow "core" - there are multiple classes that have no `_`, it is not really specified whether users could or should rely on them, and they are not marked with :private: nor _. And I think keeping that in check could be difficult.

   But maybe that is a good idea? Do you think @dstandish that we should synchronize and take a similar approach for airflow - instead of allowlisting and explicitly specifying the public classes, make an effort to review all those Airflow classes and rename them to _* for example, to follow what has been done in providers?

   I would also be very happy if we did it (and we could even add a pre-commit to add some friction when adding a new public class). But I think this might be pretty invasive for multiple reasons.
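
For readers unfamiliar with that mechanism, the entrypoint is roughly a function returning provider metadata; the package name below is made up, and the keys shown are a representative subset of the provider.yaml schema rather than an exhaustive or authoritative list.

```python
# Illustrative get_provider_info entrypoint for a hypothetical provider package.
# The exact set of keys is defined by the provider.yaml JSON schema in the Airflow repo;
# the ones below are a representative subset only.
def get_provider_info():
    return {
        "package-name": "acme-airflow-provider-example",  # made-up package
        "name": "Acme Example Provider",
        "description": "Hypothetical provider used only to illustrate the entrypoint.",
        "versions": ["1.0.0"],
        "connection-types": [
            {
                "connection-type": "acme",
                "hook-class-name": "acme_provider.hooks.acme.AcmeHook",
            }
        ],
        "extra-links": ["acme_provider.links.AcmeLink"],
    }
```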
   





[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1372044348

   @basph ?




[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046552210


##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,49 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are a number of ways to extend the capabilities of Apache Airflow. Some common methods
+for extending Airflow include:
+
+* Writing :doc:`Custom Operators <howto/custom-operator>`_ : Operators are the building blocks of DAGs in
+  Airflow, and can be used to perform specific tasks within a DAG. By writing custom operators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Writing :doc:`Custom Decorators <howto/create-custom-decorator>`_ : Custom decorators are a way to
+  extend the functionality of the Airflow ``@task`` decorator. By writing custom decorators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Creating custom :doc:`plugins`: Airflow plugins are extensions to the core system that provide additional
+  features and functionality. By creating custom plugins, you can add new capabilities to Airflow,
+  such as adding custom UI components, creating custom macros, creating your own TimeTables,
+  adding Operator Extra Links, Listeners.
+
+* Writing custom :doc:`provider packages <apache-airflow-providers:index>`: Providers are the components
+  of Airflow that are responsible for connecting to and interacting with external services. By writing
+  custom providers, you can add support for new types of external services, such as databases or
+  cloud services, to Airflow.
+
+With all the above methods, you can use the :doc:`public-airflow-api` to write your
+custom Python code to interact with Airflow.
+
+However, you can also extend Airflow without directly integrating your Python code - via the
+:doc:`stable-rest-api-ref` and the :doc:`stable-cli-ref`. These methods allow you to interact with
+Airflow as an external system:
+
+Particularly, the doc:`stable-rest-api-ref` of Apache Airflow provides a number of methods for programmatic

Review Comment:
   Removed altogether.





[GitHub] [airflow] uranusjr commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
uranusjr commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046681128


##########
docs/apache-airflow/index.rst:
##########
@@ -79,6 +79,48 @@ seen running over time:
 Each column represents one DAG run. These are two of the most used views in Airflow, but there are several
 other views which allow you to deep dive into the state of your workflows.
 
+Customizing Airflow
+===================
+
+There are several ways to extend the capabilities of Apache Airflow:
+
+* Writing `Custom Operators <howto/custom-operator>`_ : Operators are the building blocks of DAGs in
+  Airflow, and can be used to perform specific tasks within a DAG. By writing custom operators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Writing `Custom Decorators <howto/create-custom-decorator>`_ : Custom decorators are a way to
+  extend the functionality of the Airflow ``@task`` decorator. By writing custom decorators,
+  you can add new functionality to Airflow, such as the ability to connect to and interact
+  with external services, or to perform complex data transformations.
+
+* Creating custom :doc:`plugins`: Airflow plugins are extensions to the core system that provide additional
+  features and functionality. By creating custom plugins, you can add new capabilities to Airflow,
+  such as adding custom UI components, creating custom macros, creating your own TimeTables,
+  adding Operator Extra Links, Listeners.
+
+* Writing custom `provider packages <apache-airflow-providers:index>`_: Providers are the components
+  of Airflow that are responsible for connecting to and interacting with external services. By writing
+  custom providers, you can add support for new types of external services, such as databases or
+  cloud services, to Airflow.

Review Comment:
   We should probably mention these are not mutually exclusive; a custom decorator can be (usually is) coupled with a custom operator; a provider can also contain plugins, decorators, and operators.
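
Concretely, a single plugin class can register several of these extension points at once; the plugin name, link class and macro below are made up for illustration only.

```python
from airflow.models.baseoperator import BaseOperatorLink
from airflow.plugins_manager import AirflowPlugin


class AcmeConsoleLink(BaseOperatorLink):
    """Hypothetical extra link pointing at an external console."""

    name = "Acme Console"

    def get_link(self, operator, *, ti_key):
        # The URL is made up; a real link would be derived from the task instance.
        return f"https://console.acme.example/runs/{ti_key.run_id}"


def acme_env() -> str:
    """Hypothetical macro, exposed to Jinja templates via the plugin below."""
    return "production"


class AcmePlugin(AirflowPlugin):
    """One plugin class registering several extension points at once."""

    name = "acme_plugin"
    operator_extra_links = [AcmeConsoleLink()]
    macros = [acme_env]
```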





[GitHub] [airflow] dstandish commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
dstandish commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058724161


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are fully decoupled and Airflow does not rely on built-in set of executors.
+   You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
+   of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
+   executors, and custom executors could not provide full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   > For providers it's far easier because there is no "internal" sharing between multiple modules
   
   So I mean, even e.g. for KPO we have a `pod_manager` module. This should not be considered (ideally) user-public. It exists only for KPO and is used only in KPO. It is internal helper code for KPO, just because, otherwise, KPO would be a 1000-line class (maybe exaggerating, maybe not). But it should be internal-public. So there is sharing, and ideally, from a maintainer's perspective, this module isn't user-public. And moreover, again from this perspective, ideally operator methods should not be user-public either - having them public makes it harder to refactor and make changes. The result here would be "subclass operators at your own risk". Would this be acceptable? I feel like it could be OK. So then the "base" operator / sensor etc. could be the public API (at least, whatever's not explicitly private there via meta tag or _), and then any _particular_ operator's methods are not public?
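
A small sketch of the conventions being discussed - a leading underscore for module-internal helpers and the Sphinx ``:meta private:`` marker for methods that must stay importable but should not be treated as user-facing; the operator and its methods are made up:

```python
from airflow.models.baseoperator import BaseOperator


class AcmeTransferOperator(BaseOperator):
    """Hypothetical operator used only to illustrate the naming conventions discussed here."""

    def execute(self, context):
        self.refresh_credentials()
        payload = self._build_payload(context)  # underscore: internal helper, may change freely
        return payload

    def refresh_credentials(self) -> None:
        """Re-read credentials before each run.

        :meta private:
        """
        # Public-looking name, but the ``:meta private:`` marker keeps it out of the
        # rendered API docs and signals that it is not part of the public surface.

    def _build_payload(self, context) -> dict:
        # Conventionally internal because of the leading underscore.
        return {"run_id": context["run_id"]}
```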





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058843112


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are fully decoupled and Airflow does not rely on built-in set of executors.
+   You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
+   of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
+   executors, and custom executors could not provide full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   That's a good point. I think K8S is a pretty special case because it is both used internally by the K8S executor and exposed externally.

   And I think there we **could** treat it in a special way and just there make sure that we explicitly decide which parts of the pod manager or KPO are bound to change. We could indeed decide that the _ (or :private:) is actually something that you SHOULD NOT use because it MIGHT change. And I think in most cases we follow that convention actually - I see that in KPO we have a number of classes that are `_` and yes - it's ok for those methods to change their signatures in whatever way. The user got enough signal (and we should be explicit about it too) that such a method is more volatile than the others (by the sheer _ in front).

   And to be honest - this is already supported by our documentation: https://airflow.apache.org/docs/apache-airflow-providers-cncf-kubernetes/stable/_api/airflow/providers/cncf/kubernetes/index.html does not contain any _* methods because they are excluded by our Sphinx autoapi settings.

   I will add some description about that to cover the general approach (also covering providers).

   Also, I have found one thing. We actually have **almost** what I created here already available (but no one commenting here realized that, so I consider it something that we can easily merge).

   We have this page in our docs: https://airflow.apache.org/docs/apache-airflow/stable/python-api-ref.html - and besides being outdated and having some buggy entries (empty "utils"?), it was an attempt to do something very similar to part of my doc - expose some of the classes that we think are "public". Besides being unmaintained (no triggers for example), it had no explicit mention that this is the "public interface" and all the rest is not, and it was only about the "Python API", while the Airflow public interface is a little more than that (the DB/REST API and the explicit exclusion of the UI and possibly other components are more general than just the Python API). I will remove that page, add a redirect, and merge the two.
   





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058655785


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are fully decoupled and Airflow does not rely on built-in set of executors.
+   You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
+   of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
+   executors, and custom executors could not provide full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   > It's also important to clarify that some methods are intentionally "private". E.g. in the operator, it might have helper methods. But these methods are more implementation detail right? If users subclass, though, they may use and depend on these. So what do we do? Ideally, from maintainer perspective, we want the overall behavior of the operator to be subject to backcompat but not any of its methods. Currently the convention is, i believe, all methods are assumed public unless prefix with underscore / marked meta private. WDYT?
   
   For providers it's far easier because there is no "internal" sharing between multiple modules. Each of them is really either "standalone" or uses "public" classes from other providers. And with common.sql we even introduced stubgen to generate the actual "public interface" so that MyPy can surface violations as errors. We even added friction (the necessity of regenerating the stubs) when someone changes the API by accident.
   
   > One problem with the public / private distinction is... from tool perspective ... even using _my_func across modules is sorta forbidden. But there is a difference between the internal "private" and the user-facing "private". We need to be able to write helper code that can be used across modules (internal public), but not have it be user-facing public.
   
   This is a very accurate assessment.
   
   That's why I proposed to allowlist and be explicit about everything that is public. And from the tooling perspective, one thing is important (this is what we did for common.sql) - to keep the list of all public methods, and whenever something there is *removed* or *updated*, fail the build and make it an explicit action (friction) for whoever modifies it accidentally. We can also add a tool for whoever uses Airflow as a library to check that they are not using something that is not "intentionally" public (we've done that for common.sql - via MyPy stubs).
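
   To make that concrete, a minimal sketch of such a guard could look like the snippet below - the allowlist file and the check script are hypothetical, nothing like this exists in Airflow yet:

   ```python
   # check_public_api.py - hypothetical CI guard, illustrative only.
   # Fails the build when a name listed in the committed allowlist disappears from its
   # module, turning an accidental removal into an explicit, reviewed change.
   import importlib
   import sys

   ALLOWLIST_FILE = "public_api_allowlist.txt"  # one "module:name" entry per line (hypothetical)


   def main() -> int:
       missing = []
       with open(ALLOWLIST_FILE) as allowlist:
           for line in allowlist:
               entry = line.strip()
               if not entry or entry.startswith("#"):
                   continue
               module_name, _, attr = entry.partition(":")
               module = importlib.import_module(module_name)
               if not hasattr(module, attr):
                   missing.append(entry)
       if missing:
           print("Public API entries removed or renamed without updating the allowlist:")
           print("\n".join(f"  - {entry}" for entry in missing))
           return 1
       return 0


   if __name__ == "__main__":
       sys.exit(main())
   ```

   Updating or deleting an entry in the allowlist then becomes the deliberate "friction" step rather than something that can happen by accident.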
   
   But we cannot (yet) do any automation in Airflow, because first we need to agree on and document what is and what is not public. We need a good list of things that should be "public" - so that we can (eventually) add friction to removing or updating those, and possibly eventually add a tool for others to verify that they are not using something they should not.
   
   This doc here is mostly to start the discussion and work out a good set of classes/packages that we should consider "really public". For now as a document; eventually "guarded automatically".
   
   BTW. As I mentioned in many places, we are not able to get it 100% correct. Never. [Hyrum's law](https://www.hyrumslaw.com/) is very clear and IMHO very true about this. 
   
   But we can state our intentions, and make an effort to keep those "explicitly intended" APIs controlled. 
   
   And first ... we need to agree on what our intentions are. If we don't document them and don't agree on what is and what is not our intention for the public API - everyone will have their own understanding of it. And often those understandings will be different (for example, many users still use the DB of Airflow and rely on its structure because they assume they can rely on it). You wrote yourself that you prefer to `minimize backcompat surface` and that "everything that's not an operator / sensor / hook should be private". But this is only your understanding - others might say "everything in utils can be used by DAG authors and provider writers". Both statements are possible - and orthogonally different. Both might be deliberate decisions we should make as a community. But we need to make that decision and be very explicit about it. This is the starting point that is missing now IMHO.
   
   I just want to have a page where we all agree on what we intentionally make "public", and make it clear that everything else is not.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] dstandish commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
dstandish commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058624408


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are fully decoupled and Airflow does not rely on built-in set of executors.
+   You could have implemented (and succeeded) with implementing Executors before Airflow 2.6 and a number
+   of people succeeded in doing so, but there were some hard-coded behaviours that preferred in-built
+   executors, and custom executors could not provide full functionality that built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   This doesn't address providers, or how we can include "private" code (e.g. helper methods) in otherwise "public" modules / classes.
   
   Is it sufficient to include `:meta private:`?  Or must we prefix with underscore?
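
   For reference, the two conventions look roughly like this in an operator (illustrative snippet - whether `:meta private:` alone should be considered sufficient is exactly the open question):

   ```python
   from airflow.models.baseoperator import BaseOperator


   class MyOperator(BaseOperator):
       def execute(self, context):
           return self._build_query()

       def _build_query(self) -> str:
           """Underscore prefix: conventionally private, hidden from docs by default."""
           return "SELECT 1"

       def build_query_legacy(self) -> str:
           """Build the query the old way.

           :meta private:
           """
           # ":meta private:" asks Sphinx autodoc to treat this member as private
           # even though the name carries no underscore prefix.
           return "SELECT 1"
   ```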



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1047364387


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,110 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_ - all of which are considered Public Interfaces.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.models.dag.DAG`, which provides a way to define and manage DAGs in Airflow.

Review Comment:
   For DAG yes. For others we would have to move them. I think - if you think it makes sense, you can add them to "airflow" - no problem for me, since we have the STATICA_HACK to avoid circular references.

   For now I just want to document what we have rather than change anything.
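
   For illustration, that kind of top-level re-export can be done without creating import cycles by keeping the import lazy - a rough sketch below (names and layout are illustrative, not the actual `airflow/__init__.py`):

   ```python
   # mypackage/__init__.py - illustrative sketch of exposing a class at the package
   # top level without importing it (and its dependencies) at package import time.
   from typing import TYPE_CHECKING

   if TYPE_CHECKING:
       # Visible to type checkers and IDEs only; no runtime import, so no cycle.
       from mypackage.models.dag import DAG  # noqa: F401


   def __getattr__(name):
       # PEP 562: resolve the attribute lazily, the first time it is accessed.
       if name == "DAG":
           from mypackage.models.dag import DAG

           return DAG
       raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
   ```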



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1050166158


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``async.io``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are decoupled and Airflow does not rely on built-in set of executors.

Review Comment:
   BTW. I can also link to AIP-51 for the details of what "could not be done" before. I think it describes it pretty well (and completely). But we rarely link to AIPs in our docs (and it was criticised before when I added it here), so I think just leaving a generic note is better.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] BasPH commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
BasPH commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1051376618


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.

Review Comment:
   It's a bit vague to me why we have a near duplicate of this list a few lines below. Let's consolidate it into one single list for clarity and ease of maintenance.



##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.

Review Comment:
   ```suggestion
   and for automating certain aspects of the Airflow workflow.
   ```



##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:

Review Comment:
   IMO "some examples" is vague language, which can lead to discussion. Better to be explicit:
   
   ```suggestion
   aspects of the system. The following Python objects are part of the Public Interface:
   ```



##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:

Review Comment:
   This reads as an internal note and not something aimed at a user. Let's reword for clarity:
   
   ```suggestion
   Everything not mentioned in this document should be considered as non-Public Interface. This includes:
   ```



##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.

Review Comment:
   ```suggestion
     detail and you should not assume the structure is going to be maintained in a
     backwards-compatible way.
   ```



##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as

Review Comment:
   ```suggestion
   with and access certain features of the Apache Airflow system. This includes operations such as
   ```



##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.

Review Comment:
   Would remove this sentence, I don't see any added value.



##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.

Review Comment:
   ```suggestion
   * `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
     internal implementation detail and you should not assume they will be maintained
     in a backwards-compatible way.
   ```



##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered part of the non-Public Interface.
+
+Users, however, might assume that they can rely on other parts of Airflow, so we want to be
+explicit that the following are not part of the Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in a
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is continuously adapting and evolving, and you should not assume that the UI
+  will remain the same in a backwards-compatible way. Views and screens might be updated or removed
+  in major releases without this being treated as a breaking change.

Review Comment:
   ```suggestion
   * `Web UI <ui>`_ is continuously evolving and there are no backwards compatibility guarantees on HTML elements.
   ```



##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Providers)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allow developers to write their own custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allow developers to write their
+  own custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allow developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered part of the non-Public Interface.
+
+Users, however, might assume that they can rely on other parts of Airflow, so we want to be
+explicit that the following are not part of the Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in a
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is continuously adapting and evolving, and you should not assume that the UI
+  will remain the same in a backwards-compatible way. Views and screens might be updated or removed
+  in major releases without this being treated as a breaking change.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below) is considered an
+  internal implementation detail and you should not assume it will be maintained
+  in a backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The Executor interface was available in earlier versions of Airflow, but
+   only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on the built-in set of
+   executors. You could implement custom Executors before Airflow 2.6, and a number of people succeeded in
+   doing so, but there were some hard-coded behaviours that favoured the built-in executors, and custom
+   executors could not provide the full functionality that the built-in executors had.

Review Comment:
   What is the key message here?





[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1356180966

   Rebased and applied the latest comments.




[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1351030386

   Hey @BasPH - I explained the reasoning why I added the somewhat longer explanation and context. 
   
   I hope this is good for you as well.
   
   One of the cases I want to address is being able to point people to this page when they rely on things that are not part of the "public interface" - whenever they complain or raise questions about that, I want to simply send them there: "Please read the docs, it's all explained there and you likely missed it". 
   
   That saves enormous mental effort and time in disputes (which happen often) of the kind "but you did not explain it, I had the right to think it works this way".
   
   Also, just to stress that my "thinking" is not theoretical. It is very much based on the practice of interacting with our users A LOT. I would encourage everyone to engage more in the discussions we have in our community - or at least review and follow them. So far not many of us maintainers are engaged, and I think I understand quite well - likely better than others - what our users are struggling with and what is helpful for people. 
   
   I am not sure if you realize it, but over the last 6 months or so the state of our GitHub Discussions looks like this (I have - very consistently - 11/15 "most helpful" answers in the last 30 days, while the next person never had more than 1-2):
   
   <img width="378" alt="Screenshot 2022-12-14 at 12 12 02" src="https://user-images.githubusercontent.com/595491/207580648-449f0380-d37b-46d6-aeac-3996da820d5d.png">
   
   And the rise in the number of discussions looks like this - so this is an extremely important medium where our users look for help:
   
   <img width="469" alt="Screenshot 2022-12-14 at 12 16 02" src="https://user-images.githubusercontent.com/595491/207581299-04ced62f-b4ae-47ed-aa9b-c11d8ee727ea.png">
   




[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1367257038

   Following the comments of @dstandish, I just updated and restructured the proposal (again) - it will have some docs build failures, but I wanted to push it now in case someone wants to look.
   
   I essentially merged all the different pages that were talking about the same thing with a different focus into one "Public Airflow Interface" page, where I described the intention of the page and all the ways the public interfaces of Airflow can be used to extend Airflow functionality. It will be literally "all the things Airflow considers public", with an explanation of why we created this single page, plus an index to all the detailed pages and references where details can be found. 
   
   I also added the general statement discussed with Daniel - that whatever starts with "_" or is marked as meta:private in those classes is not considered "public". And I left the general exclusion about the Database, the UI and "all other Python classes and methods that are not mentioned in this document".
   
   




[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1066812851


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,374 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+...........................
+
+The Public Interface of Apache Airflow is a set of interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+Using Airflow Public Interfaces
+===============================
+
+Using the Airflow Public Interfaces is necessary when you want to interact with Airflow programmatically:
+
+* When you are writing new (or extending existing) custom Python classes (Operators, Hooks) - the basic
+  building blocks of DAGs. This can be done by DAG Authors to add missing functionality in their DAGs or
+  by those who write reusable custom operators for other DAG authors.
+* When writing new :doc:`Plugins <authoring-and-scheduling/plugins>` that extend Airflow's functionality beyond
+  DAG building blocks. Secrets, Timetables, Triggers, Listeners are all examples of such functionality. This
+  is usually done by users who manage Airflow instances.
+* Bundling custom Operators, Hooks, Plugins and releasing them together via
+  :doc:`provider packages <apache-airflow-providers:index>` - this is usually done by those who intend to
+  provide a reusable set of functionality for external services or applications that Airflow integrates with.
+
+All the ways above involve extending or using Airflow Python classes and functions. The classes
+and functions mentioned below can be relied on to keep backwards-compatible signatures and behaviours within
+a MAJOR version of Airflow. On the other hand, classes and methods starting with ``_`` (also known
+as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
+Airflow Interface and might change at any time.
+
+You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+OpenAPI specification). For specific needs you can also use the
+`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_, though its behaviour might change
+in details (such as output format and available flags), so if you want to rely on it in a programmatic
+way, the Stable REST API is recommended.
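+
+As an illustration, a minimal sketch of listing DAGs through the Stable REST API from Python could look
+like this (it assumes basic authentication is enabled and the webserver runs at ``http://localhost:8080``;
+the credentials are placeholders):
+
+.. code-block:: python
+
+    import requests
+
+    # List the DAG ids known to this Airflow instance via the stable REST API.
+    response = requests.get(
+        "http://localhost:8080/api/v1/dags",
+        auth=("admin", "admin"),  # placeholder credentials
+    )
+    response.raise_for_status()
+    for dag in response.json()["dags"]:
+        print(dag["dag_id"])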
+
+
+Using Public Interface by DAG Authors
+=====================================
+
+DAGs
+----
+
+The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
+instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file.
+
+Airflow has a set of Example DAGs that you can use to learn how to write DAGs:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/example_dags/index
+
+You can read more about DAGs in :doc:`DAGs <core-concepts/dags>`.
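+
+As a minimal sketch (the DAG id, schedule and task below are only illustrative), a DAG file could look
+like this:
+
+.. code-block:: python
+
+    import datetime
+
+    from airflow.models.dag import DAG
+    from airflow.operators.empty import EmptyOperator
+
+    with DAG(
+        dag_id="example_public_interface",
+        start_date=datetime.datetime(2023, 1, 1),
+        schedule="@daily",
+    ) as dag:
+        EmptyOperator(task_id="do_nothing")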
+
+.. _pythonapi:operators:
+
+Operators
+---------
+
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when instantiated.
+
+There are 3 main types of operators:
+
+- Operators that perform an **action**, or tell another system to
+  perform an action
+- **Transfer** operators move data from one system to another
+- **Sensors** are a certain type of operator that will keep running until a
+  certain criterion is met. Examples include a specific file landing in HDFS or
+  S3, a partition appearing in Hive, or a specific time of the day. Sensors
+  are derived from :class:`~airflow.sensors.base.BaseSensorOperator` and run a poke
+  method at a specified :attr:`~airflow.sensors.base.BaseSensorOperator.poke_interval` until it
+  returns ``True``.
+
+All operators are derived from :class:`~airflow.models.baseoperator.BaseOperator` and acquire much
+functionality through inheritance. Since this is the core of the engine,
+it's worth taking the time to understand the parameters of :class:`~airflow.models.baseoperator.BaseOperator`
+to understand the primitive features that can be leveraged in your DAGs.
+
+Airflow has a set of Operators that are considered public. You are also free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/operators/index
+
+  _api/airflow/sensors/index
+
+
+You can read more about operators in :doc:`core-concepts/operators` and :doc:`core-concepts/sensors`.
+You can also learn how to write a custom operator in :doc:`howto/custom-operator`.
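+
+As a sketch, a minimal custom operator built on :class:`~airflow.models.baseoperator.BaseOperator`
+could look like this (``HelloOperator`` and its ``name`` argument are made up for the example):
+
+.. code-block:: python
+
+    from airflow.models.baseoperator import BaseOperator
+
+
+    class HelloOperator(BaseOperator):
+        """Toy operator that logs and returns a greeting."""
+
+        def __init__(self, name: str, **kwargs) -> None:
+            super().__init__(**kwargs)
+            self.name = name
+
+        def execute(self, context):
+            message = f"Hello {self.name}!"
+            self.log.info(message)
+            return message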
+
+.. _pythonapi:hooks:
+
+Hooks
+-----
+
+Hooks are interfaces to external platforms and databases, implementing a common
+interface when possible and acting as building blocks for operators. All hooks
+are derived from :class:`~airflow.hooks.base.BaseHook`.
+
+Airflow has a set of Hooks that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/hooks/index
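+
+A custom hook typically wraps access to an external service using an Airflow connection. A minimal
+sketch (the hook name and the connection handling are illustrative) could look like this:
+
+.. code-block:: python
+
+    from airflow.hooks.base import BaseHook
+
+
+    class MyServiceHook(BaseHook):
+        """Toy hook that builds connection details from an Airflow connection."""
+
+        def __init__(self, my_conn_id: str = "my_service_default") -> None:
+            super().__init__()
+            self.my_conn_id = my_conn_id
+
+        def get_conn(self):
+            conn = self.get_connection(self.my_conn_id)
+            # A real hook would return a client built from conn.host, conn.login, etc.
+            return {"host": conn.host, "login": conn.login}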
+
+Public Airflow utilities
+------------------------
+
+When writing or extending Hooks and Operators, DAG authors and developers of those Operators and Hooks can
+use the following classes:
+
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+
+You can read more about the public Airflow utilities in :doc:`howto/connection`,
+:doc:`core-concepts/variables` and :doc:`core-concepts/xcoms`.
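+
+Inside a task, these utilities can be used roughly like this (a sketch; the connection id, variable key
+and XCom key are placeholders):
+
+.. code-block:: python
+
+    from airflow.models.connection import Connection
+    from airflow.models.variable import Variable
+
+
+    def my_python_callable(ti):
+        # Look up a connection and a variable defined in Airflow (or a secrets backend).
+        conn = Connection.get_connection_from_secrets("my_service_default")
+        api_key = Variable.get("my_api_key", default_var=None)
+        # XComs are usually accessed through the task instance from the task context.
+        ti.xcom_push(key="endpoint", value=conn.host)
+        return api_key is not None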
+
+Public Exceptions
+-----------------
+
+When writing custom Operators and Hooks, you can handle and raise the public Exceptions that Airflow
+exposes:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/exceptions/index
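+
+For example, a task callable can use these exceptions to skip or fail a task (a sketch):
+
+.. code-block:: python
+
+    from airflow.exceptions import AirflowException, AirflowSkipException
+
+
+    def check_payload(payload: dict) -> None:
+        if not payload:
+            raise AirflowSkipException("Nothing to do, skipping this task.")
+        if "id" not in payload:
+            raise AirflowException("Malformed payload, failing the task.")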
+
+
+Using Public Interface to extend Airflow capabilities
+=====================================================
+
+Airflow uses the Plugin mechanism as a way to extend the Airflow platform's capabilities. Plugins let you
+extend the Airflow UI, and they are also the way to expose the customizations below (Triggers, Timetables,
+Listeners). Providers can also implement plugin endpoints to customize the Airflow UI and provide those customizations.
+
+You can read more about plugins in :doc:`authoring-and-scheduling/plugins`. You can read how to extend
+Airflow UI in :doc:`howto/custom-view-plugin`. Note that there are some simple customizations of the UI
+that do not require plugins - you can read more about them in :doc:`howto/customize-ui`.
+
+Here are the ways in which Plugins can be used to extend Airflow:
+
+Triggers
+--------
+
+Airflow can be configured to use Triggers in order to implement ``asyncio``-compatible Deferrable Operators.
+All Triggers derive from :class:`~airflow.triggers.base.BaseTrigger`.
+
+Airflow has a set of Triggers that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/triggers/index
+
+You can read more about Triggers in :doc:`authoring-and-scheduling/deferring`.
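+
+A minimal trigger could be sketched like this (``WaitFiveSecondsTrigger`` and the module path used in
+``serialize`` are purely illustrative):
+
+.. code-block:: python
+
+    import asyncio
+
+    from airflow.triggers.base import BaseTrigger, TriggerEvent
+
+
+    class WaitFiveSecondsTrigger(BaseTrigger):
+        """Toy trigger that fires a single event after five seconds."""
+
+        def serialize(self):
+            # Classpath and kwargs used to recreate the trigger in the triggerer process.
+            return ("my_package.triggers.WaitFiveSecondsTrigger", {})
+
+        async def run(self):
+            await asyncio.sleep(5)
+            yield TriggerEvent({"status": "done"})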
+
+Timetables
+----------
+
+Custom timetable implementations provide Airflow's scheduler additional logic to
+schedule DAG runs in ways not possible with built-in schedule expressions.
+All Timetables derive from :class:`~airflow.timetables.base.Timetable`.
+
+Airflow has a set of Timetables that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :maxdepth: 1
+
+  _api/airflow/timetables/index
+
+You can read more about Timetables in :doc:`howto/timetable`.
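+
+As a sketch, a timetable that only supports manually triggered runs could look like this
+(illustrative only; real timetables usually compute the next schedule in ``next_dagrun_info``):
+
+.. code-block:: python
+
+    from __future__ import annotations
+
+    from pendulum import DateTime
+
+    from airflow.timetables.base import DagRunInfo, DataInterval, TimeRestriction, Timetable
+
+
+    class ManualOnlyTimetable(Timetable):
+        """Toy timetable that never schedules automated runs."""
+
+        def infer_manual_data_interval(self, *, run_after: DateTime) -> DataInterval:
+            return DataInterval.exact(run_after)
+
+        def next_dagrun_info(
+            self,
+            *,
+            last_automated_data_interval: DataInterval | None,
+            restriction: TimeRestriction,
+        ) -> DagRunInfo | None:
+            return None  # no automated runs in this sketch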
+
+Listeners
+---------
+
+Listeners are the way the Airflow platform as a whole can be extended to respond to DAG/Task lifecycle events.
+
+This is implemented via the :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks
+that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+You can read more about Listeners in :doc:`administration-and-deployment/listeners`.
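+
+For example, a listener module could be sketched like this (the hook shown is one of the task instance
+lifecycle hooks; the body is illustrative). Such a module is then registered through the ``listeners``
+attribute of a plugin:
+
+.. code-block:: python
+
+    from airflow.listeners import hookimpl
+
+
+    @hookimpl
+    def on_task_instance_success(previous_state, task_instance, session):
+        # React to a task instance finishing successfully, e.g. notify an external system.
+        print(f"Task {task_instance.task_id} succeeded")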
+
+Extra Links
+-----------
+
+Extra links are dynamic links that can be added to Airflow independently of custom Operators. Normally
+they are defined by Operators, but plugins allow you to override the links at a global level.
+
+You can read more about the Extra Links in :doc:`/howto/define-extra-link`.
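+
+A global extra link could be sketched like this (the names and URL are illustrative; the link is
+registered through a plugin so it shows up on every operator):
+
+.. code-block:: python
+
+    from airflow.models.baseoperator import BaseOperatorLink
+    from airflow.plugins_manager import AirflowPlugin
+
+
+    class ExampleDocsLink(BaseOperatorLink):
+        """Toy link pointing every task to a documentation page."""
+
+        name = "Example Docs"
+
+        def get_link(self, operator, *, ti_key):
+            return "https://example.com/docs"
+
+
+    class ExampleLinkPlugin(AirflowPlugin):
+        name = "example_link_plugin"
+        global_operator_extra_links = [ExampleDocsLink()]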
+
+Using Public Interface to integrate with external services and applications
+===========================================================================
+
+In order to integrate Airflow with external services and applications you can develop Hooks and Operators,
+similarly to DAG Authors, but you can also extend the core functionality of Airflow by integrating
+it with those external services. This can be done by just adding the extension classes to the PYTHONPATH
+on your system and configuring Airflow to use them (there is no need to use the plugin mechanism). However,
+you can also package and release the Hooks, Operators and core extensions in the form of
+:doc:`provider packages <apache-airflow-providers:index>`. You can read more about core extensions
+delivered by providers via :doc:`provider packages <apache-airflow-providers:core-extensions/index>`.
+
+Executors
+---------
+
+Executors are the mechanism by which task instances get run. All executors are
+derived from :class:`~airflow.executors.base_executor.BaseExecutor`. There are a number of different
+executor implementations built into Airflow, each with its own unique characteristics and capabilities.
+
+Airflow has a set of Executors that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/executors/index
+
+You can read more about executors in :doc:`core-concepts/executor/index`.
+
+.. versionadded:: 2.6
+
+  The Executor interface was available in earlier versions of Airflow, but only as of version 2.6 are executors
+  fully decoupled, so Airflow no longer relies on the built-in set of executors.
+  You could implement custom Executors before Airflow 2.6, and a number of people succeeded in doing so,
+  but there were some hard-coded behaviours that favoured the built-in executors, and custom executors
+  could not provide the full functionality that the built-in executors had.
+
+Secrets Backends
+----------------
+
+Airflow can be configured to rely on secrets backends to retrieve
+:class:`~airflow.models.connection.Connection` and :class:`~airflow.models.variable.Variable`.
+All secrets backends derive from :class:`~airflow.secrets.BaseSecretsBackend`.
+
+Airflow has a set of Secrets Backends that are considered public. You are free to extend their functionality
+by subclassing them:
+
+.. toctree::
+  :includehidden:
+  :glob:
+  :maxdepth: 1
+
+  _api/airflow/secrets/index
+
+You can read more about Secret Backends in :doc:`administration-and-deployment/security/secrets/secrets-backend/index`.
+You can also find all the available Secrets Backends implemented in community providers
+in :doc:`apache-airflow-providers:core-extensions/secrets-backends`.
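+
+A custom secrets backend can be sketched like this (reading values from prefixed environment variables
+is only an example scheme):
+
+.. code-block:: python
+
+    import os
+
+    from airflow.secrets import BaseSecretsBackend
+
+
+    class EnvPrefixSecretsBackend(BaseSecretsBackend):
+        """Toy backend that reads connections and variables from prefixed env vars."""
+
+        def get_conn_value(self, conn_id: str):
+            # Expected to return a connection URI or JSON representation, or None if not found.
+            return os.environ.get(f"MY_CONN_{conn_id.upper()}")
+
+        def get_variable(self, key: str):
+            return os.environ.get(f"MY_VAR_{key.upper()}")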
+
+Authentication Backends
+-----------------------
+
+Authentication backends can extend the way the Airflow authentication mechanism works. You can find out more
+about authentication in :doc:`apache-airflow-providers:core-extensions/auth-backends`, which also shows the
+available Authentication backends implemented in the community providers.
+
+Connections
+-----------
+
+When creating Hooks, you can add custom Connections. You can read more
+about connections in :doc:`apache-airflow-providers:core-extensions/connections`, which lists the available
+Connections implemented in the community providers.
+
+Extra Links
+-----------
+
+When creating Operators, you can add custom Extra Links that are displayed when the tasks run.
+You can find out more about extra links in :doc:`apache-airflow-providers:core-extensions/extra-links`,
+which also shows the available Extra Links implemented in the community providers.

Review Comment:
   Right.





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1051383399


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,118 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_, and `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:

Review Comment:
   I think this can be answered by the previous explanation I gave above.
   
   So let me repeat again - we list those explicitly, with context and an explanation of why we are listing them, because we've learned from the past that our users tend to think they can treat those components as "public interface". Listing them here explicitly is the way to avoid any such discussions.
   
   I rephrased and shortened it to better reflect what I mean.





[GitHub] [airflow] potiuk commented on pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1356830770

   Rebased to apply static check fixes with isort.




[GitHub] [airflow] ashb commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
ashb commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1049526177


##########
docs/apache-airflow/public-airflow-interface.rst:
##########
@@ -0,0 +1,115 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the Public Interface is an important part of the Airflow ecosystem and can be a powerful
+tool for users and developers who want to extend the functionality of the system.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Provider)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+Apache Airflow has a number of different Public Interfaces that allow developers to interact with various
+aspects of the system. Some examples of the types of Public Interfaces exposed as Python objects
+that are available in Apache Airflow include:
+
+* `DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* `Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* `Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* `Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write `TaskFlow <tutorial/taskflow>`_ DAGs.
+* `Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* `Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* `XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* `Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* `Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* `Listeners <listeners>`_, which allow developers to react to DAG/Task lifecycle events.
+* `Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom `TimeTables <concepts/timetable>`_, `Extra Links <howto/define_extra_link>`_,
+  `Triggers <concept/deferring>`_. `Listeners <listeners>`_.
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.
+
+The users however, might have some assumptions about different parts of Airflow, thinking that
+they can rely on them, so here we try to clarify and want to be explicit that they are not part of the
+Public Interface:
+
+* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+  detail and you should not assume the structure is going to be maintained in
+  backwards-compatible way.
+
+* `Web UI <ui>`_ is considered to be continuously adapting and evolving for Apache Airflow and
+  you should not assume that the UI will remain as it is in backwards-compatible way. Views and screens
+  might be updated and removed in major releases without impacting backwards-compatibility.
+
+* `Python API <python-api-ref>`_ (except the explicitly mentioned classes below), are considered an
+  internal implementation details and you should not assume they will be maintained
+  in backwards-compatible way.
+
+Which classes and packages are part of the Public Interface of Apache Airflow?
+==============================================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom` which are used to access to inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend` which are used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin` which are used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which are used to implement custom Custom Deferrable Operators (based on ``async.io``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class which provides hooks that can be implemented to respond to DAG/Task lifecycle events
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built-in Airflow, each with its own unique
+   characteristics and capabilities. Executor interface was available in earlier version of Airflow but
+   only as of version 2.6 executors are decoupled and Airflow does not rely on built-in set of executors.

Review Comment:
   This is simply not true though -- I know of multiple people (Astro _and_ others) who have successfully written custom executors against 2.1 and above.





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046534132


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, and
+are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-cli-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.

Review Comment:
   I do not want to change the current naming in this PR. This is something we can change later when we decide on different naming. This should be a separate PR.
   
   For now we have this:
   
   <img width="239" alt="Screenshot 2022-12-13 at 01 01 36" src="https://user-images.githubusercontent.com/595491/207186876-9d7a8dac-06cf-40c6-b69a-98feac48823c.png">
   
   And this:
   
   <img width="385" alt="Screenshot 2022-12-13 at 01 03 21" src="https://user-images.githubusercontent.com/595491/207187225-625f21c8-b444-47fe-9689-cb777c726529.png">
   
   And yes - I agree we should likely change the naming - but separately.
   





[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046554635


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allows the developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allows the developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allows the developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.
+These implementation details are not intended to be used by external developers, a
+nd are subject to change without notice.
+
+Additionally, certain features and functionality of Apache Airflow may not be exposed through the public API,
+either because they are not considered to be stable or because they are not intended for external use.
+In these cases, developers may not be able to access these features through the public API,
+and should instead use other available methods for interacting with the system.
+
+Similarly, :doc:`Database structure <database-erd-ref>`_ is considered to be an internal implementation
+detail of Apache Airflow and you should not assume the structure is going to be maintained in
+backwards-compatible way.
+
+Is Airflow DB part of the Public API of Airflow?
+================================================
+
+The :doc:`Airflow DB <database-erd-ref>`_ is not considered to be part of the public API of Apache Airflow.
+The Airflow DB is the internal database used by Airflow to store information about DAGs, tasks, and
+other aspects of the system's state. This database is not intended to be accessed directly by
+external developers, and the specific details of its schema and structure are not considered
+to be part of the public API.
+
+Instead, developers who want to interact with the Airflow DB should use the :doc:`stable-rest-api-ref` and
+the :doc:`stable-cli-ref`, both providing a number of different interfaces and methods for accessing and
+manipulating data in the database. These APIs are designed to be more stable and user-friendly than
+accessing the Airflow DB directly, and provide a more consistent and predictable way of interacting
+with the system.
+
+Is Stable REST API part of the Public API of Airflow?
+=====================================================
+
+The :doc:`Stable REST API <stable-rest-api-ref>`_ is considered to be part of the public API of Apache Airflow.
+The REST API is a set of programming interfaces that allow developers to access certain features of
+Airflow over the HTTP protocol, using a standard set of RESTful (representational state transfer) conventions.
+
+The stable REST API is considered to be part of the public API because it is designed to be used by
+external developers to interact with Airflow in a consistent and predictable way. The REST API provides
+access to many of the same features and functionality that are available through the other public APIs
+of Airflow, such as the ability to create and manage DAGs and tasks, and to monitor their status.
+
+Overall, the stable REST API is an important part of the public API of Apache Airflow, and can
+be a useful tool for developers who want to integrate Airflow with other systems or
+build custom tools and applications on top of Airflow.

Review Comment:
   Removed it altogether.
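
For illustration of the section quoted above, which recommends the stable REST API and CLI over direct metadata DB access: a minimal sketch of reading and triggering DAGs through the REST API. The host, the basic-auth credentials and the `example_dag` id are assumptions for a local test deployment, not part of this PR.

```python
# Minimal sketch: use the stable REST API instead of querying the
# metadata database directly. Assumes a local webserver with the
# basic_auth backend enabled and an "admin" user - adjust as needed.
import requests

BASE_URL = "http://localhost:8080/api/v1"
AUTH = ("admin", "admin")  # assumption: basic auth enabled locally

# List DAGs - the same information lives in the DB, but the API is the
# supported, stable way to read it.
dags = requests.get(f"{BASE_URL}/dags", auth=AUTH).json()
for dag in dags.get("dags", []):
    print(dag["dag_id"], "paused:", dag["is_paused"])

# Trigger a run of a hypothetical DAG.
response = requests.post(
    f"{BASE_URL}/dags/example_dag/dagRuns",
    auth=AUTH,
    json={"conf": {}},
)
print(response.status_code, response.json())
```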



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046561319


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allow developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allow developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allow developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.

Review Comment:
   I think we want to be MUCH stronger than that. Anything that is NOT mentioned in these docs should be considered Private. This is what I am adding in the new version of the docs.
   
   Currently we have almost no `_` classes in our codebase and we have not even agreed on a convention that we should use them. I'd say maybe at some point in the future we should have it, but even that I am not sure if we will do it, because this is pretty uncommon in the Python world and it simply looks ugly. So while maybe it is an easy win, I am not sure if we will ever go this direction and adding it at this stage assumes we will.
   
   Simply speaking - let's not add a new convention in this doc that we neither have nor agreed to yet. My goal is to document current state at this stage and any future changes to conventions and agreements should be left for, well, future ....
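
For readers unfamiliar with the convention being discussed, a minimal sketch of what leading-underscore "private" naming looks like (purely illustrative; nothing in the Airflow codebase currently follows or enforces this):

```python
# Illustrative only: the leading-underscore convention discussed above.
# A "_" prefix signals "not part of the public interface"; Python does
# not enforce it, and Airflow does not currently use it consistently.

class PublicThing:
    """Intended for users - would be listed in the public interface."""

    def do_work(self) -> str:
        return _format_result(self._internal_state())

    def _internal_state(self) -> str:
        # Implementation detail, free to change between releases.
        return "state"


def _format_result(value: str) -> str:
    # Module-level helper that external code should not import.
    return f"result: {value}"


print(PublicThing().do_work())  # usage: only the public name is touched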



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046554517


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allow developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allow developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allow developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.

Review Comment:
   This has been removed so that it is not repeated.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] o-nikolas commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
o-nikolas commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046645649


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.
+
+What kind of APIs are Public in Apache Airflow?
+===============================================
+
+Apache Airflow has a number of different public APIs that allow developers to interact with various
+aspects of the system. Some examples of the types of public APIs exposed as Python objects
+that are available in Apache Airflow include:
+
+* :doc:`DAG <concepts/dags>`_ (Directed Acyclic Graph) APIs, which allow developers to create, manage,
+  and access DAGs in Airflow.
+* :doc:`Task <concepts/tasks>`_ APIs, which provide access to information about individual tasks within
+  a DAG, such as their dependencies and execution status.
+* :doc:`Operator <concepts/operators>`_ APIs, which allow developers to write their custom Operators.
+* :doc:`Decorators <howto/create-custom-decorator>`_ APIs, which allow developers to write their
+  custom decorators to make it easier to write :doc:`TaskFlow <tutorial/taskflow>`_ DAGs.
+* :doc:`Secret Managers <security/secrets>`_ APIs, which allow developers to write their custom
+  Secret Managers to safely access credentials and other secret configuration of their workflows.
+* :doc:`Connection management <concepts/connections>`_ APIs, which allow developers to manage
+  connections to external systems.
+* :doc:`XCom <concepts/xcoms>`_, which allow developers to manage cross-task communication within Airflow.
+* :doc:`Variables <concepts/variables>`_, which allow developers to manage variables within Airflow.
+* :doc:`Executors <executor/index>`_, which allow developers to manage the execution of tasks within Airflow.
+* :doc:`Plugins <plugins>`_, which allow developers to extend internal Airflow capabilities - add new UI
+  pages, custom :doc:`TimeTables <concepts/timetable>`_, :doc:`Extra Links <howto/define_extra_link>`_,
+  :doc:`Listeners <listeners>`_ - all of which are considered public APIs.
+
+Also Airflow has a stable REST API that allows users to interact with Airflow via HTTP requests and a
+CLI that allows users to interact with Airflow via command line commands.
+
+Overall, the public APIs in Apache Airflow are designed to provide developers with a wide
+range of tools and capabilities for interacting with the system and extending its functionality.
+
+What is not part of the Public API of Apache Airflow?
+=====================================================
+
+A public API of Apache Airflow is a set of programming interfaces that are designed to be used
+by developers to access certain features and functionality of the system. As such, not everything in
+Apache Airflow is considered to be part of the public API.
+
+For example, the internal implementation details of Apache Airflow, such as the specific algorithms
+and data structures used to manage DAGs and tasks, are not considered to be part of the public API.

Review Comment:
   I just don't see the downside of including it. There will never be a time where we _want_ private methods/classes to be in the public API, so we can leverage it straight away as an actually useful mechanism without any harm. 
   
   Most other things documented here are very debatable/vague (other than the DB section). I'm just looking for tools that can actually be applied in practice and I think private code is an easy win.
   
   But, as always, I'm happy to disagree and approve if I'm out voted 👍
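
One way such a convention could be "applied in practice", as mentioned above, is a simple static check. This is a hypothetical sketch, not existing Airflow tooling: it flags imports of leading-underscore names from other modules.

```python
# Hypothetical lint-style check (not part of Airflow): warn whenever a
# file imports a leading-underscore name from another module, i.e. code
# reaching into something that is private by convention.
import ast
import sys


def find_private_imports(path: str) -> list[str]:
    with open(path, encoding="utf-8") as handle:
        tree = ast.parse(handle.read(), filename=path)
    problems = []
    for node in ast.walk(tree):
        if isinstance(node, ast.ImportFrom):
            for alias in node.names:
                if alias.name.startswith("_"):
                    problems.append(
                        f"{path}:{node.lineno} imports private name '{alias.name}'"
                    )
    return problems


if __name__ == "__main__":
    for filename in sys.argv[1:]:
        for problem in find_private_imports(filename):
            print(problem)
```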



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #28300:
URL: https://github.com/apache/airflow/pull/28300#issuecomment-1346308535

   About 80% of it was written by the GPT Chat :D


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public API description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1046546870


##########
docs/apache-airflow/public-airflow-api.rst:
##########
@@ -0,0 +1,141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+"Public API" of Airflow
+=======================
+
+The Public API of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This can include operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new Executors, Plugins, Operators and Providers. The
+public API of Apache Airflow can be useful for building custom tools and integrations with other systems,
+as well as for automating certain aspects of the Airflow workflow.
+
+In general, the public API is an important part of the Airflow ecosystem and can be a powerful tool for users
+and developers who want to extend the functionality of the system.

Review Comment:
   Removed.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] dstandish commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
dstandish commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058643196


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Providers)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow,
+   but only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on a built-in
+   set of executors. Custom executors could be (and were) implemented before Airflow 2.6, but some
+   hard-coded behaviours favoured the built-in executors, so custom executors could not provide the full
+   functionality that the built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   One problem with the public / private distinction is... from tool perspective ... even using _my_func across modules is sorta forbidden.   But there is a difference between the internal "private" and the user-facing "private".  We need to be able to write helper code that can be used across modules (internal public), but not have it be user-facing public. 
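
A small sketch of the distinction drawn here, with made-up module and function names: a helper that must stay importable across Airflow's own modules (so it cannot carry a leading underscore) while still not being user-facing public.

```python
# Hypothetical example - module and names are invented for illustration.
# File: airflow/some_package/helpers.py

def coerce_params(params: dict) -> dict:
    """'Internal public' helper shared by several Airflow modules.

    No leading underscore, because other *internal* modules import it,
    but it is not listed in the public-interface doc, so DAG authors
    should not rely on it staying stable between releases.
    """
    return {key: str(value) for key, value in params.items()}


# File: airflow/some_package/other_module.py - internal use is fine:
#
#     from airflow.some_package.helpers import coerce_params
#
# A user's DAG importing the same helper works today, but it sits
# outside the public interface and may change without notice.
```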



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] dstandish commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
dstandish commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058847397


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Providers)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow,
+   but only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on a built-in
+   set of executors. Custom executors could be (and were) implemented before Airflow 2.6, but some
+   hard-coded behaviours favoured the built-in executors, so custom executors could not provide the full
+   functionality that the built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   Yeah, just not only kpo, I think there are other providers with e.g. utils.py



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] dstandish commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
dstandish commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058643196


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Providers)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow,
+   but only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on a built-in
+   set of executors. Custom executors could be (and were) implemented before Airflow 2.6, but some
+   hard-coded behaviours favoured the built-in executors, so custom executors could not provide the full
+   functionality that the built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   One problem with public / private is... from tool perspective ... even using _my_func across modules is forbidden.   But there is a difference between the internal "private" and the user-facing "private".  We need to be able to write helper code that can be used across modules (internal public), but not have it be user-facing public. 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058627308


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Providers)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow,
+   but only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on a built-in
+   set of executors. Custom executors could be (and were) implemented before Airflow 2.6, but some
+   hard-coded behaviours favoured the built-in executors, so custom executors could not provide the full
+   functionality that the built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   IMHO - it should also address providers (and especially them). 
   
   Rather than adding private tags, I think we should allowlist everything that should be public here. If we forgot something we should add it.  All the rest should be private by definition IMHO.  This is the whole purpose of this list. And we should explicitly add here everything, all helper methods that providers are supposed to use (and add some automation to be able to enable verifying it - for example using the same approach we do in common.sql now using stubgen).
   
   Otherwise we are not really defining public interface. 
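
A minimal sketch of the allowlisting idea described above; the class names are hypothetical, and the real mechanism mentioned for common.sql relies on stubgen-generated .pyi files rather than `__all__`:

```python
# Sketch of an explicit allowlist defining a package's public surface.
# Hypothetical names - not an existing Airflow module.

class SomeHook:
    """Public: exported below and (hypothetically) listed in the interface doc."""


class _ConnectionCache:
    """Private helper: not exported, free to change between releases."""


# Only names listed here are considered public; everything else in the
# module is an implementation detail. A CI check could compare this list
# (or a generated .pyi stub) against the documented public interface.
__all__ = ["SomeHook"]
```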



##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Providers)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow,
+   but only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on a built-in
+   set of executors. Custom executors could be (and were) implemented before Airflow 2.6, but some
+   hard-coded behaviours favoured the built-in executors, so custom executors could not provide the full
+   functionality that the built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   IMHO - it should also address providers (and especially them). 
   
   Rather than adding private tags, I think we should allowlist everything that should be public here. If we forgot something we should add it.  All the rest should be private by definition IMHO.  This is the whole purpose of this list. And we should explicitly add here everything, all helper methods that providers are supposed to use (and add some automation to be able to enable verifying it - for example using the same approach we do in common.sql now using stubgen).
   
   Otherwise we are not really defining public interface. 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk commented on a diff in pull request #28300: Add Public Interface description to Airflow documentation

Posted by GitBox <gi...@apache.org>.
potiuk commented on code in PR #28300:
URL: https://github.com/apache/airflow/pull/28300#discussion_r1058627698


##########
docs/apache-airflow/administration-and-deployment/public-airflow-interface.rst:
##########
@@ -0,0 +1,87 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Public Interface of Airflow
+===========================
+
+The Public Interface of Apache Airflow is a set of programmatic interfaces that allow developers to interact
+with and access certain features of the Apache Airflow system. This includes operations such as
+creating and managing DAGs (directed acyclic graphs), managing tasks and their dependencies,
+and extending Airflow capabilities by writing new executors, plugins, operators and providers. The
+Public Interface can be useful for building custom tools and integrations with other systems,
+and for automating certain aspects of the Airflow workflow.
+
+You can extend Airflow in three ways:
+
+* By writing new custom Python code (via Operators, Plugins, Providers)
+* By using the `Stable REST API <stable-rest-api-ref>`_ (based on the OpenAPI specification)
+* By using the `Airflow Command Line Interface (CLI) <cli-and-env-variables-ref.rst>`_
+
+How can you extend Apache Airflow with custom Python Code?
+==========================================================
+
+The Public Interface of Airflow consists of a number of different classes and packages that provide access
+to the core features and functionality of the system.
+
+The classes and packages that may be considered as the Public Interface include:
+
+* The :class:`~airflow.DAG`, which provides a way to define and manage DAGs in Airflow.
+* The :class:`~airflow.models.baseoperator.BaseOperator`, which provides a way to write custom operators.
+* The :class:`~airflow.hooks.base.BaseHook`, which provides a way to write custom hooks.
+* The :class:`~airflow.models.connection.Connection`, which provides access to external service credentials and configuration.
+* The :class:`~airflow.models.variable.Variable`, which provides access to Airflow configuration variables.
+* The :class:`~airflow.models.xcom.XCom`, which is used to access inter-task communication data.
+* The :class:`~airflow.secrets.BaseSecretsBackend`, which is used to define custom secret managers.
+* The :class:`~airflow.plugins_manager.AirflowPlugin`, which is used to define custom plugins.
+* The :class:`~airflow.triggers.base.BaseTrigger`, which is used to implement custom Deferrable Operators (based on ``asyncio``).
+* The :class:`~airflow.decorators.base.TaskDecorator`, which provides a way to write custom decorators.
+* The :class:`~airflow.listeners.listener.ListenerManager` class, which provides hooks that can be implemented to respond to DAG/Task lifecycle events.
+
+.. versionadded:: 2.5
+
+   Listener public interface has been added in version 2.5.
+
+* The :class:`~airflow.executors.base_executor.BaseExecutor` - the Executors are the components of Airflow
+  that are responsible for executing tasks.
+
+.. versionadded:: 2.6
+
+   There are a number of different executor implementations built into Airflow, each with its own unique
+   characteristics and capabilities. The executor interface was available in earlier versions of Airflow,
+   but only as of version 2.6 are executors fully decoupled, so Airflow no longer relies on a built-in
+   set of executors. Custom executors could be (and were) implemented before Airflow 2.6, but some
+   hard-coded behaviours favoured the built-in executors, so custom executors could not provide the full
+   functionality that the built-in executors had.
+
+
+What is not part of the Public Interface of Apache Airflow?
+===========================================================
+
+Everything not mentioned in this document should be considered as non-Public Interface.

Review Comment:
   And the list does not **have** to be complete now. If we forget something before merging it - we can always add it later - there is completely no harm in that. It will be "eventually complete" if we follow it for quite some time and decide that some things should be "publicised".



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org