Posted to commits@airflow.apache.org by po...@apache.org on 2022/06/11 12:55:01 UTC

[airflow-site] branch fix-historically-wrong-source-links created (now c99c537bb9)

This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch fix-historically-wrong-source-links
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


      at c99c537bb9 Fix example_dags and system tests "source" links in documentation

This branch includes the following new commits:

     new c99c537bb9 Fix example_dags and system tests "source" links in documentation

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  Revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[airflow-site] 01/01: Fix example_dags and system tests "source" links in documentation

Posted by po...@apache.org.

potiuk pushed a commit to branch fix-historically-wrong-source-links
in repository https://gitbox.apache.org/repos/asf/airflow-site.git

commit c99c537bb953becd157f2f5f3519e97093013536
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sat Jun 11 14:48:33 2022 +0200

    Fix example_dags and system tests "source" links in documentation
    
    This is the result of running the scripts from
    https://github.com/apache/airflow/pull/24389
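
    For readers curious what such a link-rewriting pass might look like, here is a
    minimal, hypothetical sketch. The real scripts live in the PR linked above; the
    path patterns below (old links under airflow/providers/.../example_dags, new
    location under tests/system/providers/...) are assumptions for illustration only.

    ```python
    # Hypothetical sketch only -- the actual scripts are in apache/airflow PR 24389.
    # Rewrites outdated "source" links in generated provider docs, assuming the
    # stale links pointed at airflow/providers/<provider>/example_dags/<file>.py
    # and the examples now live under tests/system/providers/ (an assumption).
    import re
    from pathlib import Path

    OLD_LINK = re.compile(
        r"airflow/providers/(?P<provider>[\w/]+)/example_dags/(?P<file>\w+\.py)"
    )

    def rewrite_links(html: str) -> str:
        """Replace old example_dags paths with the assumed new layout."""
        return OLD_LINK.sub(r"tests/system/providers/\g<provider>/\g<file>", html)

    def fix_docs(root: Path) -> int:
        """Rewrite every .html file under root in place; return count changed."""
        changed = 0
        for page in root.rglob("*.html"):
            text = page.read_text(encoding="utf-8")
            fixed = rewrite_links(text)
            if fixed != text:
                page.write_text(fixed, encoding="utf-8")
                changed += 1
        return changed
    ```

    Run against a checkout of the generated site, a pass like this touches only
    pages whose links match the old pattern, which is consistent with the large
    but uniform diffstat below.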
---
 .../2.1.1/operators/airbyte.html                   |   4 +-
 .../2.1.2/operators/airbyte.html                   |   4 +-
 .../2.1.3/operators/airbyte.html                   |   4 +-
 .../2.1.4/operators/airbyte.html                   |   4 +-
 .../1.0.0/operators/oss.html                       |   2 +-
 .../1.0.1/operators/oss.html                       |   2 +-
 .../1.1.0/operators/oss.html                       |   2 +-
 .../1.1.1/operators/oss.html                       |   2 +-
 .../2.6.0/operators/athena.html                    |   2 +-
 .../2.6.0/operators/datasync.html                  |  12 +-
 .../2.6.0/operators/dms.html                       |   2 +-
 .../2.6.0/operators/ecs.html                       |   2 +-
 .../2.6.0/operators/eks.html                       |  20 ++--
 .../2.6.0/operators/emr.html                       |   6 +-
 .../2.6.0/operators/emr_eks.html                   |   4 +-
 .../2.6.0/operators/glacier.html                   |   4 +-
 .../2.6.0/operators/google_api_to_s3_transfer.html |  16 +--
 .../2.6.0/operators/imap_attachment_to_s3.html     |   4 +-
 .../2.6.0/operators/redshift_sql.html              |  10 +-
 .../2.6.0/operators/s3.html                        |   4 +-
 .../2.6.0/operators/s3_to_redshift.html            |   4 +-
 .../2.6.0/operators/salesforce_to_s3.html          |   2 +-
 .../2.6.0/operators/sqs_publish.html               |   2 +-
 .../2.6.0/operators/transfer/glacier_to_gcs.html   |   2 +-
 .../2.6.0/operators/transfer/s3_to_sftp.html       |   2 +-
 .../2.6.0/operators/transfer/sftp_to_s3.html       |   2 +-
 .../3.0.0/operators/athena.html                    |   2 +-
 .../3.0.0/operators/datasync.html                  |  12 +-
 .../3.0.0/operators/dms.html                       |   2 +-
 .../3.0.0/operators/ecs.html                       |   2 +-
 .../3.0.0/operators/eks.html                       |  20 ++--
 .../3.0.0/operators/emr.html                       |   6 +-
 .../3.0.0/operators/emr_eks.html                   |   4 +-
 .../3.0.0/operators/glacier.html                   |   4 +-
 .../3.0.0/operators/google_api_to_s3_transfer.html |  16 +--
 .../3.0.0/operators/imap_attachment_to_s3.html     |   4 +-
 .../3.0.0/operators/redshift_sql.html              |  10 +-
 .../3.0.0/operators/s3.html                        |   4 +-
 .../3.0.0/operators/s3_to_redshift.html            |   4 +-
 .../3.0.0/operators/salesforce_to_s3.html          |   2 +-
 .../3.0.0/operators/sqs_publish.html               |   2 +-
 .../3.0.0/operators/transfer/glacier_to_gcs.html   |   2 +-
 .../3.0.0/operators/transfer/s3_to_sftp.html       |   2 +-
 .../3.0.0/operators/transfer/sftp_to_s3.html       |   2 +-
 .../3.1.1/operators/athena.html                    |   2 +-
 .../3.1.1/operators/batch.html                     |   4 +-
 .../3.1.1/operators/datasync.html                  |  12 +-
 .../3.1.1/operators/dms.html                       |   2 +-
 .../3.1.1/operators/ecs.html                       |   6 +-
 .../3.1.1/operators/eks.html                       |  26 ++---
 .../3.1.1/operators/emr.html                       |  12 +-
 .../3.1.1/operators/emr_eks.html                   |   4 +-
 .../3.1.1/operators/glacier.html                   |   4 +-
 .../3.1.1/operators/google_api_to_s3_transfer.html |  16 +--
 .../3.1.1/operators/imap_attachment_to_s3.html     |   4 +-
 .../3.1.1/operators/lambda.html                    |   2 +-
 .../3.1.1/operators/rds.html                       |  18 +--
 .../3.1.1/operators/redshift_cluster.html          |   6 +-
 .../3.1.1/operators/redshift_data.html             |   2 +-
 .../3.1.1/operators/redshift_sql.html              |   4 +-
 .../3.1.1/operators/s3.html                        |   4 +-
 .../3.1.1/operators/sagemaker.html                 |   2 +-
 .../3.1.1/operators/salesforce_to_s3.html          |   2 +-
 .../3.1.1/operators/sns.html                       |   2 +-
 .../3.1.1/operators/sqs_publish.html               |   2 +-
 .../3.1.1/operators/transfer/dynamodb_to_s3.html   |   4 +-
 .../3.1.1/operators/transfer/glacier_to_gcs.html   |   2 +-
 .../3.1.1/operators/transfer/redshift_to_s3.html   |   2 +-
 .../3.1.1/operators/transfer/s3_to_redshift.html   |   2 +-
 .../3.1.1/operators/transfer/s3_to_sftp.html       |   2 +-
 .../3.1.1/operators/transfer/sftp_to_s3.html       |   2 +-
 .../3.2.0/operators/athena.html                    |   2 +-
 .../3.2.0/operators/batch.html                     |   4 +-
 .../3.2.0/operators/datasync.html                  |  12 +-
 .../3.2.0/operators/dms.html                       |   2 +-
 .../3.2.0/operators/ecs.html                       |   6 +-
 .../3.2.0/operators/eks.html                       |  26 ++---
 .../3.2.0/operators/emr.html                       |  12 +-
 .../3.2.0/operators/emr_eks.html                   |   4 +-
 .../3.2.0/operators/glacier.html                   |   4 +-
 .../3.2.0/operators/glue.html                      |   8 +-
 .../3.2.0/operators/google_api_to_s3_transfer.html |  16 +--
 .../3.2.0/operators/lambda.html                    |   2 +-
 .../3.2.0/operators/rds.html                       |  18 +--
 .../3.2.0/operators/redshift_cluster.html          |   6 +-
 .../3.2.0/operators/redshift_data.html             |   2 +-
 .../3.2.0/operators/redshift_sql.html              |   4 +-
 .../3.2.0/operators/s3.html                        |  10 +-
 .../3.2.0/operators/sagemaker.html                 |   2 +-
 .../3.2.0/operators/salesforce_to_s3.html          |   2 +-
 .../3.2.0/operators/sns.html                       |   2 +-
 .../3.2.0/operators/sqs_publish.html               |   2 +-
 .../3.2.0/operators/transfer/dynamodb_to_s3.html   |   4 +-
 .../3.2.0/operators/transfer/glacier_to_gcs.html   |   2 +-
 .../operators/transfer/imap_attachment_to_s3.html  |   2 +-
 .../3.2.0/operators/transfer/redshift_to_s3.html   |   2 +-
 .../3.2.0/operators/transfer/s3_to_redshift.html   |   2 +-
 .../3.2.0/operators/transfer/s3_to_sftp.html       |   2 +-
 .../3.2.0/operators/transfer/sftp_to_s3.html       |   2 +-
 .../3.3.0/operators/athena.html                    |   2 +-
 .../3.3.0/operators/batch.html                     |   4 +-
 .../3.3.0/operators/cloudformation.html            |   8 +-
 .../3.3.0/operators/datasync.html                  |  12 +-
 .../3.3.0/operators/dms.html                       |   2 +-
 .../3.3.0/operators/ecs.html                       |   6 +-
 .../3.3.0/operators/eks.html                       |  26 ++---
 .../3.3.0/operators/emr.html                       |  12 +-
 .../3.3.0/operators/emr_eks.html                   |   4 +-
 .../3.3.0/operators/glacier.html                   |   4 +-
 .../3.3.0/operators/glue.html                      |   8 +-
 .../3.3.0/operators/lambda.html                    |   2 +-
 .../3.3.0/operators/rds.html                       |  18 +--
 .../3.3.0/operators/redshift_cluster.html          |   6 +-
 .../3.3.0/operators/redshift_data.html             |   2 +-
 .../3.3.0/operators/redshift_sql.html              |   4 +-
 .../3.3.0/operators/s3.html                        |  10 +-
 .../3.3.0/operators/sagemaker.html                 |   2 +-
 .../3.3.0/operators/sns.html                       |   2 +-
 .../3.3.0/operators/sqs_publish.html               |   2 +-
 .../3.3.0/operators/transfer/dynamodb_to_s3.html   |   4 +-
 .../3.3.0/operators/transfer/ftp_to_s3.html        |   2 +-
 .../3.3.0/operators/transfer/glacier_to_gcs.html   |   2 +-
 .../3.3.0/operators/transfer/google_api_to_s3.html |   6 +-
 .../3.3.0/operators/transfer/hive_to_dynamodb.html |   2 +-
 .../operators/transfer/imap_attachment_to_s3.html  |   2 +-
 .../3.3.0/operators/transfer/local_to_s3.html      |   2 +-
 .../3.3.0/operators/transfer/mongo_to_s3.html      |   2 +-
 .../3.3.0/operators/transfer/redshift_to_s3.html   |   2 +-
 .../3.3.0/operators/transfer/s3_to_ftp.html        |   2 +-
 .../3.3.0/operators/transfer/s3_to_redshift.html   |   2 +-
 .../3.3.0/operators/transfer/s3_to_sftp.html       |   2 +-
 .../3.3.0/operators/transfer/salesforce_to_s3.html |   2 +-
 .../3.3.0/operators/transfer/sftp_to_s3.html       |   2 +-
 .../3.3.0/operators/transfer/sql_to_s3.html        |   2 +-
 .../3.4.0/operators/athena.html                    |   4 +-
 .../3.4.0/operators/batch.html                     |   4 +-
 .../3.4.0/operators/cloudformation.html            |   8 +-
 .../3.4.0/operators/datasync.html                  |   6 +-
 .../3.4.0/operators/dms.html                       |   2 +-
 .../3.4.0/operators/ec2.html                       |   6 +-
 .../3.4.0/operators/ecs.html                       |   6 +-
 .../3.4.0/operators/eks.html                       |  26 ++---
 .../3.4.0/operators/emr.html                       |  12 +-
 .../3.4.0/operators/emr_eks.html                   |   4 +-
 .../3.4.0/operators/glacier.html                   |   4 +-
 .../3.4.0/operators/glue.html                      |   8 +-
 .../3.4.0/operators/lambda.html                    |   2 +-
 .../3.4.0/operators/quicksight.html                |   4 +-
 .../3.4.0/operators/rds.html                       |  18 +--
 .../3.4.0/operators/redshift_cluster.html          |  10 +-
 .../3.4.0/operators/redshift_data.html             |   2 +-
 .../3.4.0/operators/redshift_sql.html              |   4 +-
 .../3.4.0/operators/s3.html                        |  32 +++---
 .../3.4.0/operators/sagemaker.html                 |  24 ++--
 .../3.4.0/operators/sns.html                       |   2 +-
 .../3.4.0/operators/sqs.html                       |   4 +-
 .../3.4.0/operators/step_functions.html            |   6 +-
 .../3.4.0/operators/transfer/dynamodb_to_s3.html   |   4 +-
 .../3.4.0/operators/transfer/ftp_to_s3.html        |   2 +-
 .../3.4.0/operators/transfer/glacier_to_gcs.html   |   2 +-
 .../3.4.0/operators/transfer/google_api_to_s3.html |   6 +-
 .../3.4.0/operators/transfer/hive_to_dynamodb.html |   2 +-
 .../operators/transfer/imap_attachment_to_s3.html  |   2 +-
 .../3.4.0/operators/transfer/local_to_s3.html      |   2 +-
 .../3.4.0/operators/transfer/mongo_to_s3.html      |   2 +-
 .../3.4.0/operators/transfer/redshift_to_s3.html   |   2 +-
 .../3.4.0/operators/transfer/s3_to_ftp.html        |   2 +-
 .../3.4.0/operators/transfer/s3_to_redshift.html   |   2 +-
 .../3.4.0/operators/transfer/s3_to_sftp.html       |   2 +-
 .../3.4.0/operators/transfer/salesforce_to_s3.html |   2 +-
 .../3.4.0/operators/transfer/sftp_to_s3.html       |   2 +-
 .../3.4.0/operators/transfer/sql_to_s3.html        |   2 +-
 .../1.0.0/operators.html                           |  12 +-
 .../1.0.1/operators.html                           |  12 +-
 .../2.0.0/operators.html                           |  12 +-
 .../3.0.0/operators.html                           |  12 +-
 .../3.0.1/operators.html                           |  12 +-
 .../3.1.0/operators.html                           |  12 +-
 .../3.2.0/operators.html                           |  20 ++--
 .../3.2.1/operators.html                           |  20 ++--
 .../3.3.0/operators.html                           |  20 ++--
 .../3.4.0/operators.html                           |  20 ++--
 .../1.0.0/operators.html                           |   4 +-
 .../1.0.1/operators.html                           |   4 +-
 .../2.0.0/operators.html                           |   4 +-
 .../2.0.1/operators.html                           |   4 +-
 .../2.1.0/operators.html                           |   2 +-
 .../2.1.1/operators.html                           |   2 +-
 .../2.1.2/operators.html                           |   2 +-
 .../2.1.3/operators.html                           |   2 +-
 .../1.0.0/operators.html                           |   2 +-
 .../1.0.1/operators.html                           |   2 +-
 .../1.0.2/operators.html                           |   2 +-
 .../1.0.3/operators.html                           |   2 +-
 .../1.0.4/operators.html                           |   2 +-
 .../2.1.0/operators.html                           |   2 +-
 .../2.2.0/operators.html                           |   2 +-
 .../2.3.0/operators.html                           |   2 +-
 .../2.3.1/operators.html                           |   2 +-
 .../2.3.2/operators.html                           |   2 +-
 .../2.3.3/operators.html                           |   2 +-
 .../2.3.0/operators.html                           |   2 +-
 .../2.3.1/operators.html                           |   2 +-
 .../2.3.2/operators.html                           |   2 +-
 .../2.3.3/operators.html                           |   2 +-
 .../2.2.0/operators.html                           |   2 +-
 .../2.2.1/operators.html                           |   2 +-
 .../2.2.2/operators.html                           |   2 +-
 .../2.2.3/operators.html                           |   2 +-
 .../2.0.2/operators.html                           |   2 +-
 .../2.0.3/operators.html                           |   2 +-
 .../2.0.4/operators.html                           |   2 +-
 .../1.0.0/operators.html                           |   6 +-
 .../1.0.1/operators.html                           |   6 +-
 .../1.0.2/operators.html                           |   6 +-
 .../1.0.3/operators.html                           |   6 +-
 .../2.0.0/operators.html                           |   6 +-
 .../2.0.1/operators.html                           |   6 +-
 .../2.0.2/operators.html                           |   6 +-
 .../2.0.3/operators.html                           |   6 +-
 .../2.1.0/operators.html                           |   6 +-
 .../2.1.1/operators.html                           |   6 +-
 .../2.1.2/operators.html                           |   6 +-
 .../2.1.3/operators.html                           |   6 +-
 .../1.0.0/operators/index.html                     |   8 +-
 .../1.0.0/operators.html                           |   6 +-
 .../1.0.1/operators.html                           |   6 +-
 .../1.0.2/operators.html                           |   6 +-
 .../1.1.0/operators.html                           |   6 +-
 .../1.2.0/operators.html                           |   6 +-
 .../2.0.0/operators.html                           |   6 +-
 .../2.0.1/operators.html                           |   6 +-
 .../2.0.2/operators.html                           |   6 +-
 .../2.0.3/operators.html                           |   6 +-
 .../2.1.0/operators.html                           |   6 +-
 .../2.2.0/operators.html                           |   6 +-
 .../3.0.0/operators.html                           |   6 +-
 .../3.0.1/operators.html                           |   6 +-
 .../3.0.2/operators.html                           |   6 +-
 .../3.1.0/operators.html                           |   6 +-
 .../3.1.1/operators.html                           |   6 +-
 .../3.1.2/operators.html                           |   6 +-
 .../4.0.0/operators.html                           |   6 +-
 .../4.0.1/operators.html                           |   6 +-
 .../4.0.2/operators.html                           |   6 +-
 .../1.0.0/operators.html                           |   4 +-
 .../1.0.1/operators.html                           |   4 +-
 .../2.0.0/operators.html                           |   4 +-
 .../2.0.1/operators.html                           |   4 +-
 .../2.0.2/operators.html                           |   4 +-
 .../2.1.0/operators.html                           |   4 +-
 .../2.2.0/operators.html                           |   4 +-
 .../2.3.0/operators/copy_into.html                 |   2 +-
 .../2.3.0/operators/sql.html                       |   8 +-
 .../2.3.0/operators/submit_run.html                |   4 +-
 .../2.4.0/operators/copy_into.html                 |   2 +-
 .../2.4.0/operators/sql.html                       |   8 +-
 .../2.4.0/operators/submit_run.html                |   4 +-
 .../2.5.0/operators/copy_into.html                 |   2 +-
 .../2.5.0/operators/repos_update.html              |   2 +-
 .../2.5.0/operators/sql.html                       |   8 +-
 .../2.5.0/operators/submit_run.html                |   4 +-
 .../2.6.0/operators/copy_into.html                 |   2 +-
 .../2.6.0/operators/repos_create.html              |   2 +-
 .../2.6.0/operators/repos_delete.html              |   2 +-
 .../2.6.0/operators/repos_update.html              |   2 +-
 .../2.6.0/operators/sql.html                       |   8 +-
 .../2.6.0/operators/submit_run.html                |   4 +-
 .../2.7.0/operators/copy_into.html                 |   2 +-
 .../2.7.0/operators/repos_create.html              |   2 +-
 .../2.7.0/operators/repos_delete.html              |   2 +-
 .../2.7.0/operators/repos_update.html              |   2 +-
 .../2.7.0/operators/sql.html                       |   8 +-
 .../2.7.0/operators/submit_run.html                |   4 +-
 .../1.0.1/operators.html                           |   8 +-
 .../1.0.2/operators.html                           |   8 +-
 .../1.0.0/operators.html                           |   8 +-
 .../1.0.1/operators.html                           |   8 +-
 .../1.0.2/operators.html                           |   8 +-
 .../2.0.0/operators.html                           |   8 +-
 .../2.0.1/operators.html                           |   8 +-
 .../2.0.2/operators.html                           |   8 +-
 .../2.0.3/operators.html                           |   8 +-
 .../2.0.4/operators.html                           |   8 +-
 .../1.0.0/operators/index.html                     |   8 +-
 .../1.0.1/operators/index.html                     |   8 +-
 .../1.0.2/operators/index.html                     |   8 +-
 .../1.0.3/operators/index.html                     |   8 +-
 .../6.3.0/operators/ads.html                       |   4 +-
 .../6.3.0/operators/cloud/automl.html              |  26 ++---
 .../6.3.0/operators/cloud/bigquery.html            |  48 ++++----
 .../6.3.0/operators/cloud/bigtable.html            |  14 +--
 .../6.3.0/operators/cloud/cloud_build.html         |  30 ++---
 .../6.3.0/operators/cloud/cloud_memorystore.html   |  26 ++---
 .../cloud/cloud_memorystore_memcached.html         |  14 +--
 .../6.3.0/operators/cloud/cloud_sql.html           |  40 +++----
 .../cloud/cloud_storage_transfer_service.html      |  24 ++--
 .../6.3.0/operators/cloud/compute.html             |  24 ++--
 .../6.3.0/operators/cloud/compute_ssh.html         |   4 +-
 .../operators/cloud/data_loss_prevention.html      |  20 ++--
 .../6.3.0/operators/cloud/datacatalog.html         |  74 ++++++------
 .../6.3.0/operators/cloud/dataflow.html            |  22 ++--
 .../6.3.0/operators/cloud/datafusion.html          |  24 ++--
 .../6.3.0/operators/cloud/dataprep.html            |   6 +-
 .../6.3.0/operators/cloud/dataproc.html            |  40 +++----
 .../6.3.0/operators/cloud/dataproc_metastore.html  |  28 ++---
 .../6.3.0/operators/cloud/datastore.html           |  26 ++---
 .../6.3.0/operators/cloud/functions.html           |  12 +-
 .../6.3.0/operators/cloud/gcs.html                 |  14 +--
 .../6.3.0/operators/cloud/kubernetes_engine.html   |  10 +-
 .../6.3.0/operators/cloud/life_sciences.html       |   6 +-
 .../6.3.0/operators/cloud/mlengine.html            |  30 ++---
 .../6.3.0/operators/cloud/natural_language.html    |  20 ++--
 .../6.3.0/operators/cloud/pubsub.html              |  18 +--
 .../6.3.0/operators/cloud/spanner.html             |  14 +--
 .../6.3.0/operators/cloud/speech_to_text.html      |   6 +-
 .../6.3.0/operators/cloud/stackdriver.html         |  20 ++--
 .../6.3.0/operators/cloud/tasks.html               |  26 ++---
 .../6.3.0/operators/cloud/text_to_speech.html      |   6 +-
 .../6.3.0/operators/cloud/translate.html           |   4 +-
 .../6.3.0/operators/cloud/translate_speech.html    |   4 +-
 .../6.3.0/operators/cloud/video_intelligence.html  |  18 +--
 .../6.3.0/operators/cloud/vision.html              | 124 ++++++++++-----------
 .../6.3.0/operators/cloud/workflows.html           |  22 ++--
 .../6.3.0/operators/firebase/firestore.html        |   2 +-
 .../6.3.0/operators/leveldb/leveldb.html           |   2 +-
 .../operators/marketing_platform/analytics.html    |   6 +-
 .../marketing_platform/campaign_manager.html       |  14 +--
 .../marketing_platform/display_video.html          |  20 ++--
 .../operators/marketing_platform/search_ads.html   |   8 +-
 .../6.3.0/operators/suite/sheets.html              |   4 +-
 .../operators/transfer/azure_fileshare_to_gcs.html |   2 +-
 .../operators/transfer/facebook_ads_to_gcs.html    |   2 +-
 .../6.3.0/operators/transfer/gcs_to_gcs.html       |  22 ++--
 .../6.3.0/operators/transfer/gcs_to_gdrive.html    |   6 +-
 .../6.3.0/operators/transfer/gcs_to_local.html     |   2 +-
 .../6.3.0/operators/transfer/gcs_to_sftp.html      |   8 +-
 .../6.3.0/operators/transfer/gcs_to_sheets.html    |   2 +-
 .../6.3.0/operators/transfer/gdrive_to_gcs.html    |   2 +-
 .../6.3.0/operators/transfer/gdrive_to_local.html  |   2 +-
 .../6.3.0/operators/transfer/local_to_gcs.html     |   2 +-
 .../6.3.0/operators/transfer/mssql_to_gcs.html     |   2 +-
 .../6.3.0/operators/transfer/oracle_to_gcs.html    |   2 +-
 .../6.3.0/operators/transfer/presto_to_gcs.html    |  10 +-
 .../operators/transfer/salesforce_to_gcs.html      |   2 +-
 .../6.3.0/operators/transfer/sftp_to_gcs.html      |   8 +-
 .../6.3.0/operators/transfer/sheets_to_gcs.html    |   2 +-
 .../6.3.0/operators/transfer/sql_to_sheets.html    |   2 +-
 .../6.3.0/operators/transfer/trino_to_gcs.html     |  10 +-
 .../6.4.0/operators/ads.html                       |   4 +-
 .../6.4.0/operators/cloud/automl.html              |  26 ++---
 .../6.4.0/operators/cloud/bigquery.html            |  48 ++++----
 .../6.4.0/operators/cloud/bigtable.html            |  14 +--
 .../6.4.0/operators/cloud/cloud_build.html         |  30 ++---
 .../6.4.0/operators/cloud/cloud_composer.html      |  22 ++--
 .../6.4.0/operators/cloud/cloud_memorystore.html   |  26 ++---
 .../cloud/cloud_memorystore_memcached.html         |  14 +--
 .../6.4.0/operators/cloud/cloud_sql.html           |  40 +++----
 .../cloud/cloud_storage_transfer_service.html      |  24 ++--
 .../6.4.0/operators/cloud/compute.html             |  24 ++--
 .../6.4.0/operators/cloud/compute_ssh.html         |   4 +-
 .../operators/cloud/data_loss_prevention.html      |  20 ++--
 .../6.4.0/operators/cloud/datacatalog.html         |  74 ++++++------
 .../6.4.0/operators/cloud/dataflow.html            |  22 ++--
 .../6.4.0/operators/cloud/datafusion.html          |  24 ++--
 .../6.4.0/operators/cloud/dataprep.html            |   6 +-
 .../6.4.0/operators/cloud/dataproc.html            |  44 ++++----
 .../6.4.0/operators/cloud/dataproc_metastore.html  |  28 ++---
 .../6.4.0/operators/cloud/datastore.html           |  26 ++---
 .../6.4.0/operators/cloud/functions.html           |  12 +-
 .../6.4.0/operators/cloud/gcs.html                 |  14 +--
 .../6.4.0/operators/cloud/kubernetes_engine.html   |  10 +-
 .../6.4.0/operators/cloud/life_sciences.html       |   6 +-
 .../6.4.0/operators/cloud/mlengine.html            |  30 ++---
 .../6.4.0/operators/cloud/natural_language.html    |  20 ++--
 .../6.4.0/operators/cloud/pubsub.html              |  18 +--
 .../6.4.0/operators/cloud/spanner.html             |  14 +--
 .../6.4.0/operators/cloud/speech_to_text.html      |   6 +-
 .../6.4.0/operators/cloud/stackdriver.html         |  20 ++--
 .../6.4.0/operators/cloud/tasks.html               |  26 ++---
 .../6.4.0/operators/cloud/text_to_speech.html      |   6 +-
 .../6.4.0/operators/cloud/translate.html           |   4 +-
 .../6.4.0/operators/cloud/translate_speech.html    |   4 +-
 .../6.4.0/operators/cloud/vertex_ai.html           |  24 ++--
 .../6.4.0/operators/cloud/video_intelligence.html  |  18 +--
 .../6.4.0/operators/cloud/vision.html              | 124 ++++++++++-----------
 .../6.4.0/operators/cloud/workflows.html           |  22 ++--
 .../6.4.0/operators/firebase/firestore.html        |   2 +-
 .../6.4.0/operators/leveldb/leveldb.html           |   2 +-
 .../operators/marketing_platform/analytics.html    |   6 +-
 .../marketing_platform/campaign_manager.html       |  14 +--
 .../marketing_platform/display_video.html          |  20 ++--
 .../operators/marketing_platform/search_ads.html   |   8 +-
 .../6.4.0/operators/suite/sheets.html              |   4 +-
 .../operators/transfer/azure_fileshare_to_gcs.html |   2 +-
 .../operators/transfer/facebook_ads_to_gcs.html    |   2 +-
 .../6.4.0/operators/transfer/gcs_to_gcs.html       |  22 ++--
 .../6.4.0/operators/transfer/gcs_to_gdrive.html    |   6 +-
 .../6.4.0/operators/transfer/gcs_to_local.html     |   2 +-
 .../6.4.0/operators/transfer/gcs_to_sftp.html      |   8 +-
 .../6.4.0/operators/transfer/gcs_to_sheets.html    |   2 +-
 .../6.4.0/operators/transfer/gdrive_to_gcs.html    |   2 +-
 .../6.4.0/operators/transfer/gdrive_to_local.html  |   2 +-
 .../6.4.0/operators/transfer/local_to_gcs.html     |   2 +-
 .../6.4.0/operators/transfer/mssql_to_gcs.html     |   2 +-
 .../6.4.0/operators/transfer/oracle_to_gcs.html    |   2 +-
 .../6.4.0/operators/transfer/presto_to_gcs.html    |  10 +-
 .../operators/transfer/salesforce_to_gcs.html      |   2 +-
 .../6.4.0/operators/transfer/sftp_to_gcs.html      |   8 +-
 .../6.4.0/operators/transfer/sheets_to_gcs.html    |   2 +-
 .../6.4.0/operators/transfer/sql_to_sheets.html    |   2 +-
 .../6.4.0/operators/transfer/trino_to_gcs.html     |  10 +-
 .../6.5.0/operators/ads.html                       |   4 +-
 .../6.5.0/operators/cloud/automl.html              |  26 ++---
 .../6.5.0/operators/cloud/bigquery.html            |  48 ++++----
 .../6.5.0/operators/cloud/bigtable.html            |  14 +--
 .../6.5.0/operators/cloud/cloud_build.html         |  30 ++---
 .../6.5.0/operators/cloud/cloud_composer.html      |  22 ++--
 .../6.5.0/operators/cloud/cloud_memorystore.html   |  26 ++---
 .../cloud/cloud_memorystore_memcached.html         |  14 +--
 .../6.5.0/operators/cloud/cloud_sql.html           |  40 +++----
 .../cloud/cloud_storage_transfer_service.html      |  24 ++--
 .../6.5.0/operators/cloud/compute.html             |  24 ++--
 .../6.5.0/operators/cloud/compute_ssh.html         |   4 +-
 .../operators/cloud/data_loss_prevention.html      |  20 ++--
 .../6.5.0/operators/cloud/datacatalog.html         |  74 ++++++------
 .../6.5.0/operators/cloud/dataflow.html            |  22 ++--
 .../6.5.0/operators/cloud/datafusion.html          |  24 ++--
 .../6.5.0/operators/cloud/dataprep.html            |   6 +-
 .../6.5.0/operators/cloud/dataproc.html            |  44 ++++----
 .../6.5.0/operators/cloud/dataproc_metastore.html  |  28 ++---
 .../6.5.0/operators/cloud/datastore.html           |  26 ++---
 .../6.5.0/operators/cloud/functions.html           |  12 +-
 .../6.5.0/operators/cloud/gcs.html                 |  14 +--
 .../6.5.0/operators/cloud/kubernetes_engine.html   |  10 +-
 .../6.5.0/operators/cloud/life_sciences.html       |   6 +-
 .../6.5.0/operators/cloud/looker.html              |   4 +-
 .../6.5.0/operators/cloud/mlengine.html            |  30 ++---
 .../6.5.0/operators/cloud/natural_language.html    |  20 ++--
 .../6.5.0/operators/cloud/pubsub.html              |  18 +--
 .../6.5.0/operators/cloud/spanner.html             |  14 +--
 .../6.5.0/operators/cloud/speech_to_text.html      |   6 +-
 .../6.5.0/operators/cloud/stackdriver.html         |  20 ++--
 .../6.5.0/operators/cloud/tasks.html               |  26 ++---
 .../6.5.0/operators/cloud/text_to_speech.html      |   6 +-
 .../6.5.0/operators/cloud/translate.html           |   4 +-
 .../6.5.0/operators/cloud/translate_speech.html    |   4 +-
 .../6.5.0/operators/cloud/vertex_ai.html           |  38 +++----
 .../6.5.0/operators/cloud/video_intelligence.html  |  18 +--
 .../6.5.0/operators/cloud/vision.html              | 124 ++++++++++-----------
 .../6.5.0/operators/cloud/workflows.html           |  22 ++--
 .../6.5.0/operators/firebase/firestore.html        |   2 +-
 .../6.5.0/operators/leveldb/leveldb.html           |   2 +-
 .../operators/marketing_platform/analytics.html    |   6 +-
 .../marketing_platform/campaign_manager.html       |  14 +--
 .../marketing_platform/display_video.html          |  20 ++--
 .../operators/marketing_platform/search_ads.html   |   8 +-
 .../6.5.0/operators/suite/sheets.html              |   4 +-
 .../operators/transfer/azure_fileshare_to_gcs.html |   2 +-
 .../6.5.0/operators/transfer/calendar_to_gcs.html  |   2 +-
 .../operators/transfer/facebook_ads_to_gcs.html    |   2 +-
 .../6.5.0/operators/transfer/gcs_to_gcs.html       |  22 ++--
 .../6.5.0/operators/transfer/gcs_to_gdrive.html    |   6 +-
 .../6.5.0/operators/transfer/gcs_to_local.html     |   2 +-
 .../6.5.0/operators/transfer/gcs_to_sftp.html      |   8 +-
 .../6.5.0/operators/transfer/gcs_to_sheets.html    |   2 +-
 .../6.5.0/operators/transfer/gdrive_to_gcs.html    |   2 +-
 .../6.5.0/operators/transfer/gdrive_to_local.html  |   2 +-
 .../6.5.0/operators/transfer/local_to_gcs.html     |   2 +-
 .../6.5.0/operators/transfer/mssql_to_gcs.html     |   2 +-
 .../6.5.0/operators/transfer/oracle_to_gcs.html    |   2 +-
 .../6.5.0/operators/transfer/presto_to_gcs.html    |  10 +-
 .../operators/transfer/salesforce_to_gcs.html      |   2 +-
 .../6.5.0/operators/transfer/sftp_to_gcs.html      |   8 +-
 .../6.5.0/operators/transfer/sheets_to_gcs.html    |   2 +-
 .../6.5.0/operators/transfer/sql_to_sheets.html    |   2 +-
 .../6.5.0/operators/transfer/trino_to_gcs.html     |  10 +-
 .../6.6.0/operators/ads.html                       |   4 +-
 .../6.6.0/operators/cloud/automl.html              |  26 ++---
 .../6.6.0/operators/cloud/bigquery.html            |  48 ++++----
 .../6.6.0/operators/cloud/bigtable.html            |  14 +--
 .../6.6.0/operators/cloud/cloud_build.html         |  30 ++---
 .../6.6.0/operators/cloud/cloud_composer.html      |  22 ++--
 .../6.6.0/operators/cloud/cloud_memorystore.html   |  26 ++---
 .../cloud/cloud_memorystore_memcached.html         |  14 +--
 .../6.6.0/operators/cloud/cloud_sql.html           |  40 +++----
 .../cloud/cloud_storage_transfer_service.html      |  24 ++--
 .../6.6.0/operators/cloud/compute.html             |  24 ++--
 .../6.6.0/operators/cloud/compute_ssh.html         |   4 +-
 .../operators/cloud/data_loss_prevention.html      |  20 ++--
 .../6.6.0/operators/cloud/datacatalog.html         |  74 ++++++------
 .../6.6.0/operators/cloud/dataflow.html            |  22 ++--
 .../6.6.0/operators/cloud/datafusion.html          |  24 ++--
 .../6.6.0/operators/cloud/dataplex.html            |  14 +--
 .../6.6.0/operators/cloud/dataprep.html            |   6 +-
 .../6.6.0/operators/cloud/dataproc.html            |  46 ++++----
 .../6.6.0/operators/cloud/dataproc_metastore.html  |  28 ++---
 .../6.6.0/operators/cloud/datastore.html           |  26 ++---
 .../6.6.0/operators/cloud/functions.html           |  12 +-
 .../6.6.0/operators/cloud/gcs.html                 |  14 +--
 .../6.6.0/operators/cloud/kubernetes_engine.html   |  10 +-
 .../6.6.0/operators/cloud/life_sciences.html       |   6 +-
 .../6.6.0/operators/cloud/looker.html              |   4 +-
 .../6.6.0/operators/cloud/mlengine.html            |  30 ++---
 .../6.6.0/operators/cloud/natural_language.html    |  20 ++--
 .../6.6.0/operators/cloud/pubsub.html              |  18 +--
 .../6.6.0/operators/cloud/spanner.html             |  14 +--
 .../6.6.0/operators/cloud/speech_to_text.html      |   6 +-
 .../6.6.0/operators/cloud/stackdriver.html         |  20 ++--
 .../6.6.0/operators/cloud/tasks.html               |  26 ++---
 .../6.6.0/operators/cloud/text_to_speech.html      |   6 +-
 .../6.6.0/operators/cloud/translate.html           |   4 +-
 .../6.6.0/operators/cloud/translate_speech.html    |   4 +-
 .../6.6.0/operators/cloud/vertex_ai.html           |  38 +++----
 .../6.6.0/operators/cloud/video_intelligence.html  |  18 +--
 .../6.6.0/operators/cloud/vision.html              | 124 ++++++++++-----------
 .../6.6.0/operators/cloud/workflows.html           |  22 ++--
 .../6.6.0/operators/firebase/firestore.html        |   2 +-
 .../6.6.0/operators/leveldb/leveldb.html           |   2 +-
 .../operators/marketing_platform/analytics.html    |   6 +-
 .../marketing_platform/campaign_manager.html       |  14 +--
 .../marketing_platform/display_video.html          |  20 ++--
 .../operators/marketing_platform/search_ads.html   |   8 +-
 .../6.6.0/operators/suite/sheets.html              |   4 +-
 .../operators/transfer/azure_fileshare_to_gcs.html |   2 +-
 .../6.6.0/operators/transfer/calendar_to_gcs.html  |   2 +-
 .../operators/transfer/facebook_ads_to_gcs.html    |   2 +-
 .../6.6.0/operators/transfer/gcs_to_gcs.html       |  22 ++--
 .../6.6.0/operators/transfer/gcs_to_gdrive.html    |   6 +-
 .../6.6.0/operators/transfer/gcs_to_local.html     |   2 +-
 .../6.6.0/operators/transfer/gcs_to_sftp.html      |   8 +-
 .../6.6.0/operators/transfer/gcs_to_sheets.html    |   2 +-
 .../6.6.0/operators/transfer/gdrive_to_gcs.html    |   2 +-
 .../6.6.0/operators/transfer/gdrive_to_local.html  |   2 +-
 .../6.6.0/operators/transfer/local_to_gcs.html     |   2 +-
 .../6.6.0/operators/transfer/mssql_to_gcs.html     |   2 +-
 .../6.6.0/operators/transfer/oracle_to_gcs.html    |   2 +-
 .../6.6.0/operators/transfer/presto_to_gcs.html    |  10 +-
 .../operators/transfer/salesforce_to_gcs.html      |   2 +-
 .../6.6.0/operators/transfer/sftp_to_gcs.html      |   8 +-
 .../6.6.0/operators/transfer/sheets_to_gcs.html    |   2 +-
 .../6.6.0/operators/transfer/sql_to_sheets.html    |   2 +-
 .../6.6.0/operators/transfer/trino_to_gcs.html     |  10 +-
 .../6.7.0/operators/ads.html                       |   4 +-
 .../6.7.0/operators/cloud/automl.html              |  26 ++---
 .../6.7.0/operators/cloud/bigquery.html            |  48 ++++----
 .../6.7.0/operators/cloud/bigtable.html            |  14 +--
 .../6.7.0/operators/cloud/cloud_build.html         |  30 ++---
 .../6.7.0/operators/cloud/cloud_composer.html      |  22 ++--
 .../6.7.0/operators/cloud/cloud_memorystore.html   |  26 ++---
 .../cloud/cloud_memorystore_memcached.html         |  14 +--
 .../6.7.0/operators/cloud/cloud_sql.html           |  40 +++----
 .../cloud/cloud_storage_transfer_service.html      |  24 ++--
 .../6.7.0/operators/cloud/compute.html             |  24 ++--
 .../6.7.0/operators/cloud/compute_ssh.html         |   4 +-
 .../operators/cloud/data_loss_prevention.html      |  20 ++--
 .../6.7.0/operators/cloud/datacatalog.html         |  74 ++++++------
 .../6.7.0/operators/cloud/dataflow.html            |  22 ++--
 .../6.7.0/operators/cloud/datafusion.html          |  24 ++--
 .../6.7.0/operators/cloud/dataplex.html            |  14 +--
 .../6.7.0/operators/cloud/dataprep.html            |   6 +-
 .../6.7.0/operators/cloud/dataproc.html            |  46 ++++----
 .../6.7.0/operators/cloud/dataproc_metastore.html  |  28 ++---
 .../6.7.0/operators/cloud/datastore.html           |  26 ++---
 .../6.7.0/operators/cloud/functions.html           |  12 +-
 .../6.7.0/operators/cloud/gcs.html                 |  14 +--
 .../6.7.0/operators/cloud/kubernetes_engine.html   |  10 +-
 .../6.7.0/operators/cloud/life_sciences.html       |   6 +-
 .../6.7.0/operators/cloud/looker.html              |   4 +-
 .../6.7.0/operators/cloud/mlengine.html            |  30 ++---
 .../6.7.0/operators/cloud/natural_language.html    |  20 ++--
 .../6.7.0/operators/cloud/pubsub.html              |  18 +--
 .../6.7.0/operators/cloud/spanner.html             |  14 +--
 .../6.7.0/operators/cloud/speech_to_text.html      |   6 +-
 .../6.7.0/operators/cloud/stackdriver.html         |  20 ++--
 .../6.7.0/operators/cloud/tasks.html               |  26 ++---
 .../6.7.0/operators/cloud/text_to_speech.html      |   6 +-
 .../6.7.0/operators/cloud/translate.html           |   4 +-
 .../6.7.0/operators/cloud/translate_speech.html    |   4 +-
 .../6.7.0/operators/cloud/vertex_ai.html           |  38 +++----
 .../6.7.0/operators/cloud/video_intelligence.html  |  18 +--
 .../6.7.0/operators/cloud/vision.html              | 124 ++++++++++-----------
 .../6.7.0/operators/cloud/workflows.html           |  22 ++--
 .../6.7.0/operators/firebase/firestore.html        |   2 +-
 .../6.7.0/operators/leveldb/leveldb.html           |   2 +-
 .../operators/marketing_platform/analytics.html    |   6 +-
 .../marketing_platform/campaign_manager.html       |  14 +--
 .../marketing_platform/display_video.html          |  20 ++--
 .../operators/marketing_platform/search_ads.html   |   8 +-
 .../6.7.0/operators/suite/sheets.html              |   4 +-
 .../operators/transfer/azure_fileshare_to_gcs.html |   2 +-
 .../6.7.0/operators/transfer/calendar_to_gcs.html  |   2 +-
 .../operators/transfer/facebook_ads_to_gcs.html    |   2 +-
 .../6.7.0/operators/transfer/gcs_to_gcs.html       |  22 ++--
 .../6.7.0/operators/transfer/gcs_to_gdrive.html    |   6 +-
 .../6.7.0/operators/transfer/gcs_to_local.html     |   2 +-
 .../6.7.0/operators/transfer/gcs_to_sftp.html      |   8 +-
 .../6.7.0/operators/transfer/gcs_to_sheets.html    |   2 +-
 .../6.7.0/operators/transfer/gdrive_to_gcs.html    |   2 +-
 .../6.7.0/operators/transfer/gdrive_to_local.html  |   2 +-
 .../6.7.0/operators/transfer/local_to_gcs.html     |   2 +-
 .../6.7.0/operators/transfer/mssql_to_gcs.html     |   2 +-
 .../6.7.0/operators/transfer/oracle_to_gcs.html    |   2 +-
 .../6.7.0/operators/transfer/presto_to_gcs.html    |  10 +-
 .../operators/transfer/salesforce_to_gcs.html      |   2 +-
 .../6.7.0/operators/transfer/sftp_to_gcs.html      |   8 +-
 .../6.7.0/operators/transfer/sheets_to_gcs.html    |   2 +-
 .../6.7.0/operators/transfer/sql_to_sheets.html    |   2 +-
 .../6.7.0/operators/transfer/trino_to_gcs.html     |  10 +-
 .../6.8.0/operators/ads.html                       |   4 +-
 .../6.8.0/operators/cloud/automl.html              |  26 ++---
 .../6.8.0/operators/cloud/bigtable.html            |  14 +--
 .../6.8.0/operators/cloud/cloud_build.html         |  30 ++---
 .../6.8.0/operators/cloud/cloud_composer.html      |  22 ++--
 .../6.8.0/operators/cloud/cloud_memorystore.html   |  26 ++---
 .../cloud/cloud_memorystore_memcached.html         |  14 +--
 .../6.8.0/operators/cloud/cloud_sql.html           |  40 +++----
 .../cloud/cloud_storage_transfer_service.html      |  24 ++--
 .../6.8.0/operators/cloud/compute.html             |  24 ++--
 .../6.8.0/operators/cloud/compute_ssh.html         |   4 +-
 .../operators/cloud/data_loss_prevention.html      |  22 ++--
 .../6.8.0/operators/cloud/datacatalog.html         |  74 ++++++------
 .../6.8.0/operators/cloud/dataflow.html            |  22 ++--
 .../6.8.0/operators/cloud/datafusion.html          |  24 ++--
 .../6.8.0/operators/cloud/dataplex.html            |  14 +--
 .../6.8.0/operators/cloud/dataprep.html            |   6 +-
 .../6.8.0/operators/cloud/dataproc.html            |  46 ++++----
 .../6.8.0/operators/cloud/dataproc_metastore.html  |  28 ++---
 .../6.8.0/operators/cloud/datastore.html           |  26 ++---
 .../6.8.0/operators/cloud/functions.html           |  12 +-
 .../6.8.0/operators/cloud/gcs.html                 |  14 +--
 .../6.8.0/operators/cloud/kubernetes_engine.html   |  10 +-
 .../6.8.0/operators/cloud/life_sciences.html       |   6 +-
 .../6.8.0/operators/cloud/looker.html              |   4 +-
 .../6.8.0/operators/cloud/mlengine.html            |  30 ++---
 .../6.8.0/operators/cloud/natural_language.html    |  20 ++--
 .../6.8.0/operators/cloud/pubsub.html              |  18 +--
 .../6.8.0/operators/cloud/spanner.html             |  14 +--
 .../6.8.0/operators/cloud/speech_to_text.html      |   6 +-
 .../6.8.0/operators/cloud/stackdriver.html         |  20 ++--
 .../6.8.0/operators/cloud/tasks.html               |  26 ++---
 .../6.8.0/operators/cloud/text_to_speech.html      |   6 +-
 .../6.8.0/operators/cloud/translate.html           |   4 +-
 .../6.8.0/operators/cloud/translate_speech.html    |   4 +-
 .../6.8.0/operators/cloud/vertex_ai.html           |  70 ++++++------
 .../6.8.0/operators/cloud/video_intelligence.html  |  18 +--
 .../6.8.0/operators/cloud/vision.html              | 124 ++++++++++-----------
 .../6.8.0/operators/cloud/workflows.html           |  22 ++--
 .../6.8.0/operators/firebase/firestore.html        |   2 +-
 .../6.8.0/operators/leveldb/leveldb.html           |   2 +-
 .../operators/marketing_platform/analytics.html    |   6 +-
 .../marketing_platform/campaign_manager.html       |  14 +--
 .../marketing_platform/display_video.html          |  20 ++--
 .../operators/marketing_platform/search_ads.html   |   8 +-
 .../6.8.0/operators/suite/sheets.html              |   4 +-
 .../operators/transfer/azure_fileshare_to_gcs.html |   2 +-
 .../6.8.0/operators/transfer/calendar_to_gcs.html  |   2 +-
 .../operators/transfer/facebook_ads_to_gcs.html    |   2 +-
 .../6.8.0/operators/transfer/gcs_to_gcs.html       |  22 ++--
 .../6.8.0/operators/transfer/gcs_to_gdrive.html    |   6 +-
 .../6.8.0/operators/transfer/gcs_to_local.html     |   2 +-
 .../6.8.0/operators/transfer/gcs_to_sftp.html      |   8 +-
 .../6.8.0/operators/transfer/gcs_to_sheets.html    |   2 +-
 .../6.8.0/operators/transfer/gdrive_to_gcs.html    |   2 +-
 .../6.8.0/operators/transfer/gdrive_to_local.html  |   2 +-
 .../6.8.0/operators/transfer/local_to_gcs.html     |   2 +-
 .../6.8.0/operators/transfer/mssql_to_gcs.html     |   2 +-
 .../6.8.0/operators/transfer/oracle_to_gcs.html    |   2 +-
 .../6.8.0/operators/transfer/presto_to_gcs.html    |  10 +-
 .../operators/transfer/salesforce_to_gcs.html      |   2 +-
 .../6.8.0/operators/transfer/sftp_to_gcs.html      |   8 +-
 .../6.8.0/operators/transfer/sheets_to_gcs.html    |   2 +-
 .../6.8.0/operators/transfer/sql_to_sheets.html    |   2 +-
 .../6.8.0/operators/transfer/trino_to_gcs.html     |  10 +-
 .../7.0.0/operators/ads.html                       |   4 +-
 .../7.0.0/operators/cloud/automl.html              |  26 ++---
 .../7.0.0/operators/cloud/bigtable.html            |  14 +--
 .../7.0.0/operators/cloud/cloud_build.html         |  30 ++---
 .../7.0.0/operators/cloud/cloud_composer.html      |  22 ++--
 .../7.0.0/operators/cloud/cloud_memorystore.html   |  26 ++---
 .../cloud/cloud_memorystore_memcached.html         |  14 +--
 .../7.0.0/operators/cloud/cloud_sql.html           |  40 +++----
 .../cloud/cloud_storage_transfer_service.html      |  24 ++--
 .../7.0.0/operators/cloud/compute.html             |  24 ++--
 .../7.0.0/operators/cloud/compute_ssh.html         |   4 +-
 .../operators/cloud/data_loss_prevention.html      |  22 ++--
 .../7.0.0/operators/cloud/datacatalog.html         |  74 ++++++------
 .../7.0.0/operators/cloud/dataflow.html            |  22 ++--
 .../7.0.0/operators/cloud/datafusion.html          |  24 ++--
 .../7.0.0/operators/cloud/dataplex.html            |  14 +--
 .../7.0.0/operators/cloud/dataprep.html            |   6 +-
 .../7.0.0/operators/cloud/dataproc.html            |  46 ++++----
 .../7.0.0/operators/cloud/dataproc_metastore.html  |  28 ++---
 .../7.0.0/operators/cloud/functions.html           |  12 +-
 .../7.0.0/operators/cloud/life_sciences.html       |   6 +-
 .../7.0.0/operators/cloud/looker.html              |   4 +-
 .../7.0.0/operators/cloud/mlengine.html            |  30 ++---
 .../7.0.0/operators/cloud/natural_language.html    |  20 ++--
 .../7.0.0/operators/cloud/pubsub.html              |  18 +--
 .../7.0.0/operators/cloud/spanner.html             |  14 +--
 .../7.0.0/operators/cloud/speech_to_text.html      |   6 +-
 .../7.0.0/operators/cloud/translate.html           |   4 +-
 .../7.0.0/operators/cloud/translate_speech.html    |   4 +-
 .../7.0.0/operators/cloud/vertex_ai.html           |  70 ++++++------
 .../7.0.0/operators/cloud/video_intelligence.html  |  18 +--
 .../7.0.0/operators/cloud/vision.html              | 124 ++++++++++-----------
 .../7.0.0/operators/cloud/workflows.html           |  22 ++--
 .../7.0.0/operators/firebase/firestore.html        |   2 +-
 .../operators/marketing_platform/analytics.html    |   6 +-
 .../marketing_platform/campaign_manager.html       |  14 +--
 .../marketing_platform/display_video.html          |  20 ++--
 .../operators/marketing_platform/search_ads.html   |   8 +-
 .../7.0.0/operators/suite/sheets.html              |   4 +-
 .../operators/transfer/azure_fileshare_to_gcs.html |   2 +-
 .../7.0.0/operators/transfer/calendar_to_gcs.html  |   2 +-
 .../operators/transfer/facebook_ads_to_gcs.html    |   2 +-
 .../7.0.0/operators/transfer/gcs_to_gdrive.html    |   6 +-
 .../7.0.0/operators/transfer/gcs_to_sftp.html      |   8 +-
 .../7.0.0/operators/transfer/gcs_to_sheets.html    |   2 +-
 .../7.0.0/operators/transfer/gdrive_to_gcs.html    |   2 +-
 .../7.0.0/operators/transfer/gdrive_to_local.html  |   2 +-
 .../7.0.0/operators/transfer/oracle_to_gcs.html    |   2 +-
 .../7.0.0/operators/transfer/presto_to_gcs.html    |  10 +-
 .../operators/transfer/salesforce_to_gcs.html      |   2 +-
 .../7.0.0/operators/transfer/sftp_to_gcs.html      |   8 +-
 .../7.0.0/operators/transfer/sheets_to_gcs.html    |   2 +-
 .../7.0.0/operators/transfer/sql_to_sheets.html    |   2 +-
 .../7.0.0/operators/transfer/trino_to_gcs.html     |  10 +-
 .../1.0.0/operators.html                           |  14 +--
 .../1.1.0/operators.html                           |  14 +--
 .../1.1.1/operators.html                           |  14 +--
 .../2.0.0/operators.html                           |  14 +--
 .../2.0.1/operators.html                           |  14 +--
 .../2.0.2/operators.html                           |  14 +--
 .../2.0.3/operators.html                           |  14 +--
 .../2.1.0/operators.html                           |  14 +--
 .../2.1.1/operators.html                           |  14 +--
 .../2.1.2/operators.html                           |  14 +--
 .../1.1.0/operators/index.html                     |   2 +-
 .../1.1.1/operators/index.html                     |   2 +-
 .../1.1.2/operators/index.html                     |   2 +-
 .../1.1.3/operators/index.html                     |   2 +-
 .../1.0.0/operators.html                           |   4 +-
 .../1.0.1/operators.html                           |   4 +-
 .../2.0.0/operators.html                           |   4 +-
 .../2.0.1/operators.html                           |   4 +-
 .../2.1.0/operators.html                           |   4 +-
 .../2.1.1/operators.html                           |   4 +-
 .../2.1.2/operators.html                           |   4 +-
 .../2.1.3/operators.html                           |   4 +-
 .../3.5.0/operators/adf_run_pipeline.html          |   4 +-
 .../3.5.0/operators/adls.html                      |   2 +-
 .../3.5.0/operators/azure_blob_to_gcs.html         |   2 +-
 .../3.5.0/operators/local_to_adls.html             |   2 +-
 .../3.5.0/operators/sftp_to_wasb.html              |   2 +-
 .../3.6.0/operators/adf_run_pipeline.html          |   4 +-
 .../3.6.0/operators/adls.html                      |   2 +-
 .../3.6.0/operators/azure_blob_to_gcs.html         |   2 +-
 .../3.6.0/operators/local_to_adls.html             |   2 +-
 .../3.6.0/operators/sftp_to_wasb.html              |   2 +-
 .../3.7.0/operators/adf_run_pipeline.html          |   4 +-
 .../3.7.0/operators/adls.html                      |   2 +-
 .../3.7.0/operators/azure_blob_to_gcs.html         |   2 +-
 .../3.7.0/operators/local_to_adls.html             |   2 +-
 .../3.7.0/operators/sftp_to_wasb.html              |   2 +-
 .../3.7.1/operators/adf_run_pipeline.html          |   4 +-
 .../3.7.1/operators/adls.html                      |   2 +-
 .../3.7.1/operators/azure_blob_to_gcs.html         |   2 +-
 .../3.7.1/operators/local_to_adls.html             |   2 +-
 .../3.7.1/operators/sftp_to_wasb.html              |   2 +-
 .../3.7.2/operators/adf_run_pipeline.html          |   4 +-
 .../3.7.2/operators/adls.html                      |   2 +-
 .../3.7.2/operators/azure_blob_to_gcs.html         |   2 +-
 .../3.7.2/operators/local_to_adls.html             |   2 +-
 .../3.7.2/operators/sftp_to_wasb.html              |   2 +-
 .../3.8.0/operators/adf_run_pipeline.html          |   4 +-
 .../3.8.0/operators/adls.html                      |   2 +-
 .../3.8.0/operators/azure_blob_to_gcs.html         |   2 +-
 .../3.8.0/operators/local_to_adls.html             |   2 +-
 .../3.8.0/operators/sftp_to_wasb.html              |   2 +-
 .../3.9.0/operators/adf_run_pipeline.html          |   4 +-
 .../3.9.0/operators/adls.html                      |   2 +-
 .../3.9.0/operators/azure_blob_to_gcs.html         |   2 +-
 .../3.9.0/operators/local_to_adls.html             |   2 +-
 .../3.9.0/operators/sftp_to_wasb.html              |   2 +-
 .../2.1.0/operators.html                           |  12 +-
 .../2.1.1/operators.html                           |  12 +-
 .../2.1.2/operators.html                           |  12 +-
 .../2.0.2/operators.html                           |   4 +-
 .../2.0.3/operators.html                           |   4 +-
 .../2.0.4/operators.html                           |   4 +-
 .../2.0.5/operators.html                           |   4 +-
 .../1.0.0/operators.html                           |   4 +-
 .../1.0.1/operators.html                           |   4 +-
 .../1.0.2/operators.html                           |   4 +-
 .../1.1.0/operators.html                           |   4 +-
 .../2.0.0/operators.html                           |   4 +-
 .../2.1.0/operators.html                           |   4 +-
 .../2.1.1/operators.html                           |   4 +-
 .../2.2.0/operators.html                           |   4 +-
 .../2.2.1/operators.html                           |   4 +-
 .../2.2.2/operators.html                           |   4 +-
 .../2.2.3/operators.html                           |   4 +-
 .../3.0.0/operators/opsgenie_alert.html            |   4 +-
 .../3.0.1/operators/opsgenie_alert.html            |   4 +-
 .../3.0.2/operators/opsgenie_alert.html            |   4 +-
 .../3.0.3/operators/opsgenie_alert.html            |   4 +-
 .../3.1.0/operators/opsgenie_alert.html            |   6 +-
 .../1.0.0/operators.html                           |   2 +-
 .../1.0.1/operators.html                           |   2 +-
 .../1.0.2/operators.html                           |   2 +-
 .../2.0.0/operators.html                           |   2 +-
 .../2.0.1/operators.html                           |   2 +-
 .../2.1.0/operators.html                           |   2 +-
 .../2.2.0/operators.html                           |   2 +-
 .../2.2.1/operators.html                           |   2 +-
 .../2.2.2/operators.html                           |   2 +-
 .../2.2.3/operators.html                           |   2 +-
 .../operators/postgres_operator_howto_guide.html   |   4 +-
 .../operators/postgres_operator_howto_guide.html   |   4 +-
 .../operators/postgres_operator_howto_guide.html   |   4 +-
 .../operators/postgres_operator_howto_guide.html   |   4 +-
 .../operators/postgres_operator_howto_guide.html   |   6 +-
 .../2.1.0/operators/transfer/gcs_to_presto.html    |   2 +-
 .../2.1.1/operators/transfer/gcs_to_presto.html    |   2 +-
 .../2.1.2/operators/transfer/gcs_to_presto.html    |   2 +-
 .../2.2.0/operators/transfer/gcs_to_presto.html    |   2 +-
 .../2.2.1/operators/transfer/gcs_to_presto.html    |   2 +-
 .../2.0.1/operators/qubole.html                    |  24 ++--
 .../2.1.0/operators/qubole.html                    |  24 ++--
 .../2.1.1/operators/qubole.html                    |  24 ++--
 .../2.1.2/operators/qubole.html                    |  24 ++--
 .../2.1.3/operators/qubole.html                    |  24 ++--
 .../3.4.0/operators/salesforce_apex_rest.html      |   2 +-
 .../3.4.1/operators/salesforce_apex_rest.html      |   2 +-
 .../3.4.2/operators/salesforce_apex_rest.html      |   2 +-
 .../3.4.3/operators/salesforce_apex_rest.html      |   2 +-
 .../3.4.4/operators/salesforce_apex_rest.html      |   2 +-
 .../operators/slack_operator_howto_guide.html      |   2 +-
 .../operators/slack_operator_howto_guide.html      |   2 +-
 .../operators/slack_operator_howto_guide.html      |   2 +-
 .../operators/slack_operator_howto_guide.html      |   2 +-
 .../operators/slack_operator_howto_guide.html      |   2 +-
 .../2.4.0/operators/s3_to_snowflake.html           |   2 +-
 .../2.4.0/operators/snowflake.html                 |   2 +-
 .../2.4.0/operators/snowflake_to_slack.html        |   2 +-
 .../2.5.0/operators/s3_to_snowflake.html           |   2 +-
 .../2.5.0/operators/snowflake.html                 |   2 +-
 .../2.5.0/operators/snowflake_to_slack.html        |   2 +-
 .../2.5.1/operators/s3_to_snowflake.html           |   2 +-
 .../2.5.1/operators/snowflake.html                 |   2 +-
 .../2.5.1/operators/snowflake_to_slack.html        |   2 +-
 .../2.5.2/operators/s3_to_snowflake.html           |   2 +-
 .../2.5.2/operators/snowflake.html                 |   2 +-
 .../2.5.2/operators/snowflake_to_slack.html        |   2 +-
 .../2.6.0/operators/s3_to_snowflake.html           |   2 +-
 .../2.6.0/operators/snowflake.html                 |   2 +-
 .../2.6.0/operators/snowflake_to_slack.html        |   2 +-
 .../2.7.0/operators/s3_to_snowflake.html           |   2 +-
 .../2.7.0/operators/snowflake.html                 |   2 +-
 .../2.7.0/operators/snowflake_to_slack.html        |   2 +-
 .../1.0.1/operators.html                           |   4 +-
 .../1.0.2/operators.html                           |   4 +-
 .../2.0.0/operators.html                           |   4 +-
 .../2.0.1/operators.html                           |   4 +-
 .../2.1.0/operators.html                           |   4 +-
 .../2.1.1/operators.html                           |   4 +-
 .../2.1.2/operators.html                           |   4 +-
 .../2.1.3/operators.html                           |   4 +-
 .../2.1.1/operators.html                           |   2 +-
 .../2.1.2/operators.html                           |   2 +-
 .../2.1.3/operators.html                           |   2 +-
 .../2.1.4/operators.html                           |   2 +-
 .../2.1.5/operators.html                           |   2 +-
 .../2.1.6/operators.html                           |   2 +-
 .../2.1.7/operators.html                           |   2 +-
 .../2.1.8/operators.html                           |   2 +-
 .../1.0.0/operators.html                           |   2 +-
 .../1.0.1/operators.html                           |   2 +-
 .../1.0.2/operators.html                           |   2 +-
 .../2.0.0/operators.html                           |   2 +-
 .../2.0.1/operators.html                           |   2 +-
 .../2.0.2/operators.html                           |   2 +-
 .../2.0.3/operators.html                           |   2 +-
 .../2.0.4/operators.html                           |   2 +-
 .../2.1.0/operators/transfer/gcs_to_trino.html     |   2 +-
 .../2.1.1/operators/transfer/gcs_to_trino.html     |   2 +-
 .../2.1.2/operators/transfer/gcs_to_trino.html     |   2 +-
 .../2.2.0/operators/transfer/gcs_to_trino.html     |   2 +-
 .../2.3.0/operators/transfer/gcs_to_trino.html     |   2 +-
 docs-archive/apache-airflow/1.10.10/tutorial.html  |  16 +--
 docs-archive/apache-airflow/1.10.11/concepts.html  |   6 +-
 docs-archive/apache-airflow/1.10.11/tutorial.html  |  16 +--
 docs-archive/apache-airflow/1.10.12/concepts.html  |   6 +-
 docs-archive/apache-airflow/1.10.12/tutorial.html  |  16 +--
 docs-archive/apache-airflow/1.10.13/concepts.html  |   6 +-
 docs-archive/apache-airflow/1.10.13/tutorial.html  |  16 +--
 docs-archive/apache-airflow/1.10.14/concepts.html  |   6 +-
 docs-archive/apache-airflow/1.10.14/tutorial.html  |  16 +--
 docs-archive/apache-airflow/1.10.15/concepts.html  |   6 +-
 docs-archive/apache-airflow/1.10.15/tutorial.html  |  16 +--
 docs-archive/apache-airflow/1.10.8/tutorial.html   |  18 +--
 docs-archive/apache-airflow/1.10.9/tutorial.html   |  18 +--
 .../2.0.0/apache-airflow/stable/concepts.html      |   8 +-
 .../2.0.0/apache-airflow/stable/tutorial.html      |  16 +--
 .../stable/tutorial_taskflow_api.html              |  20 ++--
 docs-archive/apache-airflow/2.0.0/concepts.html    |   8 +-
 docs-archive/apache-airflow/2.0.0/tutorial.html    |  16 +--
 .../2.0.0/tutorial_taskflow_api.html               |  20 ++--
 docs-archive/apache-airflow/2.0.1/concepts.html    |   8 +-
 docs-archive/apache-airflow/2.0.1/tutorial.html    |  16 +--
 .../2.0.1/tutorial_taskflow_api.html               |  20 ++--
 docs-archive/apache-airflow/2.0.2/concepts.html    |   8 +-
 docs-archive/apache-airflow/2.0.2/tutorial.html    |  16 +--
 .../2.0.2/tutorial_taskflow_api.html               |  20 ++--
 docs-archive/apache-airflow/2.1.0/tutorial.html    |  16 +--
 .../2.1.0/tutorial_taskflow_api.html               |  22 ++--
 docs-archive/apache-airflow/2.1.1/tutorial.html    |  16 +--
 .../2.1.1/tutorial_taskflow_api.html               |  22 ++--
 docs-archive/apache-airflow/2.1.2/tutorial.html    |  16 +--
 .../2.1.2/tutorial_taskflow_api.html               |  22 ++--
 docs-archive/apache-airflow/2.1.3/tutorial.html    |  16 +--
 .../2.1.3/tutorial_taskflow_api.html               |  22 ++--
 docs-archive/apache-airflow/2.1.4/tutorial.html    |  16 +--
 .../2.1.4/tutorial_taskflow_api.html               |  22 ++--
 docs-archive/apache-airflow/2.2.0/tutorial.html    |  16 +--
 .../2.2.0/tutorial_taskflow_api.html               |  24 ++--
 docs-archive/apache-airflow/2.2.1/tutorial.html    |  16 +--
 .../2.2.1/tutorial_taskflow_api.html               |  24 ++--
 docs-archive/apache-airflow/2.2.2/tutorial.html    |  16 +--
 .../2.2.2/tutorial_taskflow_api.html               |  24 ++--
 docs-archive/apache-airflow/2.2.3/tutorial.html    |  16 +--
 .../2.2.3/tutorial_taskflow_api.html               |  24 ++--
 .../apache-airflow/2.2.4/concepts/dags.html        |   8 +-
 .../apache-airflow/2.2.4/concepts/tasks.html       |   2 +-
 .../apache-airflow/2.2.4/executor/kubernetes.html  |   4 +-
 .../2.2.4/howto/create-custom-decorator.html       |   4 +-
 .../apache-airflow/2.2.4/howto/operator/bash.html  |   6 +-
 .../2.2.4/howto/operator/datetime.html             |   4 +-
 .../2.2.4/howto/operator/external_task_sensor.html |   4 +-
 .../2.2.4/howto/operator/python.html               |   6 +-
 .../2.2.4/howto/operator/weekday.html              |   2 +-
 .../apache-airflow/2.2.4/howto/timetable.html      |   6 +-
 .../2.2.4/security/access-control.html             |   6 +-
 docs-archive/apache-airflow/2.2.4/tutorial.html    |  16 +--
 .../2.2.4/tutorial_taskflow_api.html               |  24 ++--
 .../apache-airflow/2.2.5/concepts/dags.html        |   8 +-
 .../apache-airflow/2.2.5/concepts/tasks.html       |   2 +-
 .../apache-airflow/2.2.5/executor/kubernetes.html  |   4 +-
 .../2.2.5/howto/create-custom-decorator.html       |   4 +-
 .../apache-airflow/2.2.5/howto/operator/bash.html  |   6 +-
 .../2.2.5/howto/operator/datetime.html             |   4 +-
 .../2.2.5/howto/operator/external_task_sensor.html |   4 +-
 .../2.2.5/howto/operator/python.html               |   6 +-
 .../2.2.5/howto/operator/weekday.html              |   2 +-
 .../apache-airflow/2.2.5/howto/timetable.html      |   6 +-
 .../2.2.5/security/access-control.html             |   6 +-
 docs-archive/apache-airflow/2.2.5/tutorial.html    |  16 +--
 .../2.2.5/tutorial_taskflow_api.html               |  24 ++--
 .../apache-airflow/2.3.0/concepts/dags.html        |   8 +-
 .../apache-airflow/2.3.0/concepts/tasks.html       |   2 +-
 .../apache-airflow/2.3.0/executor/kubernetes.html  |   4 +-
 .../apache-airflow/2.3.0/howto/operator/bash.html  |   6 +-
 .../2.3.0/howto/operator/datetime.html             |   4 +-
 .../2.3.0/howto/operator/external_task_sensor.html |   4 +-
 .../2.3.0/howto/operator/python.html               |  10 +-
 .../2.3.0/howto/operator/weekday.html              |   2 +-
 .../apache-airflow/2.3.0/howto/timetable.html      |   6 +-
 .../2.3.0/security/access-control.html             |   6 +-
 docs-archive/apache-airflow/2.3.0/tutorial.html    |  16 +--
 .../2.3.0/tutorial_taskflow_api.html               |  24 ++--
 .../apache-airflow/2.3.1/concepts/dags.html        |   8 +-
 .../apache-airflow/2.3.1/concepts/tasks.html       |   2 +-
 .../apache-airflow/2.3.1/executor/kubernetes.html  |   4 +-
 .../apache-airflow/2.3.1/howto/operator/bash.html  |   6 +-
 .../2.3.1/howto/operator/datetime.html             |   4 +-
 .../2.3.1/howto/operator/external_task_sensor.html |   4 +-
 .../2.3.1/howto/operator/python.html               |  10 +-
 .../2.3.1/howto/operator/weekday.html              |   2 +-
 .../apache-airflow/2.3.1/howto/timetable.html      |   6 +-
 .../2.3.1/security/access-control.html             |   6 +-
 docs-archive/apache-airflow/2.3.1/tutorial.html    |  16 +--
 .../2.3.1/tutorial_taskflow_api.html               |  24 ++--
 .../apache-airflow/2.3.2/concepts/dags.html        |   8 +-
 .../apache-airflow/2.3.2/concepts/tasks.html       |   2 +-
 .../apache-airflow/2.3.2/executor/kubernetes.html  |   4 +-
 .../apache-airflow/2.3.2/howto/operator/bash.html  |   6 +-
 .../2.3.2/howto/operator/datetime.html             |   4 +-
 .../2.3.2/howto/operator/external_task_sensor.html |   4 +-
 .../2.3.2/howto/operator/python.html               |  10 +-
 .../2.3.2/howto/operator/weekday.html              |   2 +-
 .../apache-airflow/2.3.2/howto/timetable.html      |   6 +-
 .../2.3.2/security/access-control.html             |   6 +-
 docs-archive/apache-airflow/2.3.2/tutorial.html    |  16 +--
 .../2.3.2/tutorial_taskflow_api.html               |  24 ++--
 995 files changed, 5333 insertions(+), 5333 deletions(-)

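The hunks below show the shape of the fix: relative `../_modules/....html` viewcode links in archived provider docs are rewritten into permanent GitHub links pinned to the matching provider tag. A minimal sketch of such a rewrite is below (the regex, helper name, and tag handling are illustrative assumptions, not the actual scripts from apache/airflow PR #24389):

```python
import re

# Base URL for version-pinned source links on GitHub.
GITHUB_TREE = "https://github.com/apache/airflow/tree"

def rewrite_source_link(html: str, tag: str) -> str:
    """Replace relative "_modules" viewcode hrefs with versioned GitHub URLs.

    `tag` is the git tag matching the docs version being fixed,
    e.g. "providers-airbyte/2.1.1" (assumed to be derivable from the
    docs-archive path of the file being processed).
    """
    # Match hrefs like href="../_modules/airflow/providers/.../foo.html"
    pattern = re.compile(r'href="(?:\.\./)+_modules/(?P<mod>[\w/]+)\.html"')

    def to_github(match: re.Match) -> str:
        # Point at the .py source on GitHub at the pinned tag,
        # opening in a new tab as in the committed HTML.
        return (f'href="{GITHUB_TREE}/{tag}/{match.group("mod")}.py"'
                f' target="_blank"')

    return pattern.sub(to_github, html)

old = ('<a href="../_modules/airflow/providers/airbyte/example_dags/'
       'example_airbyte_trigger_job.html">[source]</a>')
new = rewrite_source_link(old, "providers-airbyte/2.1.1")
```

Run over every archived HTML file, a rewrite along these lines produces exactly the kind of two-line hunks (one `-`, one `+` per link) that dominate the 995-file diffstat above.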
diff --git a/docs-archive/apache-airflow-providers-airbyte/2.1.1/operators/airbyte.html b/docs-archive/apache-airflow-providers-airbyte/2.1.1/operators/airbyte.html
index 22d58bccbd..4b6101e8d6 100644
--- a/docs-archive/apache-airflow-providers-airbyte/2.1.1/operators/airbyte.html
+++ b/docs-archive/apache-airflow-providers-airbyte/2.1.1/operators/airbyte.html
@@ -609,7 +609,7 @@ of the job. Another way is use the flag <code class="docutils literal notranslat
 return the <code class="docutils literal notranslate"><span class="pre">job_id</span></code> that should be pass to the AirbyteSensor.</p>
 <p>An example using the synchronous way:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-airbyte/2.1.1/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">sync_source_destination</span> <span class="o">=</span> <span class="n">AirbyteTriggerSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;airbyte_sync_source_dest_example&#39;</span><span class="p">,</span>
         <span class="n">connection_id</span><span class="o">=</span><span class="s1">&#39;15bc3800-82e4-48c3-a32d-620661273f28&#39;</span><span class="p">,</span>
@@ -619,7 +619,7 @@ return the <code class="docutils literal notranslate"><span class="pre">job_id</
 </div>
 <p>An example using the async way:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-airbyte/2.1.1/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">async_source_destination</span> <span class="o">=</span> <span class="n">AirbyteTriggerSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;airbyte_async_source_dest_example&#39;</span><span class="p">,</span>
         <span class="n">connection_id</span><span class="o">=</span><span class="s1">&#39;15bc3800-82e4-48c3-a32d-620661273f28&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-airbyte/2.1.2/operators/airbyte.html b/docs-archive/apache-airflow-providers-airbyte/2.1.2/operators/airbyte.html
index 43522ac318..7e58c798fc 100644
--- a/docs-archive/apache-airflow-providers-airbyte/2.1.2/operators/airbyte.html
+++ b/docs-archive/apache-airflow-providers-airbyte/2.1.2/operators/airbyte.html
@@ -609,7 +609,7 @@ of the job. Another way is use the flag <code class="docutils literal notranslat
 return the <code class="docutils literal notranslate"><span class="pre">job_id</span></code> that should be pass to the AirbyteSensor.</p>
 <p>An example using the synchronous way:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-airbyte/2.1.2/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">sync_source_destination</span> <span class="o">=</span> <span class="n">AirbyteTriggerSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;airbyte_sync_source_dest_example&#39;</span><span class="p">,</span>
         <span class="n">connection_id</span><span class="o">=</span><span class="s1">&#39;15bc3800-82e4-48c3-a32d-620661273f28&#39;</span><span class="p">,</span>
@@ -619,7 +619,7 @@ return the <code class="docutils literal notranslate"><span class="pre">job_id</
 </div>
 <p>An example using the async way:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-airbyte/2.1.2/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">async_source_destination</span> <span class="o">=</span> <span class="n">AirbyteTriggerSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;airbyte_async_source_dest_example&#39;</span><span class="p">,</span>
         <span class="n">connection_id</span><span class="o">=</span><span class="s1">&#39;15bc3800-82e4-48c3-a32d-620661273f28&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-airbyte/2.1.3/operators/airbyte.html b/docs-archive/apache-airflow-providers-airbyte/2.1.3/operators/airbyte.html
index f244ff1f86..8e05f7da12 100644
--- a/docs-archive/apache-airflow-providers-airbyte/2.1.3/operators/airbyte.html
+++ b/docs-archive/apache-airflow-providers-airbyte/2.1.3/operators/airbyte.html
@@ -609,7 +609,7 @@ of the job. Another way is use the flag <code class="docutils literal notranslat
 return the <code class="docutils literal notranslate"><span class="pre">job_id</span></code> that should be pass to the AirbyteSensor.</p>
 <p>An example using the synchronous way:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-airbyte/2.1.3/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">sync_source_destination</span> <span class="o">=</span> <span class="n">AirbyteTriggerSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;airbyte_sync_source_dest_example&#39;</span><span class="p">,</span>
         <span class="n">connection_id</span><span class="o">=</span><span class="s1">&#39;15bc3800-82e4-48c3-a32d-620661273f28&#39;</span><span class="p">,</span>
@@ -619,7 +619,7 @@ return the <code class="docutils literal notranslate"><span class="pre">job_id</
 </div>
 <p>An example using the async way:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-airbyte/2.1.3/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">async_source_destination</span> <span class="o">=</span> <span class="n">AirbyteTriggerSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;airbyte_async_source_dest_example&#39;</span><span class="p">,</span>
         <span class="n">connection_id</span><span class="o">=</span><span class="s1">&#39;15bc3800-82e4-48c3-a32d-620661273f28&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-airbyte/2.1.4/operators/airbyte.html b/docs-archive/apache-airflow-providers-airbyte/2.1.4/operators/airbyte.html
index 144c67c634..c281542339 100644
--- a/docs-archive/apache-airflow-providers-airbyte/2.1.4/operators/airbyte.html
+++ b/docs-archive/apache-airflow-providers-airbyte/2.1.4/operators/airbyte.html
@@ -611,7 +611,7 @@ of the job. Another way is use the flag <code class="docutils literal notranslat
 return the <code class="docutils literal notranslate"><span class="pre">job_id</span></code> that should be pass to the AirbyteSensor.</p>
 <p>An example using the synchronous way:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-airbyte/2.1.4/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">sync_source_destination</span> <span class="o">=</span> <span class="n">AirbyteTriggerSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;airbyte_sync_source_dest_example&#39;</span><span class="p">,</span>
         <span class="n">connection_id</span><span class="o">=</span><span class="s1">&#39;15bc3800-82e4-48c3-a32d-620661273f28&#39;</span><span class="p">,</span>
@@ -621,7 +621,7 @@ return the <code class="docutils literal notranslate"><span class="pre">job_id</
 </div>
 <p>An example using the async way:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-airbyte/2.1.4/airflow/providers/airbyte/example_dags/example_airbyte_trigger_job.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">async_source_destination</span> <span class="o">=</span> <span class="n">AirbyteTriggerSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;airbyte_async_source_dest_example&#39;</span><span class="p">,</span>
         <span class="n">connection_id</span><span class="o">=</span><span class="s1">&#39;15bc3800-82e4-48c3-a32d-620661273f28&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-alibaba/1.0.0/operators/oss.html b/docs-archive/apache-airflow-providers-alibaba/1.0.0/operators/oss.html
index 9264bb3ca7..94aaf9e6a3 100644
--- a/docs-archive/apache-airflow-providers-alibaba/1.0.0/operators/oss.html
+++ b/docs-archive/apache-airflow-providers-alibaba/1.0.0/operators/oss.html
@@ -616,7 +616,7 @@ new OSS bucket with a given bucket name then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new bucket and then delete the bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-alibaba/1.0.0/airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;oss_bucket_dag&#39;</span><span class="p">,</span>
     <span class="n">start_date</span><span class="o">=</span><span class="n">datetime</span><span class="p">(</span><span class="mi">2021</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">),</span>
diff --git a/docs-archive/apache-airflow-providers-alibaba/1.0.1/operators/oss.html b/docs-archive/apache-airflow-providers-alibaba/1.0.1/operators/oss.html
index 31fd200cfa..0101d57cd9 100644
--- a/docs-archive/apache-airflow-providers-alibaba/1.0.1/operators/oss.html
+++ b/docs-archive/apache-airflow-providers-alibaba/1.0.1/operators/oss.html
@@ -616,7 +616,7 @@ new OSS bucket with a given bucket name then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new bucket and then delete the bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-alibaba/1.0.1/airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;oss_bucket_dag&#39;</span><span class="p">,</span>
     <span class="n">start_date</span><span class="o">=</span><span class="n">datetime</span><span class="p">(</span><span class="mi">2021</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">),</span>
diff --git a/docs-archive/apache-airflow-providers-alibaba/1.1.0/operators/oss.html b/docs-archive/apache-airflow-providers-alibaba/1.1.0/operators/oss.html
index 7f8dc843aa..53c4c8c42a 100644
--- a/docs-archive/apache-airflow-providers-alibaba/1.1.0/operators/oss.html
+++ b/docs-archive/apache-airflow-providers-alibaba/1.1.0/operators/oss.html
@@ -618,7 +618,7 @@ new OSS bucket with a given bucket name then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new bucket and then delete the bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-alibaba/1.1.0/airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;oss_bucket_dag&#39;</span><span class="p">,</span>
     <span class="n">start_date</span><span class="o">=</span><span class="n">datetime</span><span class="p">(</span><span class="mi">2021</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">),</span>
diff --git a/docs-archive/apache-airflow-providers-alibaba/1.1.1/operators/oss.html b/docs-archive/apache-airflow-providers-alibaba/1.1.1/operators/oss.html
index 03c57e007f..6be5b17efd 100644
--- a/docs-archive/apache-airflow-providers-alibaba/1.1.1/operators/oss.html
+++ b/docs-archive/apache-airflow-providers-alibaba/1.1.1/operators/oss.html
@@ -620,7 +620,7 @@ new OSS bucket with a given bucket name then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new bucket and then delete the bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-alibaba/1.1.1/airflow/providers/alibaba/cloud/example_dags/example_oss_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;oss_bucket_dag&#39;</span><span class="p">,</span>
     <span class="n">start_date</span><span class="o">=</span><span class="n">datetime</span><span class="p">(</span><span class="mi">2021</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">),</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/athena.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/athena.html
index a274bc1d8e..06215e88d7 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/athena.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/athena.html
@@ -630,7 +630,7 @@ created in an S3 bucket and populated with SAMPLE_DATA.  The example waits for t
 to complete and then drops the created table and deletes the sample CSV file in the S3
 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_athena.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_athena.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
     <span class="c1"># Using a task-decorated function to create a CSV file in S3</span>
     <span class="n">add_sample_data_to_s3</span> <span class="o">=</span> <span class="n">add_sample_data_to_s3</span><span class="p">()</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/datasync.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/datasync.html
index b327cf21a9..dafe587f74 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/datasync.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/datasync.html
@@ -622,13 +622,13 @@ and an <em>AWS DataSync Task</em> (identified by a TaskArn on AWS).</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">TASK_ARN</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;TASK_ARN&quot;</span><span class="p">,</span> <span class="s2">&quot;my_aws_datasync_task_arn&quot;</span><span class="p">)</span>
 </pre></div>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SOURCE_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;SOURCE_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;smb://hostname/directory/&quot;</span><span class="p">)</span>
 
 <span class="n">DESTINATION_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;DESTINATION_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://mybucket/prefix&quot;</span><span class="p">)</span>
@@ -641,7 +641,7 @@ and an <em>AWS DataSync Task</em> (identified by a TaskArn on AWS).</p>
 <p>The <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/datasync/index.html#airflow.providers.amazon.aws.operators.datasync.DataSyncOperator" title="airflow.providers.amazon.aws.operators.datasync.DataSyncOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DataSyncOperator</span></code></a> can execute a specific
 TaskArn by specifying the <code class="docutils literal notranslate"><span class="pre">task_arn</span></code> parameter. This is useful when you know the TaskArn you want to execute.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task_1</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task_1&quot;</span><span class="p">,</span> <span class="n">task_arn</span><span class="o">=</span><span class="n">TASK_ARN</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -657,7 +657,7 @@ can iterate all DataSync Tasks for their source and destination LocationArns. Th
 each LocationArn to see if its the URIs match the desired source / destination URI.</p>
 <p>To perform a search based on the Location URIs, define the task as follows</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task_2</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task_2&quot;</span><span class="p">,</span>
         <span class="n">source_location_uri</span><span class="o">=</span><span class="n">SOURCE_LOCATION_URI</span><span class="p">,</span>
@@ -685,7 +685,7 @@ Finally, delete it.</p>
 <h3>Environment variables<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_datasync_2.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SOURCE_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;SOURCE_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;smb://hostname/directory/&quot;</span><span class="p">)</span>
 
 <span class="n">DESTINATION_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;DESTINATION_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://mybucket/prefix&quot;</span><span class="p">)</span>
@@ -723,7 +723,7 @@ as before but with some extra arguments.</p>
 and/or Locations if no suitable existing Task was found. If these are left to their default value (None)
 then no create will be attempted.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_datasync_2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task&quot;</span><span class="p">,</span>
         <span class="n">source_location_uri</span><span class="o">=</span><span class="n">SOURCE_LOCATION_URI</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/dms.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/dms.html
index cf8513af4b..e784c01711 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/dms.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/dms.html
@@ -646,7 +646,7 @@ to be completed, and then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new replication task, start it, wait for it to be completed and then delete it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_task</span> <span class="o">=</span> <span class="n">DmsCreateTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_task&#39;</span><span class="p">,</span>
         <span class="n">replication_task_id</span><span class="o">=</span><span class="n">REPLICATION_TASK_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/ecs.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/ecs.html
index 4bfd960ed7..54b2f9ad0f 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/ecs.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/ecs.html
@@ -622,7 +622,7 @@ the task &quot;hello_world&quot; runs <code class="docutils literal notranslate"
 It overrides the command in the <code class="docutils literal notranslate"><span class="pre">hello-world-container</span></code> container.</p>
 <p>Before using ECSOperator, <em>cluster</em> and <em>task definition</em> need to be created.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">hello_world</span> <span class="o">=</span> <span class="n">ECSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;hello_world&quot;</span><span class="p">,</span>
     <span class="n">dag</span><span class="o">=</span><span class="n">dag</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/eks.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/eks.html
index edad134156..8bc51df3bc 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/eks.html
@@ -614,7 +614,7 @@ and management of containerized applications.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Create an Amazon EKS Cluster control plane without attaching a compute service.</span>
     <span class="n">create_cluster</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster&#39;</span><span class="p">,</span>
@@ -631,7 +631,7 @@ and management of containerized applications.</p>
 <p>To delete an existing Amazon EKS Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteClusterOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_cluster</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_cluster&#39;</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -642,7 +642,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># An Amazon EKS cluster can not be deleted with attached resources such as nodegroups or Fargate profiles.</span>
     <span class="c1"># Setting the `force` to `True` will delete any attached resources before deleting the cluster.</span>
     <span class="n">delete_all</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_nodegroup_and_cluster&#39;</span><span class="p">,</span> <span class="n">force_delete_compute</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
@@ -664,7 +664,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateNodegroupOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_nodegroup&#39;</span><span class="p">,</span>
         <span class="n">nodegroup_name</span><span class="o">=</span><span class="n">NODEGROUP_NAME</span><span class="p">,</span>
@@ -680,7 +680,7 @@ attempt to delete any attached resources first.</p>
 <p>To delete an existing Amazon EKS Managed Nodegroup you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteNodegroupOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_nodegroup</span> <span class="o">=</span> <span class="n">EksDeleteNodegroupOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_nodegroup&#39;</span><span class="p">,</span> <span class="n">nodegroup_name</span><span class="o">=</span><span class="n">NODEGROUP_NAME</span>
     <span class="p">)</span>
@@ -702,7 +702,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Create an Amazon EKS cluster control plane and an EKS nodegroup compute platform in one step.</span>
     <span class="n">create_cluster_and_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_nodegroup&#39;</span><span class="p">,</span>
@@ -732,7 +732,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Create an Amazon EKS cluster control plane and an AWS Fargate compute platform in one step.</span>
     <span class="n">create_cluster_and_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_fargate_profile&#39;</span><span class="p">,</span>
@@ -761,7 +761,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateFargateProfileOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_fargate_profile&#39;</span><span class="p">,</span>
         <span class="n">pod_execution_role_arn</span><span class="o">=</span><span class="n">ROLE_ARN</span><span class="p">,</span>
@@ -777,7 +777,7 @@ attempt to delete any attached resources first.</p>
 <p>To delete an existing AWS Fargate Profile you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteFargateProfileOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_fargate_profile</span> <span class="o">=</span> <span class="n">EksDeleteFargateProfileOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_fargate_profile&#39;</span><span class="p">,</span>
         <span class="n">fargate_profile_name</span><span class="o">=</span><span class="n">FARGATE_PROFILE_NAME</span><span class="p">,</span>
@@ -793,7 +793,7 @@ attempt to delete any attached resources first.</p>
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksPodOperator" title="airflow.providers.amazon.aws.operators.eks.EksPodOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksPodOperator</span></code></a>.</p>
 <p>Note: An Amazon EKS Cluster with underlying compute infrastructure is required.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">start_pod</span> <span class="o">=</span> <span class="n">EksPodOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;run_pod&quot;</span><span class="p">,</span>
         <span class="n">pod_name</span><span class="o">=</span><span class="s2">&quot;run_pod&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/emr.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/emr.html
index 67749cf7ea..96e02a090d 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/emr.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/emr.html
@@ -651,7 +651,7 @@ with <code class="docutils literal notranslate"><span class="pre">EmrJobFlowSens
 <h3>JobFlow configuration<a class="headerlink" href="#jobflow-configuration" title="Permalink to this headline">¶</a></h3>
 <p>To create a job flow at EMR, you need to specify the configuration for the EMR cluster:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SPARK_STEPS</span> <span class="o">=</span> <span class="p">[</span>
     <span class="p">{</span>
         <span class="s1">&#39;Name&#39;</span><span class="p">:</span> <span class="s1">&#39;calculate_pi&#39;</span><span class="p">,</span>
@@ -699,7 +699,7 @@ The config <code class="docutils literal notranslate"><span class="pre">'KeepJob
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are creating a new job flow, add a step, monitor the step, and then terminate the cluster.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">job_flow_creator</span> <span class="o">=</span> <span class="n">EmrCreateJobFlowOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_job_flow&#39;</span><span class="p">,</span>
         <span class="n">job_flow_overrides</span><span class="o">=</span><span class="n">JOB_FLOW_OVERRIDES</span><span class="p">,</span>
@@ -728,7 +728,7 @@ Also, we would not specify <code class="docutils literal notranslate"><span clas
 <h3>Defining tasks<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <p>Here is the task definitions for our DAG.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">cluster_creator</span> <span class="o">=</span> <span class="n">EmrCreateJobFlowOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_job_flow&#39;</span><span class="p">,</span>
         <span class="n">job_flow_overrides</span><span class="o">=</span><span class="n">JOB_FLOW_OVERRIDES</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/emr_eks.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/emr_eks.html
index a5523ffc80..9ae5857f27 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/emr_eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/emr_eks.html
@@ -627,7 +627,7 @@
 <p>You can also optionally provide configuration overrides such as Spark, Hive, or Log4j properties as well as monitoring configuration that sends Spark logs to S3 or Cloudwatch.</p>
 <p>In the example, we show how to add an <code class="docutils literal notranslate"><span class="pre">applicationConfiguration</span></code> to use the AWS Glue data catalog and <code class="docutils literal notranslate"><span class="pre">monitoringConfiguration</span></code> to send logs to the <code class="docutils literal notranslate"><span class="pre">/aws/emr-eks-spark</span></code> log group in CloudWatch. Refer to the <a class="reference external" href="https://docs.aws.amazon.com [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">JOB_DRIVER_ARG</span> <span class="o">=</span> <span class="p">{</span>
     <span class="s2">&quot;sparkSubmitJobDriver&quot;</span><span class="p">:</span> <span class="p">{</span>
         <span class="s2">&quot;entryPoint&quot;</span><span class="p">:</span> <span class="s2">&quot;local:///usr/lib/spark/examples/src/main/python/pi.py&quot;</span><span class="p">,</span>
@@ -656,7 +656,7 @@
 </div>
 <p>We pass the <code class="docutils literal notranslate"><span class="pre">virtual_cluster_id</span></code> and <code class="docutils literal notranslate"><span class="pre">execution_role_arn</span></code> values as operator parameters, but you can store them in a connection or provide them in the DAG. Your AWS region should be defined either in the <code class="docutils literal notranslate"><span class="pre">aws_default</span></code> connection as <code class="docutils literal notransl [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_starter</span> <span class="o">=</span> <span class="n">EmrContainerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_job&quot;</span><span class="p">,</span>
     <span class="n">virtual_cluster_id</span><span class="o">=</span><span class="n">VIRTUAL_CLUSTER_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/glacier.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/glacier.html
index c909410d40..430dd307e3 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/glacier.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/glacier.html
@@ -604,7 +604,7 @@ The operation returns dictionary of information related to the initiated job lik
 <code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierCreateJobOperator</span></code></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_glacier_job</span> <span class="o">=</span> <span class="n">GlacierCreateJobOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_glacier_job&quot;</span><span class="p">,</span> <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">)</span>
 <span class="n">JOB_ID</span> <span class="o">=</span> <span class="s1">&#39;{{ task_instance.xcom_pull(&quot;create_glacier_job&quot;)[&quot;jobId&quot;] }}&#39;</span>
 </pre></div>
@@ -629,7 +629,7 @@ Which means that every next request will be sent every 20 minutes.</p>
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glacier/index.html#airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor" title="airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierJobOperationSensor</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/google_api_to_s3_transfer.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/google_api_to_s3_transfer.html
index 42725e1ff8..8e3a67579c 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/google_api_to_s3_transfer.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/google_api_to_s3_transfer.html
@@ -617,7 +617,7 @@ in action.</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">GOOGLE_SHEET_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;GOOGLE_SHEET_ID&quot;</span><span class="p">)</span>
 <span class="n">GOOGLE_SHEET_RANGE</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;GOOGLE_SHEET_RANGE&quot;</span><span class="p">)</span>
 <span class="n">S3_DESTINATION_KEY</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;S3_DESTINATION_KEY&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://bucket/key.json&quot;</span><span class="p">)</span>
@@ -630,7 +630,7 @@ in action.</p>
 <h3>Get Google Sheets Sheet Values<a class="headerlink" href="#get-google-sheets-sheet-values" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are requesting a Google Sheet via the <code class="docutils literal notranslate"><span class="pre">sheets.spreadsheets.values.get</span></code> endpoint.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_google_sheets_values_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;sheets&#39;</span><span class="p">,</span>
         <span class="n">google_api_service_version</span><span class="o">=</span><span class="s1">&#39;v4&#39;</span><span class="p">,</span>
@@ -657,7 +657,7 @@ tasks to retrieve specific information about YouTube videos.</p>
 <h3>Environment variables<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">YOUTUBE_CONN_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_CONN_ID&quot;</span><span class="p">,</span> <span class="s2">&quot;google_cloud_default&quot;</span><span class="p">)</span>
 <span class="n">YOUTUBE_CHANNEL_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_CHANNEL_ID&quot;</span><span class="p">,</span> <span class="s2">&quot;UCSXwxpWZQ7XZ1WL3wqevChA&quot;</span><span class="p">)</span>  <span class="c1"># &quot;Apache Airflow&quot;</span>
 <span class="n">YOUTUBE_VIDEO_PUBLISHED_AFTER</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_VIDEO_PUBLISHED_AFTER&quot;</span><span class="p">,</span> <span class="s2">&quot;2019-09-25T00:00:00Z&quot;</span><span class="p">)</span>
@@ -676,7 +676,7 @@ tasks to retrieve specific information about YouTube videos.</p>
 (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_AFTER</span></code>, <code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_BEFORE</span></code>) on a YouTube channel (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_CHANNEL_ID</span></code>)
 saves the response in S3 and also pushes the data to xcom.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_video_ids_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">gcp_conn_id</span><span class="o">=</span><span class="n">YOUTUBE_CONN_ID</span><span class="p">,</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
@@ -701,7 +701,7 @@ saves the response in S3 and also pushes the data to xcom.</p>
 <p>From there a <code class="docutils literal notranslate"><span class="pre">BranchPythonOperator</span></code> will extract the xcom data, bring the IDs into the format the next
 request needs, and decide whether we need to request any videos at all.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">_check_and_transform_video_ids</span><span class="p">(</span><span class="n">task_output</span><span class="p">,</span> <span class="n">task_instance</span><span class="p">):</span>
     <span class="n">video_ids_response</span> <span class="o">=</span> <span class="n">task_output</span>
     <span class="n">video_ids</span> <span class="o">=</span> <span class="p">[</span><span class="n">item</span><span class="p">[</span><span class="s1">&#39;id&#39;</span><span class="p">][</span><span class="s1">&#39;videoId&#39;</span><span class="p">]</span> <span class="k">for</span> <span class="n">item</span> <span class="ow">in</span> <span class="n">video_ids_response</span><span class="p">[</span><span class="s1">&#39;items&#39;</span><span class="p">]]</span>
@@ -716,7 +716,7 @@ request needs it + it also decides whether we need to request any videos or not.
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_check_and_transform_video_ids</span> <span class="o">=</span> <span class="n">BranchPythonOperator</span><span class="p">(</span>
         <span class="n">python_callable</span><span class="o">=</span><span class="n">_check_and_transform_video_ids</span><span class="p">,</span>
         <span class="n">op_args</span><span class="o">=</span><span class="p">[</span><span class="n">task_video_ids_to_s3</span><span class="o">.</span><span class="n">output</span><span class="p">[</span><span class="n">task_video_ids_to_s3</span><span class="o">.</span><span class="n">google_api_response_via_xcom</span><span class="p">]],</span>
@@ -728,7 +728,7 @@ request needs it + it also decides whether we need to request any videos or not.
 <p>If there are YouTube video IDs available, it passes them on to the next request, which then gets the
 information (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_FIELDS</span></code>) for the requested videos and saves them in S3 (<code class="docutils literal notranslate"><span class="pre">S3_DESTINATION_KEY</span></code>).</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_video_data_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">gcp_conn_id</span><span class="o">=</span><span class="n">YOUTUBE_CONN_ID</span><span class="p">,</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
@@ -748,7 +748,7 @@ information (<code class="docutils literal notranslate"><span class="pre">YOUTUB
 </div>
 <p>If not, do nothing and track it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_no_video_ids</span> <span class="o">=</span> <span class="n">DummyOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;no_video_ids&#39;</span><span class="p">)</span>
 </pre></div>
 </div>
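Every rewritten href in these hunks is pinned to a tag such as providers-amazon/2.6.0, which can be read straight off the docs-archive path being edited. A minimal sketch of that derivation (a hypothetical helper, assuming the docs-archive/&lt;package&gt;/&lt;version&gt;/ layout seen in the file names above):

```python
from pathlib import PurePosixPath

# Hypothetical helper: infer the pinned GitHub tag from a docs-archive path,
# e.g. docs-archive/apache-airflow-providers-amazon/2.6.0/operators/s3.html
#      -> providers-amazon/2.6.0
def tag_for(doc_path: str) -> str:
    parts = PurePosixPath(doc_path).parts
    package, version = parts[1], parts[2]  # package dir, then version dir
    provider = package.removeprefix("apache-airflow-providers-")
    return f"providers-{provider}/{version}"
```

This is why the airbyte 2.1.x and oss 1.x pages listed in the commit summary each get different tags: the tag follows the archive directory, not a single global release.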
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/imap_attachment_to_s3.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/imap_attachment_to_s3.html
index 36003e6293..040a536c65 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/imap_attachment_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/imap_attachment_to_s3.html
@@ -613,7 +613,7 @@ protocol from a mail server to S3 Bucket.</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">IMAP_ATTACHMENT_NAME</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;IMAP_ATTACHMENT_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;test.txt&quot;</span><span class="p">)</span>
 <span class="n">IMAP_MAIL_FOLDER</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;IMAP_MAIL_FOLDER&quot;</span><span class="p">,</span> <span class="s2">&quot;INBOX&quot;</span><span class="p">)</span>
 <span class="n">IMAP_MAIL_FILTER</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;IMAP_MAIL_FILTER&quot;</span><span class="p">,</span> <span class="s2">&quot;All&quot;</span><span class="p">)</span>
@@ -625,7 +625,7 @@ protocol from a mail server to S3 Bucket.</p>
 <div class="section" id="transfer-mail-attachments-via-imap-to-s3">
 <h3>Transfer Mail Attachments via IMAP to S3<a class="headerlink" href="#transfer-mail-attachments-via-imap-to-s3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_transfer_imap_attachment_to_s3</span> <span class="o">=</span> <span class="n">ImapAttachmentToS3Operator</span><span class="p">(</span>
         <span class="n">imap_attachment_name</span><span class="o">=</span><span class="n">IMAP_ATTACHMENT_NAME</span><span class="p">,</span>
         <span class="n">s3_key</span><span class="o">=</span><span class="n">S3_DESTINATION_KEY</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/redshift_sql.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/redshift_sql.html
index aab9a0191c..e4da0e60dc 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/redshift_sql.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/redshift_sql.html
@@ -620,7 +620,7 @@ to execute statements against an Amazon Redshift cluster.</p>
 <h3>Create a table<a class="headerlink" href="#create-a-table" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are creating a table called &quot;fruit&quot;.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_redshift.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">setup__task_create_table</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;setup__create_table&#39;</span><span class="p">,</span>
         <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;&quot;&quot;</span>
@@ -639,7 +639,7 @@ to execute statements against an Amazon Redshift cluster.</p>
 <h3>Insert data into a table<a class="headerlink" href="#insert-data-into-a-table" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we insert a few sample rows into the &quot;fruit&quot; table.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_redshift.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_insert_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_insert_data&#39;</span><span class="p">,</span>
         <span class="n">sql</span><span class="o">=</span><span class="p">[</span>
@@ -659,7 +659,7 @@ to execute statements against an Amazon Redshift cluster.</p>
 <h3>Fetching records from a table<a class="headerlink" href="#fetching-records-from-a-table" title="Permalink to this headline">¶</a></h3>
 <p>Creating a new table, &quot;more_fruit&quot; from the &quot;fruit&quot; table.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_redshift.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_get_all_table_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_all_table_data&#39;</span><span class="p">,</span> <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;CREATE TABLE more_fruit AS SELECT * FROM fruit;&quot;</span>
     <span class="p">)</span>
@@ -672,7 +672,7 @@ to execute statements against an Amazon Redshift cluster.</p>
 <p>RedshiftSQLOperator supports the <code class="docutils literal notranslate"><span class="pre">parameters</span></code> attribute which allows us to dynamically pass
 parameters into SQL statements.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_redshift.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_get_with_filter</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_with_filter&#39;</span><span class="p">,</span>
         <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;CREATE TABLE filtered_fruit AS SELECT * FROM fruit WHERE color = &#39;{{ params.color }}&#39;;&quot;</span><span class="p">,</span>
@@ -687,7 +687,7 @@ parameters into SQL statements.</p>
 <h2><a class="toc-backref" href="#id3">The complete RedshiftSQLOperator DAG</a><a class="headerlink" href="#the-complete-redshiftsqloperator-dag" title="Permalink to this headline">¶</a></h2>
 <p>All together, here is our DAG:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_redshift.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">airflow</span> <span class="kn">import</span> <span class="n">DAG</span>
 <span class="kn">from</span> <span class="nn">airflow.providers.amazon.aws.operators.redshift_sql</span> <span class="kn">import</span> <span class="n">RedshiftSQLOperator</span>
 
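After a bulk rewrite like this one, it is worth checking that no relative _modules link survived on any archived page. An illustrative offline check (the regexes below are assumptions based on the markup in these hunks, not part of the actual commit tooling):

```python
import re

# Find the href of every "[source]" button on a rewritten page.
HREF_RE = re.compile(r'class="example-header-button[^"]*"\s+href="([^"]+)"')

# A correctly rewritten href: pinned GitHub tree URL ending in .py,
# e.g. https://github.com/apache/airflow/tree/providers-amazon/2.6.0/... .py
GOOD_RE = re.compile(r'https://github\.com/apache/airflow/tree/[\w.-]+/[\w.-]+/.+\.py$')

def bad_source_links(html: str) -> list[str]:
    """Return hrefs of [source] buttons that were not rewritten correctly."""
    return [h for h in HREF_RE.findall(html) if not GOOD_RE.fullmatch(h)]
```

Running such a check over the docs-archive tree would flag any page where the old relative link pattern (or a malformed URL) slipped through.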
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/s3.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/s3.html
index 9b9ac8aa5b..8f6624df58 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/s3.html
@@ -651,7 +651,7 @@ new S3 bucket with a given bucket name then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new bucket, add keys, and then delete the bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;s3_bucket_dag&#39;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
@@ -686,7 +686,7 @@ and <code class="docutils literal notranslate"><span class="pre">S3PutBucketTagg
 <h3>Defining tasks<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new S3 bucket, apply tagging, get tagging, delete tagging, then delete the bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;s3_bucket_tagging_dag&#39;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/s3_to_redshift.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/s3_to_redshift.html
index 80ad5749db..c7eae80b78 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/s3_to_redshift.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/s3_to_redshift.html
@@ -616,7 +616,7 @@ in action.</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">S3_BUCKET</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;S3_BUCKET&quot;</span><span class="p">,</span> <span class="s2">&quot;test-bucket&quot;</span><span class="p">)</span>
 <span class="n">S3_KEY</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;S3_KEY&quot;</span><span class="p">,</span> <span class="s2">&quot;key&quot;</span><span class="p">)</span>
 <span class="n">REDSHIFT_TABLE</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;REDSHIFT_TABLE&quot;</span><span class="p">,</span> <span class="s2">&quot;test_table&quot;</span><span class="p">)</span>
@@ -630,7 +630,7 @@ in action.</p>
 <p>In the following code we are copying the S3 key <code class="docutils literal notranslate"><span class="pre">s3://{S3_BUCKET}/{S3_KEY}/{REDSHIFT_TABLE}</span></code> into the Redshift table
 <code class="docutils literal notranslate"><span class="pre">PUBLIC.{REDSHIFT_TABLE}</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_transfer_s3_to_redshift</span> <span class="o">=</span> <span class="n">S3ToRedshiftOperator</span><span class="p">(</span>
         <span class="n">s3_bucket</span><span class="o">=</span><span class="n">S3_BUCKET</span><span class="p">,</span>
         <span class="n">s3_key</span><span class="o">=</span><span class="n">S3_KEY</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/salesforce_to_s3.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/salesforce_to_s3.html
index b67a89b55e..5d53f52b1f 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/salesforce_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/salesforce_to_s3.html
@@ -605,7 +605,7 @@ are initially written to a local, temporary directory and then uploaded to an S3
 <p>The following example demonstrates a use case of extracting customer data from a Salesforce
 instance and upload to a &quot;landing&quot; bucket in S3.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">upload_salesforce_data_to_s3_landing</span> <span class="o">=</span> <span class="n">SalesforceToS3Operator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;upload_salesforce_data_to_s3&quot;</span><span class="p">,</span>
         <span class="n">salesforce_query</span><span class="o">=</span><span class="s2">&quot;SELECT Id, Name, Company, Phone, Email, LastModifiedDate, IsActive FROM Customers&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/sqs_publish.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/sqs_publish.html
index 39a525e1cf..a57e123a6b 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/sqs_publish.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/sqs_publish.html
@@ -627,7 +627,7 @@ to publish a message to Amazon Simple Queue Service (SQS).</p>
 <p>In the following example, the task &quot;publish_to_queue&quot; publishes a message containing
 the task instance and the execution date to a queue named <code class="docutils literal notranslate"><span class="pre">Airflow-Example-Queue</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sqs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_sqs.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
     <span class="c1"># Using a task-decorated function to create an SQS queue</span>
     <span class="n">create_queue</span> <span class="o">=</span> <span class="n">create_queue_fn</span><span class="p">()</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/glacier_to_gcs.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/glacier_to_gcs.html
index a3ed053bb9..6a52e52895 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/glacier_to_gcs.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/glacier_to_gcs.html
@@ -610,7 +610,7 @@ Transferring big files may not work well.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/glacier_to_gcs/index.html#airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator" title="airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierToGCSOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/s3_to_sftp.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/s3_to_sftp.html
index 103c174840..4615692921 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/s3_to_sftp.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/s3_to_sftp.html
@@ -605,7 +605,7 @@ For more information about the service visits <a class="reference external" href
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_sftp/index.html#airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator" title="airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToSFTPOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_s3_to_sftp_job</span> <span class="o">=</span> <span class="n">S3ToSFTPOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_to_s3_sftp_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_conn_id</span><span class="o">=</span><span class="s2">&quot;sftp_conn_id&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/sftp_to_s3.html b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/sftp_to_s3.html
index 4ac95a657b..ae8d43c54c 100644
--- a/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/sftp_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/2.6.0/operators/transfer/sftp_to_s3.html
@@ -604,7 +604,7 @@
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/sftp_to_s3/index.html#airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator" title="airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SFTPToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/2.6.0/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_sftp_to_s3_job</span> <span class="o">=</span> <span class="n">SFTPToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_sftp_to_s3_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_conn_id</span><span class="o">=</span><span class="s2">&quot;sftp_conn_id&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/athena.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/athena.html
index 4a06ee9c16..c8dd41653c 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/athena.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/athena.html
@@ -630,7 +630,7 @@ created in an S3 bucket and populated with SAMPLE_DATA.  The example waits for t
 to complete and then drops the created table and deletes the sample CSV file in the S3
 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_athena.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_athena.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
     <span class="c1"># Using a task-decorated function to create a CSV file in S3</span>
     <span class="n">add_sample_data_to_s3</span> <span class="o">=</span> <span class="n">add_sample_data_to_s3</span><span class="p">()</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/datasync.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/datasync.html
index b07847192b..d6627756a1 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/datasync.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/datasync.html
@@ -622,13 +622,13 @@ and an <em>AWS DataSync Task</em> (identified by a TaskArn on AWS).</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">TASK_ARN</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;TASK_ARN&quot;</span><span class="p">,</span> <span class="s2">&quot;my_aws_datasync_task_arn&quot;</span><span class="p">)</span>
 </pre></div>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SOURCE_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;SOURCE_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;smb://hostname/directory/&quot;</span><span class="p">)</span>
 
 <span class="n">DESTINATION_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;DESTINATION_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://mybucket/prefix&quot;</span><span class="p">)</span>
@@ -641,7 +641,7 @@ and an <em>AWS DataSync Task</em> (identified by a TaskArn on AWS).</p>
 <p>The <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/datasync/index.html#airflow.providers.amazon.aws.operators.datasync.DataSyncOperator" title="airflow.providers.amazon.aws.operators.datasync.DataSyncOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DataSyncOperator</span></code></a> can execute a specific
 TaskArn by specifying the <code class="docutils literal notranslate"><span class="pre">task_arn</span></code> parameter. This is useful when you know the TaskArn you want to execute.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task_1</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task_1&quot;</span><span class="p">,</span> <span class="n">task_arn</span><span class="o">=</span><span class="n">TASK_ARN</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -657,7 +657,7 @@ can iterate all DataSync Tasks for their source and destination LocationArns. Th
 each LocationArn to see if its the URIs match the desired source / destination URI.</p>
 <p>To perform a search based on the Location URIs, define the task as follows</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task_2</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task_2&quot;</span><span class="p">,</span>
         <span class="n">source_location_uri</span><span class="o">=</span><span class="n">SOURCE_LOCATION_URI</span><span class="p">,</span>
@@ -685,7 +685,7 @@ Finally, delete it.</p>
 <h3>Environment variables<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_datasync_2.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SOURCE_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;SOURCE_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;smb://hostname/directory/&quot;</span><span class="p">)</span>
 
 <span class="n">DESTINATION_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;DESTINATION_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://mybucket/prefix&quot;</span><span class="p">)</span>
@@ -723,7 +723,7 @@ as before but with some extra arguments.</p>
 and/or Locations if no suitable existing Task was found. If these are left to their default value (None)
 then no create will be attempted.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_datasync_2.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task&quot;</span><span class="p">,</span>
         <span class="n">source_location_uri</span><span class="o">=</span><span class="n">SOURCE_LOCATION_URI</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/dms.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/dms.html
index 5b0923976b..6024385da7 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/dms.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/dms.html
@@ -646,7 +646,7 @@ to be completed, and then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new replication task, start it, wait for it to be completed and then delete it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_task</span> <span class="o">=</span> <span class="n">DmsCreateTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_task&#39;</span><span class="p">,</span>
         <span class="n">replication_task_id</span><span class="o">=</span><span class="n">REPLICATION_TASK_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/ecs.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/ecs.html
index 168b0857b8..3a3bd523f9 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/ecs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/ecs.html
@@ -622,7 +622,7 @@ the task &quot;hello_world&quot; runs <code class="docutils literal notranslate"
 It overrides the command in the <code class="docutils literal notranslate"><span class="pre">hello-world-container</span></code> container.</p>
 <p>Before using EcsOperator, <em>cluster</em> and <em>task definition</em> need to be created.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">hello_world</span> <span class="o">=</span> <span class="n">EcsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;hello_world&quot;</span><span class="p">,</span>
     <span class="n">dag</span><span class="o">=</span><span class="n">dag</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/eks.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/eks.html
index 23507c8327..1431fc242c 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/eks.html
@@ -614,7 +614,7 @@ and management of containerized applications.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Create an Amazon EKS Cluster control plane without attaching compute service.</span>
     <span class="n">create_cluster</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster&#39;</span><span class="p">,</span>
@@ -631,7 +631,7 @@ and management of containerized applications.</p>
 <p>To delete an existing Amazon EKS Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteClusterOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_cluster</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_cluster&#39;</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -642,7 +642,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># An Amazon EKS cluster can not be deleted with attached resources such as nodegroups or Fargate profiles.</span>
     <span class="c1"># Setting the `force` to `True` will delete any attached resources before deleting the cluster.</span>
     <span class="n">delete_all</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_nodegroup_and_cluster&#39;</span><span class="p">,</span> <span class="n">force_delete_compute</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
@@ -664,7 +664,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateNodegroupOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_nodegroup&#39;</span><span class="p">,</span>
         <span class="n">nodegroup_name</span><span class="o">=</span><span class="n">NODEGROUP_NAME</span><span class="p">,</span>
@@ -680,7 +680,7 @@ attempt to delete any attached resources first.</p>
 <p>To delete an existing Amazon EKS Managed Nodegroup you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteNodegroupOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_nodegroup</span> <span class="o">=</span> <span class="n">EksDeleteNodegroupOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_nodegroup&#39;</span><span class="p">,</span> <span class="n">nodegroup_name</span><span class="o">=</span><span class="n">NODEGROUP_NAME</span>
     <span class="p">)</span>
@@ -702,7 +702,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Create an Amazon EKS cluster control plane and an EKS nodegroup compute platform in one step.</span>
     <span class="n">create_cluster_and_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_nodegroup&#39;</span><span class="p">,</span>
@@ -732,7 +732,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Create an Amazon EKS cluster control plane and an AWS Fargate compute platform in one step.</span>
     <span class="n">create_cluster_and_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_fargate_profile&#39;</span><span class="p">,</span>
@@ -761,7 +761,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateFargateProfileOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_fargate_profile&#39;</span><span class="p">,</span>
         <span class="n">pod_execution_role_arn</span><span class="o">=</span><span class="n">ROLE_ARN</span><span class="p">,</span>
@@ -777,7 +777,7 @@ attempt to delete any attached resources first.</p>
 <p>To delete an existing AWS Fargate Profile you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteFargateProfileOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_fargate_profile</span> <span class="o">=</span> <span class="n">EksDeleteFargateProfileOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_fargate_profile&#39;</span><span class="p">,</span>
         <span class="n">fargate_profile_name</span><span class="o">=</span><span class="n">FARGATE_PROFILE_NAME</span><span class="p">,</span>
@@ -793,7 +793,7 @@ attempt to delete any attached resources first.</p>
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksPodOperator" title="airflow.providers.amazon.aws.operators.eks.EksPodOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksPodOperator</span></code></a>.</p>
 <p>Note: An Amazon EKS Cluster with underlying compute infrastructure is required.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">start_pod</span> <span class="o">=</span> <span class="n">EksPodOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;run_pod&quot;</span><span class="p">,</span>
         <span class="n">pod_name</span><span class="o">=</span><span class="s2">&quot;run_pod&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/emr.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/emr.html
index eb5fc0717c..b884d6316f 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/emr.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/emr.html
@@ -651,7 +651,7 @@ with <code class="docutils literal notranslate"><span class="pre">EmrJobFlowSens
 <h3>JobFlow configuration<a class="headerlink" href="#jobflow-configuration" title="Permalink to this headline">¶</a></h3>
 <p>To create a job flow at EMR, you need to specify the configuration for the EMR cluster:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SPARK_STEPS</span> <span class="o">=</span> <span class="p">[</span>
     <span class="p">{</span>
         <span class="s1">&#39;Name&#39;</span><span class="p">:</span> <span class="s1">&#39;calculate_pi&#39;</span><span class="p">,</span>
@@ -699,7 +699,7 @@ The config <code class="docutils literal notranslate"><span class="pre">'KeepJob
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new job flow, add a step, monitor the step, and then terminate the cluster.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">job_flow_creator</span> <span class="o">=</span> <span class="n">EmrCreateJobFlowOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_job_flow&#39;</span><span class="p">,</span>
         <span class="n">job_flow_overrides</span><span class="o">=</span><span class="n">JOB_FLOW_OVERRIDES</span><span class="p">,</span>
@@ -728,7 +728,7 @@ Also, we would not specify <code class="docutils literal notranslate"><span clas
 <h3>Defining tasks<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <p>Here are the task definitions for our DAG.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">cluster_creator</span> <span class="o">=</span> <span class="n">EmrCreateJobFlowOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_job_flow&#39;</span><span class="p">,</span>
         <span class="n">job_flow_overrides</span><span class="o">=</span><span class="n">JOB_FLOW_OVERRIDES</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/emr_eks.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/emr_eks.html
index f0e655b2da..d18cc9b856 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/emr_eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/emr_eks.html
@@ -627,7 +627,7 @@
 <p>You can also optionally provide configuration overrides such as Spark, Hive, or Log4j properties as well as monitoring configuration that sends Spark logs to S3 or Cloudwatch.</p>
 <p>In the example, we show how to add an <code class="docutils literal notranslate"><span class="pre">applicationConfiguration</span></code> to use the AWS Glue data catalog and <code class="docutils literal notranslate"><span class="pre">monitoringConfiguration</span></code> to send logs to the <code class="docutils literal notranslate"><span class="pre">/aws/emr-eks-spark</span></code> log group in CloudWatch. Refer to the <a class="reference external" href="https://docs.aws.amazon.com [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">JOB_DRIVER_ARG</span> <span class="o">=</span> <span class="p">{</span>
     <span class="s2">&quot;sparkSubmitJobDriver&quot;</span><span class="p">:</span> <span class="p">{</span>
         <span class="s2">&quot;entryPoint&quot;</span><span class="p">:</span> <span class="s2">&quot;local:///usr/lib/spark/examples/src/main/python/pi.py&quot;</span><span class="p">,</span>
@@ -656,7 +656,7 @@
 </div>
 <p>We pass the <code class="docutils literal notranslate"><span class="pre">virtual_cluster_id</span></code> and <code class="docutils literal notranslate"><span class="pre">execution_role_arn</span></code> values as operator parameters, but you can store them in a connection or provide them in the DAG. Your AWS region should be defined either in the <code class="docutils literal notranslate"><span class="pre">aws_default</span></code> connection as <code class="docutils literal notransl [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_starter</span> <span class="o">=</span> <span class="n">EmrContainerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_job&quot;</span><span class="p">,</span>
     <span class="n">virtual_cluster_id</span><span class="o">=</span><span class="n">VIRTUAL_CLUSTER_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/glacier.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/glacier.html
index 5b52de53f9..f6061ed25e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/glacier.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/glacier.html
@@ -604,7 +604,7 @@ The operation returns dictionary of information related to the initiated job lik
 <code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierCreateJobOperator</span></code></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_glacier_job</span> <span class="o">=</span> <span class="n">GlacierCreateJobOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_glacier_job&quot;</span><span class="p">,</span> <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">)</span>
 <span class="n">JOB_ID</span> <span class="o">=</span> <span class="s1">&#39;{{ task_instance.xcom_pull(&quot;create_glacier_job&quot;)[&quot;jobId&quot;] }}&#39;</span>
 </pre></div>
@@ -629,7 +629,7 @@ Which means that every next request will be sent every 20 minutes.</p>
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glacier/index.html#airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor" title="airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierJobOperationSensor</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/google_api_to_s3_transfer.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/google_api_to_s3_transfer.html
index 0b8f5b8afe..938db7c9cc 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/google_api_to_s3_transfer.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/google_api_to_s3_transfer.html
@@ -617,7 +617,7 @@ in action.</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">GOOGLE_SHEET_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;GOOGLE_SHEET_ID&quot;</span><span class="p">)</span>
 <span class="n">GOOGLE_SHEET_RANGE</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;GOOGLE_SHEET_RANGE&quot;</span><span class="p">)</span>
 <span class="n">S3_DESTINATION_KEY</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;S3_DESTINATION_KEY&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://bucket/key.json&quot;</span><span class="p">)</span>
@@ -630,7 +630,7 @@ in action.</p>
 <h3>Get Google Sheets Sheet Values<a class="headerlink" href="#get-google-sheets-sheet-values" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are requesting a Google Sheet via the <code class="docutils literal notranslate"><span class="pre">sheets.spreadsheets.values.get</span></code> endpoint.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_google_sheets_values_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;sheets&#39;</span><span class="p">,</span>
         <span class="n">google_api_service_version</span><span class="o">=</span><span class="s1">&#39;v4&#39;</span><span class="p">,</span>
@@ -657,7 +657,7 @@ tasks to retrieve specific information about YouTube videos.</p>
 <h3>Environment variables<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">YOUTUBE_CONN_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_CONN_ID&quot;</span><span class="p">,</span> <span class="s2">&quot;google_cloud_default&quot;</span><span class="p">)</span>
 <span class="n">YOUTUBE_CHANNEL_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_CHANNEL_ID&quot;</span><span class="p">,</span> <span class="s2">&quot;UCSXwxpWZQ7XZ1WL3wqevChA&quot;</span><span class="p">)</span>  <span class="c1"># &quot;Apache Airflow&quot;</span>
 <span class="n">YOUTUBE_VIDEO_PUBLISHED_AFTER</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_VIDEO_PUBLISHED_AFTER&quot;</span><span class="p">,</span> <span class="s2">&quot;2019-09-25T00:00:00Z&quot;</span><span class="p">)</span>
@@ -676,7 +676,7 @@ tasks to retrieve specific information about YouTube videos.</p>
 (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_AFTER</span></code>, <code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_BEFORE</span></code>) on a YouTube channel (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_CHANNEL_ID</span></code>)
 saves the response in S3 and also pushes the data to xcom.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_video_ids_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">gcp_conn_id</span><span class="o">=</span><span class="n">YOUTUBE_CONN_ID</span><span class="p">,</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
@@ -701,7 +701,7 @@ saves the response in S3 and also pushes the data to xcom.</p>
 <p>From there a <code class="docutils literal notranslate"><span class="pre">BranchPythonOperator</span></code> will extract the xcom data and bring the IDs in a format the next
 request needs it + it also decides whether we need to request any videos or not.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">_check_and_transform_video_ids</span><span class="p">(</span><span class="n">task_output</span><span class="p">,</span> <span class="n">task_instance</span><span class="p">):</span>
     <span class="n">video_ids_response</span> <span class="o">=</span> <span class="n">task_output</span>
     <span class="n">video_ids</span> <span class="o">=</span> <span class="p">[</span><span class="n">item</span><span class="p">[</span><span class="s1">&#39;id&#39;</span><span class="p">][</span><span class="s1">&#39;videoId&#39;</span><span class="p">]</span> <span class="k">for</span> <span class="n">item</span> <span class="ow">in</span> <span class="n">video_ids_response</span><span class="p">[</span><span class="s1">&#39;items&#39;</span><span class="p">]]</span>
@@ -716,7 +716,7 @@ request needs it + it also decides whether we need to request any videos or not.
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_check_and_transform_video_ids</span> <span class="o">=</span> <span class="n">BranchPythonOperator</span><span class="p">(</span>
         <span class="n">python_callable</span><span class="o">=</span><span class="n">_check_and_transform_video_ids</span><span class="p">,</span>
         <span class="n">op_args</span><span class="o">=</span><span class="p">[</span><span class="n">task_video_ids_to_s3</span><span class="o">.</span><span class="n">output</span><span class="p">[</span><span class="n">task_video_ids_to_s3</span><span class="o">.</span><span class="n">google_api_response_via_xcom</span><span class="p">]],</span>
@@ -728,7 +728,7 @@ request needs it + it also decides whether we need to request any videos or not.
 <p>If there are YouTube Video IDs available, it passes over the YouTube IDs to the next request which then gets the
 information (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_FIELDS</span></code>) for the requested videos and saves them in S3 (<code class="docutils literal notranslate"><span class="pre">S3_DESTINATION_KEY</span></code>).</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_video_data_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">gcp_conn_id</span><span class="o">=</span><span class="n">YOUTUBE_CONN_ID</span><span class="p">,</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
@@ -748,7 +748,7 @@ information (<code class="docutils literal notranslate"><span class="pre">YOUTUB
 </div>
 <p>If not do nothing - and track it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_no_video_ids</span> <span class="o">=</span> <span class="n">DummyOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;no_video_ids&#39;</span><span class="p">)</span>
 </pre></div>
 </div>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/imap_attachment_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/imap_attachment_to_s3.html
index 31211d067a..467010258f 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/imap_attachment_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/imap_attachment_to_s3.html
@@ -613,7 +613,7 @@ protocol from a mail server to S3 Bucket.</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">IMAP_ATTACHMENT_NAME</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;IMAP_ATTACHMENT_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;test.txt&quot;</span><span class="p">)</span>
 <span class="n">IMAP_MAIL_FOLDER</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;IMAP_MAIL_FOLDER&quot;</span><span class="p">,</span> <span class="s2">&quot;INBOX&quot;</span><span class="p">)</span>
 <span class="n">IMAP_MAIL_FILTER</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;IMAP_MAIL_FILTER&quot;</span><span class="p">,</span> <span class="s2">&quot;All&quot;</span><span class="p">)</span>
@@ -625,7 +625,7 @@ protocol from a mail server to S3 Bucket.</p>
 <div class="section" id="transfer-mail-attachments-via-imap-to-s3">
 <h3>Transfer Mail Attachments via IMAP to S3<a class="headerlink" href="#transfer-mail-attachments-via-imap-to-s3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_transfer_imap_attachment_to_s3</span> <span class="o">=</span> <span class="n">ImapAttachmentToS3Operator</span><span class="p">(</span>
         <span class="n">imap_attachment_name</span><span class="o">=</span><span class="n">IMAP_ATTACHMENT_NAME</span><span class="p">,</span>
         <span class="n">s3_key</span><span class="o">=</span><span class="n">S3_DESTINATION_KEY</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/redshift_sql.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/redshift_sql.html
index 117c803c3c..1b94680f31 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/redshift_sql.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/redshift_sql.html
@@ -620,7 +620,7 @@ to execute statements against an Amazon Redshift cluster.</p>
 <h3>Create a table<a class="headerlink" href="#create-a-table" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are creating a table called &quot;fruit&quot;.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_redshift.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">setup__task_create_table</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;setup__create_table&#39;</span><span class="p">,</span>
         <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;&quot;&quot;</span>
@@ -639,7 +639,7 @@ to execute statements against an Amazon Redshift cluster.</p>
 <h3>Insert data into a table<a class="headerlink" href="#insert-data-into-a-table" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we insert a few sample rows into the &quot;fruit&quot; table.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_redshift.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_insert_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_insert_data&#39;</span><span class="p">,</span>
         <span class="n">sql</span><span class="o">=</span><span class="p">[</span>
@@ -659,7 +659,7 @@ to execute statements against an Amazon Redshift cluster.</p>
 <h3>Fetching records from a table<a class="headerlink" href="#fetching-records-from-a-table" title="Permalink to this headline">¶</a></h3>
 <p>Creating a new table, &quot;more_fruit&quot; from the &quot;fruit&quot; table.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_redshift.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_get_all_table_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_all_table_data&#39;</span><span class="p">,</span> <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;CREATE TABLE more_fruit AS SELECT * FROM fruit;&quot;</span>
     <span class="p">)</span>
@@ -672,7 +672,7 @@ to execute statements against an Amazon Redshift cluster.</p>
 <p>RedshiftSQLOperator supports the <code class="docutils literal notranslate"><span class="pre">parameters</span></code> attribute which allows us to dynamically pass
 parameters into SQL statements.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_redshift.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_get_with_filter</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_with_filter&#39;</span><span class="p">,</span>
         <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;CREATE TABLE filtered_fruit AS SELECT * FROM fruit WHERE color = &#39;{{ params.color }}&#39;;&quot;</span><span class="p">,</span>
@@ -687,7 +687,7 @@ parameters into SQL statements.</p>
 <h2><a class="toc-backref" href="#id3">The complete RedshiftSQLOperator DAG</a><a class="headerlink" href="#the-complete-redshiftsqloperator-dag" title="Permalink to this headline">¶</a></h2>
 <p>All together, here is our DAG:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_redshift.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">airflow</span> <span class="kn">import</span> <span class="n">DAG</span>
 <span class="kn">from</span> <span class="nn">airflow.providers.amazon.aws.operators.redshift_sql</span> <span class="kn">import</span> <span class="n">RedshiftSQLOperator</span>
 
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/s3.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/s3.html
index fc2a659cac..e0c77d805e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/s3.html
@@ -651,7 +651,7 @@ new S3 bucket with a given bucket name then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new bucket, add keys, and then delete the bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;s3_bucket_dag&#39;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
@@ -686,7 +686,7 @@ and <code class="docutils literal notranslate"><span class="pre">S3PutBucketTagg
 <h3>Defining tasks<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new S3 bucket, apply tagging, get tagging, delete tagging, then delete the bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;s3_bucket_tagging_dag&#39;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/s3_to_redshift.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/s3_to_redshift.html
index fe2bde12ca..95862bc4da 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/s3_to_redshift.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/s3_to_redshift.html
@@ -616,7 +616,7 @@ in action.</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">S3_BUCKET</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;S3_BUCKET&quot;</span><span class="p">,</span> <span class="s2">&quot;test-bucket&quot;</span><span class="p">)</span>
 <span class="n">S3_KEY</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;S3_KEY&quot;</span><span class="p">,</span> <span class="s2">&quot;key&quot;</span><span class="p">)</span>
 <span class="n">REDSHIFT_TABLE</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;REDSHIFT_TABLE&quot;</span><span class="p">,</span> <span class="s2">&quot;test_table&quot;</span><span class="p">)</span>
@@ -630,7 +630,7 @@ in action.</p>
 <p>In the following code we are copying the S3 key <code class="docutils literal notranslate"><span class="pre">s3://{S3_BUCKET}/{S3_KEY}/{REDSHIFT_TABLE}</span></code> into the Redshift table
 <code class="docutils literal notranslate"><span class="pre">PUBLIC.{REDSHIFT_TABLE}</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_transfer_s3_to_redshift</span> <span class="o">=</span> <span class="n">S3ToRedshiftOperator</span><span class="p">(</span>
         <span class="n">s3_bucket</span><span class="o">=</span><span class="n">S3_BUCKET</span><span class="p">,</span>
         <span class="n">s3_key</span><span class="o">=</span><span class="n">S3_KEY</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/salesforce_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/salesforce_to_s3.html
index 17345498f1..6a8c491eaf 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/salesforce_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/salesforce_to_s3.html
@@ -605,7 +605,7 @@ are initially written to a local, temporary directory and then uploaded to an S3
 <p>The following example demonstrates a use case of extracting customer data from a Salesforce
 instance and upload to a &quot;landing&quot; bucket in S3.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">upload_salesforce_data_to_s3_landing</span> <span class="o">=</span> <span class="n">SalesforceToS3Operator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;upload_salesforce_data_to_s3&quot;</span><span class="p">,</span>
         <span class="n">salesforce_query</span><span class="o">=</span><span class="s2">&quot;SELECT Id, Name, Company, Phone, Email, LastModifiedDate, IsActive FROM Customers&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/sqs_publish.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/sqs_publish.html
index 7caa3ab4fd..d5ea7dbd72 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/sqs_publish.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/sqs_publish.html
@@ -627,7 +627,7 @@ to publish a message to Amazon Simple Queue Service (SQS).</p>
 <p>In the following example, the task &quot;publish_to_queue&quot; publishes a message containing
 the task instance and the execution date to a queue named <code class="docutils literal notranslate"><span class="pre">Airflow-Example-Queue</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sqs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_sqs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
     <span class="c1"># Using a task-decorated function to create an SQS queue</span>
     <span class="n">create_queue</span> <span class="o">=</span> <span class="n">create_queue_fn</span><span class="p">()</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/glacier_to_gcs.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/glacier_to_gcs.html
index 8bd1d354be..047303c9b6 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/glacier_to_gcs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/glacier_to_gcs.html
@@ -610,7 +610,7 @@ Transferring big files may not work well.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/glacier_to_gcs/index.html#airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator" title="airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierToGCSOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/s3_to_sftp.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/s3_to_sftp.html
index 79e72c7923..780f895e64 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/s3_to_sftp.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/s3_to_sftp.html
@@ -605,7 +605,7 @@ For more information about the service visits <a class="reference external" href
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_sftp/index.html#airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator" title="airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToSFTPOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_s3_to_sftp_job</span> <span class="o">=</span> <span class="n">S3ToSFTPOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_to_s3_sftp_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_conn_id</span><span class="o">=</span><span class="s2">&quot;sftp_conn_id&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/sftp_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/sftp_to_s3.html
index 43887d7eb6..5bed46ad72 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/sftp_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.0.0/operators/transfer/sftp_to_s3.html
@@ -604,7 +604,7 @@
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/sftp_to_s3/index.html#airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator" title="airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SFTPToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.0.0/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_sftp_to_s3_job</span> <span class="o">=</span> <span class="n">SFTPToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_sftp_to_s3_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_conn_id</span><span class="o">=</span><span class="s2">&quot;sftp_conn_id&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/athena.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/athena.html
index ded21b8092..b8ccefe515 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/athena.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/athena.html
@@ -630,7 +630,7 @@ created in an S3 bucket and populated with SAMPLE_DATA.  The example waits for t
 to complete and then drops the created table and deletes the sample CSV file in the S3
 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_athena.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_athena.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
     <span class="c1"># Using a task-decorated function to create a CSV file in S3</span>
     <span class="n">add_sample_data_to_s3</span> <span class="o">=</span> <span class="n">add_sample_data_to_s3</span><span class="p">()</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/batch.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/batch.html
index 63bab8d3bb..58a1de84bc 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/batch.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/batch.html
@@ -620,7 +620,7 @@ infrastructure.</p>
 <p>To wait on the state of an AWS Batch Job until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/batch/index.html#airflow.providers.amazon.aws.sensors.batch.BatchSensor" title="airflow.providers.amazon.aws.sensors.batch.BatchSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">BatchSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_batch.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_batch.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_batch_job</span> <span class="o">=</span> <span class="n">BatchSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_batch_job&#39;</span><span class="p">,</span>
     <span class="n">job_id</span><span class="o">=</span><span class="n">JOB_ID</span><span class="p">,</span>
@@ -634,7 +634,7 @@ use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sen
 <p>To submit a new AWS Batch Job and monitor it until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/batch/index.html#airflow.providers.amazon.aws.operators.batch.BatchOperator" title="airflow.providers.amazon.aws.operators.batch.BatchOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BatchOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_batch.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_batch.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_batch_job</span> <span class="o">=</span> <span class="n">BatchOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;submit_batch_job&#39;</span><span class="p">,</span>
     <span class="n">job_name</span><span class="o">=</span><span class="n">JOB_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/datasync.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/datasync.html
index 573a9f1173..715541c81c 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/datasync.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/datasync.html
@@ -622,13 +622,13 @@ and an <em>AWS DataSync Task</em> (identified by a TaskArn on AWS).</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">TASK_ARN</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;TASK_ARN&quot;</span><span class="p">,</span> <span class="s2">&quot;my_aws_datasync_task_arn&quot;</span><span class="p">)</span>
 </pre></div>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SOURCE_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;SOURCE_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;smb://hostname/directory/&quot;</span><span class="p">)</span>
 
 <span class="n">DESTINATION_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;DESTINATION_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://mybucket/prefix&quot;</span><span class="p">)</span>
@@ -641,7 +641,7 @@ and an <em>AWS DataSync Task</em> (identified by a TaskArn on AWS).</p>
 <p>The <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/datasync/index.html#airflow.providers.amazon.aws.operators.datasync.DataSyncOperator" title="airflow.providers.amazon.aws.operators.datasync.DataSyncOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DataSyncOperator</span></code></a> can execute a specific
 TaskArn by specifying the <code class="docutils literal notranslate"><span class="pre">task_arn</span></code> parameter. This is useful when you know the TaskArn you want to execute.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task_1</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task_1&quot;</span><span class="p">,</span> <span class="n">task_arn</span><span class="o">=</span><span class="n">TASK_ARN</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -657,7 +657,7 @@ can iterate all DataSync Tasks for their source and destination LocationArns. Th
 each LocationArn to see if the URIs match the desired source / destination URI.</p>
 <p>To perform a search based on the Location URIs, define the task as follows</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task_2</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task_2&quot;</span><span class="p">,</span>
         <span class="n">source_location_uri</span><span class="o">=</span><span class="n">SOURCE_LOCATION_URI</span><span class="p">,</span>
@@ -685,7 +685,7 @@ Finally, delete it.</p>
 <h3>Environment variables<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_datasync_2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SOURCE_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;SOURCE_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;smb://hostname/directory/&quot;</span><span class="p">)</span>
 
 <span class="n">DESTINATION_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;DESTINATION_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://mybucket/prefix&quot;</span><span class="p">)</span>
@@ -723,7 +723,7 @@ as before but with some extra arguments.</p>
 and/or Locations if no suitable existing Task was found. If these are left to their default value (None)
 then no create will be attempted.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_datasync_2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task&quot;</span><span class="p">,</span>
         <span class="n">source_location_uri</span><span class="o">=</span><span class="n">SOURCE_LOCATION_URI</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/dms.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/dms.html
index 7b4048b360..4ac0b34336 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/dms.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/dms.html
@@ -646,7 +646,7 @@ to be completed, and then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new replication task, start it, wait for it to be completed and then delete it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_task</span> <span class="o">=</span> <span class="n">DmsCreateTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_task&#39;</span><span class="p">,</span>
         <span class="n">replication_task_id</span><span class="o">=</span><span class="n">REPLICATION_TASK_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/ecs.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/ecs.html
index 9283f779c9..26bb689ec6 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/ecs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/ecs.html
@@ -635,7 +635,7 @@ scale containerized applications.</p>
 <li><p>If you have integrated external resources in your ECS Cluster, for example using ECS Anywhere, and want to run your containers on those external resources, set the parameter to EXTERNAL.</p></li>
 </ul>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">hello_world</span> <span class="o">=</span> <span class="n">EcsOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;hello_world&quot;</span><span class="p">,</span>
         <span class="n">cluster</span><span class="o">=</span><span class="n">os</span><span class="o">.</span><span class="n">environ</span><span class="o">.</span><span class="n">get</span><span class="p">(</span><span class="s2">&quot;CLUSTER_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;existing_cluster_name&quot;</span><span class="p">),</span>
@@ -665,7 +665,7 @@ scale containerized applications.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">hello_world</span> <span class="o">=</span> <span class="n">EcsOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;hello_world&quot;</span><span class="p">,</span>
         <span class="n">cluster</span><span class="o">=</span><span class="n">os</span><span class="o">.</span><span class="n">environ</span><span class="o">.</span><span class="n">get</span><span class="p">(</span><span class="s2">&quot;CLUSTER_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;existing_cluster_name&quot;</span><span class="p">),</span>
@@ -704,7 +704,7 @@ scale containerized applications.</p>
 <h3>CloudWatch Logging<a class="headerlink" href="#cloudwatch-logging" title="Permalink to this headline">¶</a></h3>
 <p>To stream logs to AWS CloudWatch, you need to define these parameters. Using the example Operators above, we would add these additional parameters to enable logging to CloudWatch. You will need to ensure that you have the appropriate level of permissions (see next section)</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>        <span class="n">awslogs_group</span><span class="o">=</span><span class="s2">&quot;/ecs/hello-world&quot;</span><span class="p">,</span>
         <span class="n">awslogs_region</span><span class="o">=</span><span class="s2">&quot;aws-region&quot;</span><span class="p">,</span>
         <span class="n">awslogs_stream_prefix</span><span class="o">=</span><span class="s2">&quot;ecs/hello-world-container&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/eks.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/eks.html
index 1b8da9a79f..e0e537ad5b 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/eks.html
@@ -609,7 +609,7 @@ and management of containerized applications.</p>
 <p>To check the state of an Amazon EKS Cluster until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/eks/index.html#airflow.providers.amazon.aws.sensors.eks.EksClusterStateSensor" title="airflow.providers.amazon.aws.sensors.eks.EksClusterStateSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksClusterStateSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_cluster</span> <span class="o">=</span> <span class="n">EksClusterStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -629,7 +629,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS Cluster control plane without attaching compute service.</span>
 <span class="n">create_cluster</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster&#39;</span><span class="p">,</span>
@@ -647,7 +647,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To delete an existing Amazon EKS Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteClusterOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_cluster</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -661,7 +661,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># An Amazon EKS cluster can not be deleted with attached resources such as nodegroups or Fargate profiles.</span>
 <span class="c1"># Setting the `force` to `True` will delete any attached resources before deleting the cluster.</span>
 <span class="n">delete_all</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span>
@@ -681,7 +681,7 @@ attempt to delete any attached resources first.</p>
 <p>To check the state of an Amazon EKS managed node group until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/eks/index.html#airflow.providers.amazon.aws.sensors.eks.EksNodegroupStateSensor" title="airflow.providers.amazon.aws.sensors.eks.EksNodegroupStateSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksNodegroupStateSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_nodegroup</span> <span class="o">=</span> <span class="n">EksNodegroupStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -703,7 +703,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateNodegroupOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -720,7 +720,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To delete an existing Amazon EKS Managed Nodegroup you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteNodegroupOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_nodegroup</span> <span class="o">=</span> <span class="n">EksDeleteNodegroupOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -744,7 +744,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS cluster control plane and an EKS nodegroup compute platform in one step.</span>
 <span class="n">create_cluster_and_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_nodegroup&#39;</span><span class="p">,</span>
@@ -775,7 +775,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS cluster control plane and an AWS Fargate compute platform in one step.</span>
 <span class="n">create_cluster_and_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_fargate_profile&#39;</span><span class="p">,</span>
@@ -799,7 +799,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To check the state of an AWS Fargate profile until it reaches the target state or another terminal
 state you can use <code class="xref py py-class docutils literal notranslate"><span class="pre">EksFargateProfileSensor</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_fargate_profile</span> <span class="o">=</span> <span class="n">EksFargateProfileStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -821,7 +821,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateFargateProfileOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -838,7 +838,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 <p>To delete an existing AWS Fargate Profile you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteFargateProfileOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_fargate_profile</span> <span class="o">=</span> <span class="n">EksDeleteFargateProfileOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -855,7 +855,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksPodOperator" title="airflow.providers.amazon.aws.operators.eks.EksPodOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksPodOperator</span></code></a>.</p>
 <p>Note: An Amazon EKS Cluster with underlying compute infrastructure is required.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_pod</span> <span class="o">=</span> <span class="n">EksPodOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;run_pod&quot;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/emr.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/emr.html
index 60e1f3aef4..f4dc0e2488 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/emr.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/emr.html
@@ -633,7 +633,7 @@ create a new EMR job flow.  The cluster will be terminated automatically after f
 <h3>JobFlow configuration<a class="headerlink" href="#jobflow-configuration" title="Permalink to this headline">¶</a></h3>
 <p>To create a job flow on EMR, you need to specify the configuration for the EMR cluster:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SPARK_STEPS</span> <span class="o">=</span> <span class="p">[</span>
     <span class="p">{</span>
         <span class="s1">&#39;Name&#39;</span><span class="p">:</span> <span class="s1">&#39;calculate_pi&#39;</span><span class="p">,</span>
@@ -686,7 +686,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <h3>Create the Job Flow<a class="headerlink" href="#create-the-job-flow" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are creating a new job flow using the configuration as explained above.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_flow_creator</span> <span class="o">=</span> <span class="n">EmrCreateJobFlowOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_job_flow&#39;</span><span class="p">,</span>
     <span class="n">job_flow_overrides</span><span class="o">=</span><span class="n">JOB_FLOW_OVERRIDES</span><span class="p">,</span>
@@ -701,7 +701,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To add Steps to an existing EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/emr/index.html#airflow.providers.amazon.aws.operators.emr.EmrAddStepsOperator" title="airflow.providers.amazon.aws.operators.emr.EmrAddStepsOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrAddStepsOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">step_adder</span> <span class="o">=</span> <span class="n">EmrAddStepsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;add_steps&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -716,7 +716,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To terminate an EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/emr/index.html#airflow.providers.amazon.aws.operators.emr.EmrTerminateJobFlowOperator" title="airflow.providers.amazon.aws.operators.emr.EmrTerminateJobFlowOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrTerminateJobFlowOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">cluster_remover</span> <span class="o">=</span> <span class="n">EmrTerminateJobFlowOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;remove_cluster&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -740,7 +740,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To monitor the state of an EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/emr/index.html#airflow.providers.amazon.aws.sensors.emr.EmrJobFlowSensor" title="airflow.providers.amazon.aws.sensors.emr.EmrJobFlowSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrJobFlowSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_sensor</span> <span class="o">=</span> <span class="n">EmrJobFlowSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;check_job_flow&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">job_flow_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -754,7 +754,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To monitor the state of a Step running an existing EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/emr/index.html#airflow.providers.amazon.aws.sensors.emr.EmrStepSensor" title="airflow.providers.amazon.aws.sensors.emr.EmrStepSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrStepSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">step_checker</span> <span class="o">=</span> <span class="n">EmrStepSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;watch_step&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/emr_eks.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/emr_eks.html
index 570f8feb1e..84eeddaf98 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/emr_eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/emr_eks.html
@@ -639,7 +639,7 @@ and <code class="docutils literal notranslate"><span class="pre">monitoringConfi
 Refer to the <a class="reference external" href="https://docs.aws.amazon.com/emr/latest/EMR-on-EKS-DevelopmentGuide/emr-eks-jobs-CLI.html#emr-eks-jobs-parameters">EMR on EKS guide</a>
 for more details on job configuration.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">JOB_DRIVER_ARG</span> <span class="o">=</span> <span class="p">{</span>
     <span class="s2">&quot;sparkSubmitJobDriver&quot;</span><span class="p">:</span> <span class="p">{</span>
         <span class="s2">&quot;entryPoint&quot;</span><span class="p">:</span> <span class="s2">&quot;local:///usr/lib/spark/examples/src/main/python/pi.py&quot;</span><span class="p">,</span>
@@ -671,7 +671,7 @@ can store them in a connection or provide them in the DAG. Your AWS region shoul
 in the <code class="docutils literal notranslate"><span class="pre">aws_default</span></code> connection as <code class="docutils literal notranslate"><span class="pre">{&quot;region_name&quot;:</span> <span class="pre">&quot;us-east-1&quot;}</span></code> or a custom connection name
 that gets passed to the operator with the <code class="docutils literal notranslate"><span class="pre">aws_conn_id</span></code> parameter.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_starter</span> <span class="o">=</span> <span class="n">EmrContainerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_job&quot;</span><span class="p">,</span>
     <span class="n">virtual_cluster_id</span><span class="o">=</span><span class="n">VIRTUAL_CLUSTER_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/glacier.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/glacier.html
index a00b0f2c3d..d36b4b1473 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/glacier.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/glacier.html
@@ -604,7 +604,7 @@ The operation returns dictionary of information related to the initiated job lik
 <code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierCreateJobOperator</span></code></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_glacier_job</span> <span class="o">=</span> <span class="n">GlacierCreateJobOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_glacier_job&quot;</span><span class="p">,</span> <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">)</span>
 <span class="n">JOB_ID</span> <span class="o">=</span> <span class="s1">&#39;{{ task_instance.xcom_pull(&quot;create_glacier_job&quot;)[&quot;jobId&quot;] }}&#39;</span>
 </pre></div>
@@ -629,7 +629,7 @@ Which means that every next request will be sent every 20 minutes.</p>
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glacier/index.html#airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor" title="airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierJobOperationSensor</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/google_api_to_s3_transfer.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/google_api_to_s3_transfer.html
index 78b04ca680..4dba268cf8 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/google_api_to_s3_transfer.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/google_api_to_s3_transfer.html
@@ -617,7 +617,7 @@ in action.</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">GOOGLE_SHEET_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;GOOGLE_SHEET_ID&quot;</span><span class="p">)</span>
 <span class="n">GOOGLE_SHEET_RANGE</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;GOOGLE_SHEET_RANGE&quot;</span><span class="p">)</span>
 <span class="n">S3_DESTINATION_KEY</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;S3_DESTINATION_KEY&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://bucket/key.json&quot;</span><span class="p">)</span>
@@ -630,7 +630,7 @@ in action.</p>
 <h3>Get Google Sheets Sheet Values<a class="headerlink" href="#get-google-sheets-sheet-values" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are requesting a Google Sheet via the <code class="docutils literal notranslate"><span class="pre">sheets.spreadsheets.values.get</span></code> endpoint.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_google_sheets_values_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;sheets&#39;</span><span class="p">,</span>
         <span class="n">google_api_service_version</span><span class="o">=</span><span class="s1">&#39;v4&#39;</span><span class="p">,</span>
@@ -657,7 +657,7 @@ tasks to retrieve specific information about YouTube videos.</p>
 <h3>Environment variables<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">YOUTUBE_CONN_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_CONN_ID&quot;</span><span class="p">,</span> <span class="s2">&quot;google_cloud_default&quot;</span><span class="p">)</span>
 <span class="n">YOUTUBE_CHANNEL_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_CHANNEL_ID&quot;</span><span class="p">,</span> <span class="s2">&quot;UCSXwxpWZQ7XZ1WL3wqevChA&quot;</span><span class="p">)</span>  <span class="c1"># &quot;Apache Airflow&quot;</span>
 <span class="n">YOUTUBE_VIDEO_PUBLISHED_AFTER</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_VIDEO_PUBLISHED_AFTER&quot;</span><span class="p">,</span> <span class="s2">&quot;2019-09-25T00:00:00Z&quot;</span><span class="p">)</span>
@@ -676,7 +676,7 @@ tasks to retrieve specific information about YouTube videos.</p>
 (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_AFTER</span></code>, <code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_BEFORE</span></code>) on a YouTube channel (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_CHANNEL_ID</span></code>)
 saves the response in S3 and also pushes the data to xcom.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_video_ids_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">gcp_conn_id</span><span class="o">=</span><span class="n">YOUTUBE_CONN_ID</span><span class="p">,</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
@@ -701,7 +701,7 @@ saves the response in S3 and also pushes the data to xcom.</p>
 <p>From there a <code class="docutils literal notranslate"><span class="pre">BranchPythonOperator</span></code> will extract the xcom data and bring the IDs in a format the next
 request needs it + it also decides whether we need to request any videos or not.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">_check_and_transform_video_ids</span><span class="p">(</span><span class="n">task_output</span><span class="p">,</span> <span class="n">task_instance</span><span class="p">):</span>
     <span class="n">video_ids_response</span> <span class="o">=</span> <span class="n">task_output</span>
     <span class="n">video_ids</span> <span class="o">=</span> <span class="p">[</span><span class="n">item</span><span class="p">[</span><span class="s1">&#39;id&#39;</span><span class="p">][</span><span class="s1">&#39;videoId&#39;</span><span class="p">]</span> <span class="k">for</span> <span class="n">item</span> <span class="ow">in</span> <span class="n">video_ids_response</span><span class="p">[</span><span class="s1">&#39;items&#39;</span><span class="p">]]</span>
@@ -716,7 +716,7 @@ request needs it + it also decides whether we need to request any videos or not.
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_check_and_transform_video_ids</span> <span class="o">=</span> <span class="n">BranchPythonOperator</span><span class="p">(</span>
         <span class="n">python_callable</span><span class="o">=</span><span class="n">_check_and_transform_video_ids</span><span class="p">,</span>
         <span class="n">op_args</span><span class="o">=</span><span class="p">[</span><span class="n">task_video_ids_to_s3</span><span class="o">.</span><span class="n">output</span><span class="p">[</span><span class="n">task_video_ids_to_s3</span><span class="o">.</span><span class="n">google_api_response_via_xcom</span><span class="p">]],</span>
@@ -728,7 +728,7 @@ request needs it + it also decides whether we need to request any videos or not.
 <p>If there are YouTube Video IDs available, it passes over the YouTube IDs to the next request which then gets the
 information (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_FIELDS</span></code>) for the requested videos and saves them in S3 (<code class="docutils literal notranslate"><span class="pre">S3_DESTINATION_KEY</span></code>).</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_video_data_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">gcp_conn_id</span><span class="o">=</span><span class="n">YOUTUBE_CONN_ID</span><span class="p">,</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
@@ -748,7 +748,7 @@ information (<code class="docutils literal notranslate"><span class="pre">YOUTUB
 </div>
 <p>If not do nothing - and track it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_no_video_ids</span> <span class="o">=</span> <span class="n">DummyOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;no_video_ids&#39;</span><span class="p">)</span>
 </pre></div>
 </div>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/imap_attachment_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/imap_attachment_to_s3.html
index c444f10441..1b05e8e21e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/imap_attachment_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/imap_attachment_to_s3.html
@@ -613,7 +613,7 @@ protocol from a mail server to S3 Bucket.</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">IMAP_ATTACHMENT_NAME</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;IMAP_ATTACHMENT_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;test.txt&quot;</span><span class="p">)</span>
 <span class="n">IMAP_MAIL_FOLDER</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;IMAP_MAIL_FOLDER&quot;</span><span class="p">,</span> <span class="s2">&quot;INBOX&quot;</span><span class="p">)</span>
 <span class="n">IMAP_MAIL_FILTER</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;IMAP_MAIL_FILTER&quot;</span><span class="p">,</span> <span class="s2">&quot;All&quot;</span><span class="p">)</span>
@@ -625,7 +625,7 @@ protocol from a mail server to S3 Bucket.</p>
 <div class="section" id="transfer-mail-attachments-via-imap-to-s3">
 <h3>Transfer Mail Attachments via IMAP to S3<a class="headerlink" href="#transfer-mail-attachments-via-imap-to-s3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_transfer_imap_attachment_to_s3</span> <span class="o">=</span> <span class="n">ImapAttachmentToS3Operator</span><span class="p">(</span>
         <span class="n">imap_attachment_name</span><span class="o">=</span><span class="n">IMAP_ATTACHMENT_NAME</span><span class="p">,</span>
         <span class="n">s3_key</span><span class="o">=</span><span class="n">S3_DESTINATION_KEY</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/lambda.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/lambda.html
index e5163ef552..b84eaf2171 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/lambda.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/lambda.html
@@ -609,7 +609,7 @@ and only pay for what you use.</p>
 <p>To publish a message to an Amazon SNS Topic you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/aws_lambda/index.html#airflow.providers.amazon.aws.operators.aws_lambda.AwsLambdaInvokeFunctionOperator" title="airflow.providers.amazon.aws.operators.aws_lambda.AwsLambdaInvokeFunctionOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AwsLambdaInvokeFunctionOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_lambda.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_lambda.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_lambda.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_lambda.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">invoke_lambda_function</span> <span class="o">=</span> <span class="n">AwsLambdaInvokeFunctionOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;setup__invoke_lambda_function&#39;</span><span class="p">,</span>
     <span class="n">function_name</span><span class="o">=</span><span class="n">LAMBDA_FUNCTION_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/rds.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/rds.html
index e4f5385189..e80aaa14ea 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/rds.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/rds.html
@@ -611,7 +611,7 @@
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCreateDBSnapshotOperator</span></code>.
 The source DB instance must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> or <code class="docutils literal notranslate"><span class="pre">storage-optimization</span></code> state.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;rds_snapshots&#39;</span><span class="p">,</span> <span class="n">start_date</span><span class="o">=</span><span class="n">datetime</span><span class="p">(</span><span class="mi">2021</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">),</span> <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span [...]
 <span class="p">)</span> <span class="k">as</span> <span class="n">dag</span><span class="p">:</span>
@@ -653,7 +653,7 @@ The source DB instance must be in the <code class="docutils literal notranslate"
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCopyDBSnapshotOperator</span></code>.
 The source DB snapshot must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> state.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">copy_snapshot</span> <span class="o">=</span> <span class="n">RdsCopyDbSnapshotOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;copy_snapshot&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -675,7 +675,7 @@ The source DB snapshot must be in the <code class="docutils literal notranslate"
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSDeleteDBSnapshotOperator</span></code>.
 The DB snapshot must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> state to be deleted.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_snapshot</span> <span class="o">=</span> <span class="n">RdsDeleteDbSnapshotOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_snapshot&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -696,7 +696,7 @@ The DB snapshot must be in the <code class="docutils literal notranslate"><span
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSStartExportTaskOperator</span></code>.
 The provided IAM role must have access to the S3 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">start_export</span> <span class="o">=</span> <span class="n">RdsStartExportTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;start_export&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
@@ -720,7 +720,7 @@ The provided IAM role must have access to the S3 bucket.</p>
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCancelExportTaskOperator</span></code>.
 Any data that has already been written to the S3 bucket isn't removed.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">cancel_export</span> <span class="o">=</span> <span class="n">RdsCancelExportTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;cancel_export&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
@@ -742,7 +742,7 @@ To obtain an ARN with SNS, you must create a topic in Amazon SNS and subscribe t
 RDS event notification is only available for not encrypted SNS topics.
 If you specify an encrypted SNS topic, event notifications are not sent for the topic.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_subscription</span> <span class="o">=</span> <span class="n">RdsCreateEventSubscriptionOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_subscription&#39;</span><span class="p">,</span>
         <span class="n">subscription_name</span><span class="o">=</span><span class="s1">&#39;my-topic-subscription&#39;</span><span class="p">,</span>
@@ -764,7 +764,7 @@ If you specify an encrypted SNS topic, event notifications are not sent for the
 <p>To delete event subscription you can use
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSDeleteEventSubscriptionOperator</span></code></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_subscription</span> <span class="o">=</span> <span class="n">RdsDeleteEventSubscriptionOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_subscription&#39;</span><span class="p">,</span>
         <span class="n">subscription_name</span><span class="o">=</span><span class="s1">&#39;my-topic-subscription&#39;</span><span class="p">,</span>
@@ -792,7 +792,7 @@ If you specify an encrypted SNS topic, event notifications are not sent for the
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/rds/index.html#airflow.providers.amazon.aws.sensors.rds.RdsSnapshotExistenceSensor" title="airflow.providers.amazon.aws.sensors.rds.RdsSnapshotExistenceSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RdsSnapshotExistenceSensor</span></code></a>.
 By default, sensor waits existence of snapshot with status <code class="docutils literal notranslate"><span class="pre">available</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">snapshot_sensor</span> <span class="o">=</span> <span class="n">RdsSnapshotExistenceSensor</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;snapshot_sensor&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -811,7 +811,7 @@ By default, sensor waits existence of snapshot with status <code class="docutils
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/rds/index.html#airflow.providers.amazon.aws.sensors.rds.RdsExportTaskExistenceSensor" title="airflow.providers.amazon.aws.sensors.rds.RdsExportTaskExistenceSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RdsExportTaskExistenceSensor</span></code></a>.
 By default, sensor waits existence of export task with status <code class="docutils literal notranslate"><span class="pre">available</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">export_sensor</span> <span class="o">=</span> <span class="n">RdsExportTaskExistenceSensor</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;export_sensor&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_cluster.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_cluster.html
index b64ba6869d..2aeb46237b 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_cluster.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_cluster.html
@@ -624,7 +624,7 @@ business and customers.</p>
 <p>To check the state of an Amazon Redshift Cluster until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/redshift_cluster/index.html#airflow.providers.amazon.aws.sensors.redshift_cluster.RedshiftClusterSensor" title="airflow.providers.amazon.aws.sensors.redshift_cluster.RedshiftClusterSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftClusterSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_wait_cluster_available</span> <span class="o">=</span> <span class="n">RedshiftClusterSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;sensor_redshift_cluster_available&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
@@ -641,7 +641,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To resume a 'paused' Amazon Redshift Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_cluster/index.html#module-airflow.providers.amazon.aws.operators.redshift_cluster" title="airflow.providers.amazon.aws.operators.redshift_cluster"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftResumeClusterOperator</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_resume_cluster</span> <span class="o">=</span> <span class="n">RedshiftResumeClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_resume_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
@@ -655,7 +655,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To pause an 'available' Amazon Redshift Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_cluster/index.html#module-airflow.providers.amazon.aws.operators.redshift_cluster" title="airflow.providers.amazon.aws.operators.redshift_cluster"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftPauseClusterOperator</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_pause_cluster</span> <span class="o">=</span> <span class="n">RedshiftPauseClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_pause_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_data.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_data.html
index acf51160d8..58032dadae 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_data.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_data.html
@@ -622,7 +622,7 @@ statements against an Amazon Redshift cluster.</p>
 <p>This is a basic example DAG for using <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_data/index.html#module-airflow.providers.amazon.aws.operators.redshift_data" title="airflow.providers.amazon.aws.operators.redshift_data"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftDataOperator</span></code></a>
 to execute statements against an Amazon Redshift cluster.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_query</span> <span class="o">=</span> <span class="n">RedshiftDataOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_query&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_sql.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_sql.html
index de0aa74658..7f8813302c 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_sql.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/redshift_sql.html
@@ -623,7 +623,7 @@ business and customers.</p>
 <div class="section" id="execute-a-sql-query">
 <h3>Execute a SQL query<a class="headerlink" href="#execute-a-sql-query" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_redshift_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_select_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_all_table_data&#39;</span><span class="p">,</span> <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;&quot;&quot;CREATE TABLE more_fruit AS SELECT * FROM fruit;&quot;&quot;&quot;</span>
 <span class="p">)</span>
@@ -636,7 +636,7 @@ business and customers.</p>
 <p>RedshiftSQLOperator supports the <code class="docutils literal notranslate"><span class="pre">parameters</span></code> attribute which allows us to dynamically pass
 parameters into SQL statements.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_redshift_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_select_filtered_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_filtered_table_data&#39;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;&quot;&quot;CREATE TABLE filtered_fruit AS SELECT * FROM fruit WHERE color = &#39;{{ params.color }}&#39;;&quot;&quot;&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/s3.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/s3.html
index 8183e777e2..9c6fa185e7 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/s3.html
@@ -651,7 +651,7 @@ new S3 bucket with a given bucket name then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new bucket, add keys, and then delete the bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="nd">@task</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_bucket_dag_add_keys_to_bucket&quot;</span><span class="p">)</span>
 <span class="k">def</span> <span class="nf">upload_keys</span><span class="p">():</span>
     <span class="sd">&quot;&quot;&quot;This is a python callback to add keys into the s3 bucket&quot;&quot;&quot;</span>
@@ -699,7 +699,7 @@ and <code class="docutils literal notranslate"><span class="pre">S3PutBucketTagg
 <h3>Defining tasks<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new S3 bucket, apply tagging, get tagging, delete tagging, then delete the bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;s3_bucket_tagging_dag&#39;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sagemaker.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sagemaker.html
index e24d89d3b3..7fd0f201ac 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sagemaker.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sagemaker.html
@@ -640,7 +640,7 @@ generate the models artifact in s3, create the model,
 training, Sagemaker Model, batch transform job and
 then delete the model.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="s2">&quot;sample_sagemaker_dag&quot;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/salesforce_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/salesforce_to_s3.html
index 541a6b5551..9ac32062f2 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/salesforce_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/salesforce_to_s3.html
@@ -605,7 +605,7 @@ are initially written to a local, temporary directory and then uploaded to an S3
 <p>The following example demonstrates a use case of extracting customer data from a Salesforce
 instance and uploading it to a &quot;landing&quot; bucket in S3.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">upload_salesforce_data_to_s3_landing</span> <span class="o">=</span> <span class="n">SalesforceToS3Operator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;upload_salesforce_data_to_s3&quot;</span><span class="p">,</span>
         <span class="n">salesforce_query</span><span class="o">=</span><span class="s2">&quot;SELECT Id, Name, Company, Phone, Email, LastModifiedDate, IsActive FROM Customers&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sns.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sns.html
index 44a7686aec..600c24b853 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sns.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sns.html
@@ -610,7 +610,7 @@ messages (SMS).</p>
 <p>To publish a message to an Amazon SNS Topic you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sns/index.html#airflow.providers.amazon.aws.operators.sns.SnsPublishOperator" title="airflow.providers.amazon.aws.operators.sns.SnsPublishOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SnsPublishOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sns.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sns.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sns.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_sns.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">publish</span> <span class="o">=</span> <span class="n">SnsPublishOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;publish_message&#39;</span><span class="p">,</span>
     <span class="n">target_arn</span><span class="o">=</span><span class="n">SNS_TOPIC_ARN</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sqs_publish.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sqs_publish.html
index e41c479d32..34ebde702d 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sqs_publish.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/sqs_publish.html
@@ -627,7 +627,7 @@ to publish a message to Amazon Simple Queue Service (SQS).</p>
 <p>In the following example, the task &quot;publish_to_queue&quot; publishes a message containing
 the task instance and the execution date to a queue named <code class="docutils literal notranslate"><span class="pre">Airflow-Example-Queue</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sqs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_sqs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
     <span class="c1"># Using a task-decorated function to create an SQS queue</span>
     <span class="n">create_queue</span> <span class="o">=</span> <span class="n">create_queue_fn</span><span class="p">()</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/dynamodb_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/dynamodb_to_s3.html
index 4d45b772fd..b9ff33868d 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/dynamodb_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/dynamodb_to_s3.html
@@ -628,7 +628,7 @@ records that satisfy the criteria.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/dynamodb_to_s3/index.html#airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBToS3Operator" title="airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DynamoDBToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">backup_db</span> <span class="o">=</span> <span class="n">DynamoDBToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;backup_db&#39;</span><span class="p">,</span>
     <span class="n">dynamodb_table_name</span><span class="o">=</span><span class="n">TABLE_NAME</span><span class="p">,</span>
@@ -642,7 +642,7 @@ records that satisfy the criteria.</p>
 <p>To parallelize the replication, users can create multiple DynamoDBToS3Operator tasks using the
 <code class="docutils literal notranslate"><span class="pre">TotalSegments</span></code> parameter. For instance, to replicate with a parallelism of 2, create two tasks:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Segmenting allows the transfer to be parallelized into {segment} number of parallel tasks.</span>
 <span class="n">backup_db_segment_1</span> <span class="o">=</span> <span class="n">DynamoDBToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;backup-1&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/glacier_to_gcs.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/glacier_to_gcs.html
index b1209fb360..9a732213d1 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/glacier_to_gcs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/glacier_to_gcs.html
@@ -610,7 +610,7 @@ Transferring big files may not work well.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/glacier_to_gcs/index.html#airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator" title="airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierToGCSOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/redshift_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/redshift_to_s3.html
index acda480e48..a58fdb01e3 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/redshift_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/redshift_to_s3.html
@@ -623,7 +623,7 @@ Service (S3) file.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/redshift_to_s3/index.html#airflow.providers.amazon.aws.transfers.redshift_to_s3.RedshiftToS3Operator" title="airflow.providers.amazon.aws.transfers.redshift_to_s3.RedshiftToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_redshift_to_s3</span> <span class="o">=</span> <span class="n">RedshiftToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;transfer_redshift_to_s3&#39;</span><span class="p">,</span>
     <span class="n">s3_bucket</span><span class="o">=</span><span class="n">S3_BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/s3_to_redshift.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/s3_to_redshift.html
index 121733e32a..4a12382bea 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/s3_to_redshift.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/s3_to_redshift.html
@@ -623,7 +623,7 @@ Amazon Redshift table.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_redshift/index.html#airflow.providers.amazon.aws.transfers.s3_to_redshift.S3ToRedshiftOperator" title="airflow.providers.amazon.aws.transfers.s3_to_redshift.S3ToRedshiftOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToRedshiftOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_s3_to_redshift</span> <span class="o">=</span> <span class="n">S3ToRedshiftOperator</span><span class="p">(</span>
     <span class="n">s3_bucket</span><span class="o">=</span><span class="n">S3_BUCKET_NAME</span><span class="p">,</span>
     <span class="n">s3_key</span><span class="o">=</span><span class="n">S3_KEY</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/s3_to_sftp.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/s3_to_sftp.html
index 3b922bdad3..97fcd31b14 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/s3_to_sftp.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/s3_to_sftp.html
@@ -605,7 +605,7 @@ For more information about the service visits <a class="reference external" href
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_sftp/index.html#airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator" title="airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToSFTPOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_s3_to_sftp_job</span> <span class="o">=</span> <span class="n">S3ToSFTPOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_to_s3_sftp_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_conn_id</span><span class="o">=</span><span class="s2">&quot;sftp_conn_id&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/sftp_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/sftp_to_s3.html
index 3e408b34b7..5a31654aa9 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/sftp_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.1.1/operators/transfer/sftp_to_s3.html
@@ -604,7 +604,7 @@
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/sftp_to_s3/index.html#airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator" title="airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SFTPToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.1.1/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_sftp_to_s3_job</span> <span class="o">=</span> <span class="n">SFTPToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_sftp_to_s3_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_conn_id</span><span class="o">=</span><span class="s2">&quot;sftp_conn_id&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/athena.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/athena.html
index 288b35b94a..49c0d0402b 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/athena.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/athena.html
@@ -630,7 +630,7 @@ created in an S3 bucket and populated with SAMPLE_DATA.  The example waits for t
 to complete and then drops the created table and deletes the sample CSV file in the S3
 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_athena.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_athena.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
     <span class="c1"># Using a task-decorated function to create a CSV file in S3</span>
     <span class="n">add_sample_data_to_s3</span> <span class="o">=</span> <span class="n">add_sample_data_to_s3</span><span class="p">()</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/batch.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/batch.html
index 842388ebff..d8cee7290a 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/batch.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/batch.html
@@ -620,7 +620,7 @@ infrastructure.</p>
 <p>To wait on the state of an AWS Batch Job until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/batch/index.html#airflow.providers.amazon.aws.sensors.batch.BatchSensor" title="airflow.providers.amazon.aws.sensors.batch.BatchSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">BatchSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_batch.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_batch.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_batch_job</span> <span class="o">=</span> <span class="n">BatchSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_batch_job&#39;</span><span class="p">,</span>
     <span class="n">job_id</span><span class="o">=</span><span class="n">JOB_ID</span><span class="p">,</span>
@@ -634,7 +634,7 @@ use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sen
 <p>To submit a new AWS Batch Job and monitor it until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/batch/index.html#airflow.providers.amazon.aws.operators.batch.BatchOperator" title="airflow.providers.amazon.aws.operators.batch.BatchOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BatchOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_batch.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_batch.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_batch_job</span> <span class="o">=</span> <span class="n">BatchOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;submit_batch_job&#39;</span><span class="p">,</span>
     <span class="n">job_name</span><span class="o">=</span><span class="n">JOB_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/datasync.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/datasync.html
index 664ee1689d..9bb8fd2725 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/datasync.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/datasync.html
@@ -622,13 +622,13 @@ and an <em>AWS DataSync Task</em> (identified by a TaskArn on AWS).</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">TASK_ARN</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;TASK_ARN&quot;</span><span class="p">,</span> <span class="s2">&quot;my_aws_datasync_task_arn&quot;</span><span class="p">)</span>
 </pre></div>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SOURCE_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;SOURCE_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;smb://hostname/directory/&quot;</span><span class="p">)</span>
 
 <span class="n">DESTINATION_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;DESTINATION_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://mybucket/prefix&quot;</span><span class="p">)</span>
@@ -641,7 +641,7 @@ and an <em>AWS DataSync Task</em> (identified by a TaskArn on AWS).</p>
 <p>The <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/datasync/index.html#airflow.providers.amazon.aws.operators.datasync.DataSyncOperator" title="airflow.providers.amazon.aws.operators.datasync.DataSyncOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DataSyncOperator</span></code></a> can execute a specific
 TaskArn by specifying the <code class="docutils literal notranslate"><span class="pre">task_arn</span></code> parameter. This is useful when you know the TaskArn you want to execute.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task_1</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task_1&quot;</span><span class="p">,</span> <span class="n">task_arn</span><span class="o">=</span><span class="n">TASK_ARN</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -657,7 +657,7 @@ can iterate all DataSync Tasks for their source and destination LocationArns. Th
 each LocationArn to see if its URIs match the desired source / destination URI.</p>
 <p>To perform a search based on the Location URIs, define the task as follows</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task_2</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task_2&quot;</span><span class="p">,</span>
         <span class="n">source_location_uri</span><span class="o">=</span><span class="n">SOURCE_LOCATION_URI</span><span class="p">,</span>
@@ -685,7 +685,7 @@ Finally, delete it.</p>
 <h3>Environment variables<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_datasync_2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SOURCE_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;SOURCE_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;smb://hostname/directory/&quot;</span><span class="p">)</span>
 
 <span class="n">DESTINATION_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;DESTINATION_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://mybucket/prefix&quot;</span><span class="p">)</span>
@@ -723,7 +723,7 @@ as before but with some extra arguments.</p>
 and/or Locations if no suitable existing Task was found. If these are left to their default value (None)
 then no create will be attempted.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_datasync_2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task&quot;</span><span class="p">,</span>
         <span class="n">source_location_uri</span><span class="o">=</span><span class="n">SOURCE_LOCATION_URI</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/dms.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/dms.html
index 1178ff818b..a8e5d780ba 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/dms.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/dms.html
@@ -646,7 +646,7 @@ to be completed, and then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new replication task, start it, wait for it to be completed and then delete it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_task</span> <span class="o">=</span> <span class="n">DmsCreateTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_task&#39;</span><span class="p">,</span>
         <span class="n">replication_task_id</span><span class="o">=</span><span class="n">REPLICATION_TASK_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/ecs.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/ecs.html
index 3681831983..2849ff2532 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/ecs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/ecs.html
@@ -635,7 +635,7 @@ scale containerized applications.</p>
 <li><p>If you have integrated external resources in your ECS Cluster, for example using ECS Anywhere, and want to run your containers on those external resources, set the parameter to EXTERNAL.</p></li>
 </ul>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">hello_world</span> <span class="o">=</span> <span class="n">EcsOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;hello_world&quot;</span><span class="p">,</span>
         <span class="n">cluster</span><span class="o">=</span><span class="n">os</span><span class="o">.</span><span class="n">environ</span><span class="o">.</span><span class="n">get</span><span class="p">(</span><span class="s2">&quot;CLUSTER_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;existing_cluster_name&quot;</span><span class="p">),</span>
@@ -665,7 +665,7 @@ scale containerized applications.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">hello_world</span> <span class="o">=</span> <span class="n">EcsOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;hello_world&quot;</span><span class="p">,</span>
         <span class="n">cluster</span><span class="o">=</span><span class="n">os</span><span class="o">.</span><span class="n">environ</span><span class="o">.</span><span class="n">get</span><span class="p">(</span><span class="s2">&quot;CLUSTER_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;existing_cluster_name&quot;</span><span class="p">),</span>
@@ -704,7 +704,7 @@ scale containerized applications.</p>
 <h3>CloudWatch Logging<a class="headerlink" href="#cloudwatch-logging" title="Permalink to this headline">¶</a></h3>
 <p>To stream logs to AWS CloudWatch, you need to define these parameters. Using the example Operators above, we would add these additional parameters to enable logging to CloudWatch. You will need to ensure that you have the appropriate level of permissions (see next section)</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>        <span class="n">awslogs_group</span><span class="o">=</span><span class="s2">&quot;/ecs/hello-world&quot;</span><span class="p">,</span>
         <span class="n">awslogs_region</span><span class="o">=</span><span class="s2">&quot;aws-region&quot;</span><span class="p">,</span>
         <span class="n">awslogs_stream_prefix</span><span class="o">=</span><span class="s2">&quot;ecs/hello-world-container&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/eks.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/eks.html
index a058b0a716..d05fcd6179 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/eks.html
@@ -609,7 +609,7 @@ and management of containerized applications.</p>
 <p>To check the state of an Amazon EKS Cluster until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/eks/index.html#airflow.providers.amazon.aws.sensors.eks.EksClusterStateSensor" title="airflow.providers.amazon.aws.sensors.eks.EksClusterStateSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksClusterStateSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_cluster</span> <span class="o">=</span> <span class="n">EksClusterStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -629,7 +629,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS Cluster control plane without attaching compute service.</span>
 <span class="n">create_cluster</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster&#39;</span><span class="p">,</span>
@@ -647,7 +647,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To delete an existing Amazon EKS Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteClusterOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_cluster</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -661,7 +661,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># An Amazon EKS cluster can not be deleted with attached resources such as nodegroups or Fargate profiles.</span>
 <span class="c1"># Setting the `force` to `True` will delete any attached resources before deleting the cluster.</span>
 <span class="n">delete_all</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span>
@@ -681,7 +681,7 @@ attempt to delete any attached resources first.</p>
 <p>To check the state of an Amazon EKS managed node group until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/eks/index.html#airflow.providers.amazon.aws.sensors.eks.EksNodegroupStateSensor" title="airflow.providers.amazon.aws.sensors.eks.EksNodegroupStateSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksNodegroupStateSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_nodegroup</span> <span class="o">=</span> <span class="n">EksNodegroupStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -703,7 +703,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateNodegroupOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -720,7 +720,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To delete an existing Amazon EKS Managed Nodegroup you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteNodegroupOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_nodegroup</span> <span class="o">=</span> <span class="n">EksDeleteNodegroupOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -744,7 +744,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS cluster control plane and an EKS nodegroup compute platform in one step.</span>
 <span class="n">create_cluster_and_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_nodegroup&#39;</span><span class="p">,</span>
@@ -775,7 +775,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS cluster control plane and an AWS Fargate compute platform in one step.</span>
 <span class="n">create_cluster_and_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_fargate_profile&#39;</span><span class="p">,</span>
@@ -799,7 +799,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To check the state of an AWS Fargate profile until it reaches the target state or another terminal
 state you can use <code class="xref py py-class docutils literal notranslate"><span class="pre">EksFargateProfileSensor</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_fargate_profile</span> <span class="o">=</span> <span class="n">EksFargateProfileStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -821,7 +821,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateFargateProfileOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -838,7 +838,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 <p>To delete an existing AWS Fargate Profile you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteFargateProfileOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_fargate_profile</span> <span class="o">=</span> <span class="n">EksDeleteFargateProfileOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -855,7 +855,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksPodOperator" title="airflow.providers.amazon.aws.operators.eks.EksPodOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksPodOperator</span></code></a>.</p>
 <p>Note: An Amazon EKS Cluster with underlying compute infrastructure is required.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_pod</span> <span class="o">=</span> <span class="n">EksPodOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;run_pod&quot;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/emr.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/emr.html
index 83b78c43d7..589d1e1dd9 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/emr.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/emr.html
@@ -633,7 +633,7 @@ create a new EMR job flow.  The cluster will be terminated automatically after f
 <h3>JobFlow configuration<a class="headerlink" href="#jobflow-configuration" title="Permalink to this headline">¶</a></h3>
 <p>To create a job flow on EMR, you need to specify the configuration for the EMR cluster:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SPARK_STEPS</span> <span class="o">=</span> <span class="p">[</span>
     <span class="p">{</span>
         <span class="s1">&#39;Name&#39;</span><span class="p">:</span> <span class="s1">&#39;calculate_pi&#39;</span><span class="p">,</span>
@@ -686,7 +686,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <h3>Create the Job Flow<a class="headerlink" href="#create-the-job-flow" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are creating a new job flow using the configuration as explained above.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_flow_creator</span> <span class="o">=</span> <span class="n">EmrCreateJobFlowOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_job_flow&#39;</span><span class="p">,</span>
     <span class="n">job_flow_overrides</span><span class="o">=</span><span class="n">JOB_FLOW_OVERRIDES</span><span class="p">,</span>
@@ -701,7 +701,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To add Steps to an existing EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/emr/index.html#airflow.providers.amazon.aws.operators.emr.EmrAddStepsOperator" title="airflow.providers.amazon.aws.operators.emr.EmrAddStepsOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrAddStepsOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">step_adder</span> <span class="o">=</span> <span class="n">EmrAddStepsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;add_steps&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -716,7 +716,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To terminate an EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/emr/index.html#airflow.providers.amazon.aws.operators.emr.EmrTerminateJobFlowOperator" title="airflow.providers.amazon.aws.operators.emr.EmrTerminateJobFlowOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrTerminateJobFlowOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">cluster_remover</span> <span class="o">=</span> <span class="n">EmrTerminateJobFlowOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;remove_cluster&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -740,7 +740,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To monitor the state of an EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/emr/index.html#airflow.providers.amazon.aws.sensors.emr.EmrJobFlowSensor" title="airflow.providers.amazon.aws.sensors.emr.EmrJobFlowSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrJobFlowSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_sensor</span> <span class="o">=</span> <span class="n">EmrJobFlowSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;check_job_flow&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">job_flow_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -754,7 +754,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To monitor the state of a Step running an existing EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/emr/index.html#airflow.providers.amazon.aws.sensors.emr.EmrStepSensor" title="airflow.providers.amazon.aws.sensors.emr.EmrStepSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrStepSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">step_checker</span> <span class="o">=</span> <span class="n">EmrStepSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;watch_step&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/emr_eks.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/emr_eks.html
index a28cb9f04e..0cb48753dc 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/emr_eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/emr_eks.html
@@ -639,7 +639,7 @@ and <code class="docutils literal notranslate"><span class="pre">monitoringConfi
 Refer to the <a class="reference external" href="https://docs.aws.amazon.com/emr/latest/EMR-on-EKS-DevelopmentGuide/emr-eks-jobs-CLI.html#emr-eks-jobs-parameters">EMR on EKS guide</a>
 for more details on job configuration.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">JOB_DRIVER_ARG</span> <span class="o">=</span> <span class="p">{</span>
     <span class="s2">&quot;sparkSubmitJobDriver&quot;</span><span class="p">:</span> <span class="p">{</span>
         <span class="s2">&quot;entryPoint&quot;</span><span class="p">:</span> <span class="s2">&quot;local:///usr/lib/spark/examples/src/main/python/pi.py&quot;</span><span class="p">,</span>
@@ -671,7 +671,7 @@ can store them in a connection or provide them in the DAG. Your AWS region shoul
 in the <code class="docutils literal notranslate"><span class="pre">aws_default</span></code> connection as <code class="docutils literal notranslate"><span class="pre">{&quot;region_name&quot;:</span> <span class="pre">&quot;us-east-1&quot;}</span></code> or a custom connection name
 that gets passed to the operator with the <code class="docutils literal notranslate"><span class="pre">aws_conn_id</span></code> parameter.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_starter</span> <span class="o">=</span> <span class="n">EmrContainerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_job&quot;</span><span class="p">,</span>
     <span class="n">virtual_cluster_id</span><span class="o">=</span><span class="n">VIRTUAL_CLUSTER_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/glacier.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/glacier.html
index de8378ff36..b77f7adce8 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/glacier.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/glacier.html
@@ -604,7 +604,7 @@ The operation returns dictionary of information related to the initiated job lik
 <code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierCreateJobOperator</span></code></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_glacier_job</span> <span class="o">=</span> <span class="n">GlacierCreateJobOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_glacier_job&quot;</span><span class="p">,</span> <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">)</span>
 <span class="n">JOB_ID</span> <span class="o">=</span> <span class="s1">&#39;{{ task_instance.xcom_pull(&quot;create_glacier_job&quot;)[&quot;jobId&quot;] }}&#39;</span>
 </pre></div>
@@ -629,7 +629,7 @@ Which means that every next request will be sent every 20 minutes.</p>
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glacier/index.html#airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor" title="airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierJobOperationSensor</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/glue.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/glue.html
index 9c240607de..b48e29306d 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/glue.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/glue.html
@@ -621,7 +621,7 @@ your data and putting it to use in minutes instead of months.</p>
 To create a new AWS Glue Crawler or run an existing one you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/glue_crawler/index.html#airflow.providers.amazon.aws.operators.glue_crawler.GlueCrawlerOperator" title="airflow.providers.amazon.aws.operators.glue_crawler.GlueCrawlerOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueCrawlerOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">crawl_s3</span> <span class="o">=</span> <span class="n">GlueCrawlerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;crawl_s3&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="n">GLUE_CRAWLER_CONFIG</span><span class="p">,</span>
@@ -639,7 +639,7 @@ policy. See the References section below for a link to more details.</p>
 <p>To wait on the state of an AWS Glue Crawler execution until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glue_crawler/index.html#airflow.providers.amazon.aws.sensors.glue_crawler.GlueCrawlerSensor" title="airflow.providers.amazon.aws.sensors.glue_crawler.GlueCrawlerSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueCrawlerSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_crawl</span> <span class="o">=</span> <span class="n">GlueCrawlerSensor</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_crawl&#39;</span><span class="p">,</span> <span class="n">crawler_name</span><span class="o">=</span><span class="n">GLUE_CRAWLER_NAME</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -649,7 +649,7 @@ use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sen
 <span id="howto-operator-gluejoboperator"></span><h3>AWS Glue Job Operator<a class="headerlink" href="#aws-glue-job-operator" title="Permalink to this headline">¶</a></h3>
 <p>To submit a new AWS Glue Job you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/glue/index.html#airflow.providers.amazon.aws.operators.glue.GlueJobOperator" title="airflow.providers.amazon.aws.operators.glue.GlueJobOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueJobOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_name</span> <span class="o">=</span> <span class="s1">&#39;example_glue_job&#39;</span>
 <span class="n">submit_glue_job</span> <span class="o">=</span> <span class="n">GlueJobOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;submit_glue_job&#39;</span><span class="p">,</span>
@@ -671,7 +671,7 @@ policies to provide access to the output location for result data.</p>
 <p>To wait on the state of an AWS Glue Job until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glue/index.html#airflow.providers.amazon.aws.sensors.glue.GlueJobSensor" title="airflow.providers.amazon.aws.sensors.glue.GlueJobSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueJobSensor</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_job</span> <span class="o">=</span> <span class="n">GlueJobSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_job&#39;</span><span class="p">,</span>
     <span class="n">job_name</span><span class="o">=</span><span class="n">job_name</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/google_api_to_s3_transfer.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/google_api_to_s3_transfer.html
index a49746a00b..f77a96a2da 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/google_api_to_s3_transfer.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/google_api_to_s3_transfer.html
@@ -617,7 +617,7 @@ in action.</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">GOOGLE_SHEET_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;GOOGLE_SHEET_ID&quot;</span><span class="p">)</span>
 <span class="n">GOOGLE_SHEET_RANGE</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;GOOGLE_SHEET_RANGE&quot;</span><span class="p">)</span>
 <span class="n">S3_DESTINATION_KEY</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;S3_DESTINATION_KEY&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://bucket/key.json&quot;</span><span class="p">)</span>
@@ -630,7 +630,7 @@ in action.</p>
 <h3>Get Google Sheets Sheet Values<a class="headerlink" href="#get-google-sheets-sheet-values" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are requesting a Google Sheet via the <code class="docutils literal notranslate"><span class="pre">sheets.spreadsheets.values.get</span></code> endpoint.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_basic.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_google_sheets_values_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;sheets&#39;</span><span class="p">,</span>
         <span class="n">google_api_service_version</span><span class="o">=</span><span class="s1">&#39;v4&#39;</span><span class="p">,</span>
@@ -657,7 +657,7 @@ tasks to retrieve specific information about YouTube videos.</p>
 <h3>Environment variables<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">YOUTUBE_CONN_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_CONN_ID&quot;</span><span class="p">,</span> <span class="s2">&quot;google_cloud_default&quot;</span><span class="p">)</span>
 <span class="n">YOUTUBE_CHANNEL_ID</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_CHANNEL_ID&quot;</span><span class="p">,</span> <span class="s2">&quot;UCSXwxpWZQ7XZ1WL3wqevChA&quot;</span><span class="p">)</span>  <span class="c1"># &quot;Apache Airflow&quot;</span>
 <span class="n">YOUTUBE_VIDEO_PUBLISHED_AFTER</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;YOUTUBE_VIDEO_PUBLISHED_AFTER&quot;</span><span class="p">,</span> <span class="s2">&quot;2019-09-25T00:00:00Z&quot;</span><span class="p">)</span>
@@ -676,7 +676,7 @@ tasks to retrieve specific information about YouTube videos.</p>
 (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_AFTER</span></code>, <code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_BEFORE</span></code>) on a YouTube channel (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_CHANNEL_ID</span></code>)
 saves the response in S3 and also pushes the data to xcom.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_video_ids_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">gcp_conn_id</span><span class="o">=</span><span class="n">YOUTUBE_CONN_ID</span><span class="p">,</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
@@ -701,7 +701,7 @@ saves the response in S3 and also pushes the data to xcom.</p>
 <p>From there a <code class="docutils literal notranslate"><span class="pre">BranchPythonOperator</span></code> extracts the xcom data, transforms the IDs into the format the next
 request needs, and also decides whether any videos need to be requested at all.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">_check_and_transform_video_ids</span><span class="p">(</span><span class="n">task_output</span><span class="p">,</span> <span class="n">task_instance</span><span class="p">):</span>
     <span class="n">video_ids_response</span> <span class="o">=</span> <span class="n">task_output</span>
     <span class="n">video_ids</span> <span class="o">=</span> <span class="p">[</span><span class="n">item</span><span class="p">[</span><span class="s1">&#39;id&#39;</span><span class="p">][</span><span class="s1">&#39;videoId&#39;</span><span class="p">]</span> <span class="k">for</span> <span class="n">item</span> <span class="ow">in</span> <span class="n">video_ids_response</span><span class="p">[</span><span class="s1">&#39;items&#39;</span><span class="p">]]</span>
@@ -716,7 +716,7 @@ request needs it + it also decides whether we need to request any videos or not.
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_check_and_transform_video_ids</span> <span class="o">=</span> <span class="n">BranchPythonOperator</span><span class="p">(</span>
         <span class="n">python_callable</span><span class="o">=</span><span class="n">_check_and_transform_video_ids</span><span class="p">,</span>
         <span class="n">op_args</span><span class="o">=</span><span class="p">[</span><span class="n">task_video_ids_to_s3</span><span class="o">.</span><span class="n">output</span><span class="p">[</span><span class="n">task_video_ids_to_s3</span><span class="o">.</span><span class="n">google_api_response_via_xcom</span><span class="p">]],</span>
@@ -728,7 +728,7 @@ request needs it + it also decides whether we need to request any videos or not.
 <p>If there are YouTube video IDs available, it passes them on to the next request, which then gets the
 information (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_FIELDS</span></code>) for the requested videos and saves them in S3 (<code class="docutils literal notranslate"><span class="pre">S3_DESTINATION_KEY</span></code>).</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_video_data_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
         <span class="n">gcp_conn_id</span><span class="o">=</span><span class="n">YOUTUBE_CONN_ID</span><span class="p">,</span>
         <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
@@ -748,7 +748,7 @@ information (<code class="docutils literal notranslate"><span class="pre">YOUTUB
 </div>
 <p>If there are none, do nothing and track it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">task_no_video_ids</span> <span class="o">=</span> <span class="n">DummyOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;no_video_ids&#39;</span><span class="p">)</span>
 </pre></div>
 </div>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/lambda.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/lambda.html
index 75e93cd8c9..000b80b54f 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/lambda.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/lambda.html
@@ -609,7 +609,7 @@ and only pay for what you use.</p>
 <p>To publish a message to an Amazon SNS Topic you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/aws_lambda/index.html#airflow.providers.amazon.aws.operators.aws_lambda.AwsLambdaInvokeFunctionOperator" title="airflow.providers.amazon.aws.operators.aws_lambda.AwsLambdaInvokeFunctionOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AwsLambdaInvokeFunctionOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_lambda.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_lambda.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_lambda.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_lambda.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">invoke_lambda_function</span> <span class="o">=</span> <span class="n">AwsLambdaInvokeFunctionOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;setup__invoke_lambda_function&#39;</span><span class="p">,</span>
     <span class="n">function_name</span><span class="o">=</span><span class="n">LAMBDA_FUNCTION_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/rds.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/rds.html
index eb13396126..c1cd154406 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/rds.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/rds.html
@@ -611,7 +611,7 @@
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCreateDBSnapshotOperator</span></code>.
 The source DB instance must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> or <code class="docutils literal notranslate"><span class="pre">storage-optimization</span></code> state.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;rds_snapshots&#39;</span><span class="p">,</span> <span class="n">start_date</span><span class="o">=</span><span class="n">datetime</span><span class="p">(</span><span class="mi">2021</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">),</span> <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span [...]
 <span class="p">)</span> <span class="k">as</span> <span class="n">dag</span><span class="p">:</span>
@@ -653,7 +653,7 @@ The source DB instance must be in the <code class="docutils literal notranslate"
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCopyDBSnapshotOperator</span></code>.
 The source DB snapshot must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> state.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">copy_snapshot</span> <span class="o">=</span> <span class="n">RdsCopyDbSnapshotOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;copy_snapshot&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -675,7 +675,7 @@ The source DB snapshot must be in the <code class="docutils literal notranslate"
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSDeleteDBSnapshotOperator</span></code>.
 The DB snapshot must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> state to be deleted.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_snapshot</span> <span class="o">=</span> <span class="n">RdsDeleteDbSnapshotOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_snapshot&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -696,7 +696,7 @@ The DB snapshot must be in the <code class="docutils literal notranslate"><span
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSStartExportTaskOperator</span></code>.
 The provided IAM role must have access to the S3 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">start_export</span> <span class="o">=</span> <span class="n">RdsStartExportTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;start_export&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
@@ -720,7 +720,7 @@ The provided IAM role must have access to the S3 bucket.</p>
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCancelExportTaskOperator</span></code>.
 Any data that has already been written to the S3 bucket isn't removed.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">cancel_export</span> <span class="o">=</span> <span class="n">RdsCancelExportTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;cancel_export&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
@@ -742,7 +742,7 @@ To obtain an ARN with SNS, you must create a topic in Amazon SNS and subscribe t
 RDS event notification is only available for unencrypted SNS topics.
 If you specify an encrypted SNS topic, event notifications are not sent for the topic.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_subscription</span> <span class="o">=</span> <span class="n">RdsCreateEventSubscriptionOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_subscription&#39;</span><span class="p">,</span>
         <span class="n">subscription_name</span><span class="o">=</span><span class="s1">&#39;my-topic-subscription&#39;</span><span class="p">,</span>
@@ -764,7 +764,7 @@ If you specify an encrypted SNS topic, event notifications are not sent for the
 <p>To delete an event subscription you can use
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSDeleteEventSubscriptionOperator</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_subscription</span> <span class="o">=</span> <span class="n">RdsDeleteEventSubscriptionOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_subscription&#39;</span><span class="p">,</span>
         <span class="n">subscription_name</span><span class="o">=</span><span class="s1">&#39;my-topic-subscription&#39;</span><span class="p">,</span>
@@ -792,7 +792,7 @@ If you specify an encrypted SNS topic, event notifications are not sent for the
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/rds/index.html#airflow.providers.amazon.aws.sensors.rds.RdsSnapshotExistenceSensor" title="airflow.providers.amazon.aws.sensors.rds.RdsSnapshotExistenceSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RdsSnapshotExistenceSensor</span></code></a>.
 By default, the sensor waits for the existence of a snapshot with status <code class="docutils literal notranslate"><span class="pre">available</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">snapshot_sensor</span> <span class="o">=</span> <span class="n">RdsSnapshotExistenceSensor</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;snapshot_sensor&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -811,7 +811,7 @@ By default, sensor waits existence of snapshot with status <code class="docutils
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/rds/index.html#airflow.providers.amazon.aws.sensors.rds.RdsExportTaskExistenceSensor" title="airflow.providers.amazon.aws.sensors.rds.RdsExportTaskExistenceSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RdsExportTaskExistenceSensor</span></code></a>.
 By default, the sensor waits for the existence of an export task with status <code class="docutils literal notranslate"><span class="pre">available</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">export_sensor</span> <span class="o">=</span> <span class="n">RdsExportTaskExistenceSensor</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;export_sensor&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_cluster.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_cluster.html
index 82851f2a84..02b9009df4 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_cluster.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_cluster.html
@@ -624,7 +624,7 @@ business and customers.</p>
 <p>To check the state of an Amazon Redshift Cluster until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/redshift_cluster/index.html#airflow.providers.amazon.aws.sensors.redshift_cluster.RedshiftClusterSensor" title="airflow.providers.amazon.aws.sensors.redshift_cluster.RedshiftClusterSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftClusterSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_wait_cluster_available</span> <span class="o">=</span> <span class="n">RedshiftClusterSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;sensor_redshift_cluster_available&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
@@ -641,7 +641,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To resume a 'paused' Amazon Redshift Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_cluster/index.html#module-airflow.providers.amazon.aws.operators.redshift_cluster" title="airflow.providers.amazon.aws.operators.redshift_cluster"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftResumeClusterOperator</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_resume_cluster</span> <span class="o">=</span> <span class="n">RedshiftResumeClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_resume_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
@@ -655,7 +655,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To pause an 'available' Amazon Redshift Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_cluster/index.html#module-airflow.providers.amazon.aws.operators.redshift_cluster" title="airflow.providers.amazon.aws.operators.redshift_cluster"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftPauseClusterOperator</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_pause_cluster</span> <span class="o">=</span> <span class="n">RedshiftPauseClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_pause_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_data.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_data.html
index 8ddbdaa4b5..0cfe40eade 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_data.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_data.html
@@ -622,7 +622,7 @@ statements against an Amazon Redshift cluster.</p>
 <p>This is a basic example DAG for using <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_data/index.html#module-airflow.providers.amazon.aws.operators.redshift_data" title="airflow.providers.amazon.aws.operators.redshift_data"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftDataOperator</span></code></a>
 to execute statements against an Amazon Redshift cluster.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_query</span> <span class="o">=</span> <span class="n">RedshiftDataOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_query&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_sql.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_sql.html
index 4251bd5d13..0d14d69d52 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_sql.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/redshift_sql.html
@@ -623,7 +623,7 @@ business and customers.</p>
 <div class="section" id="execute-a-sql-query">
 <h3>Execute a SQL query<a class="headerlink" href="#execute-a-sql-query" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_redshift_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_select_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_all_table_data&#39;</span><span class="p">,</span> <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;&quot;&quot;CREATE TABLE more_fruit AS SELECT * FROM fruit;&quot;&quot;&quot;</span>
 <span class="p">)</span>
@@ -636,7 +636,7 @@ business and customers.</p>
 <p>RedshiftSQLOperator supports the <code class="docutils literal notranslate"><span class="pre">parameters</span></code> attribute which allows us to dynamically pass
 parameters into SQL statements.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_redshift_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_select_filtered_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_filtered_table_data&#39;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;&quot;&quot;CREATE TABLE filtered_fruit AS SELECT * FROM fruit WHERE color = &#39;{{ params.color }}&#39;;&quot;&quot;&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/s3.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/s3.html
index 1002eedf0d..8623d2b13e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/s3.html
@@ -635,7 +635,7 @@
 <p>To create an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3CreateBucketOperator" title="airflow.providers.amazon.aws.operators.s3.S3CreateBucketOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3CreateBucketOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_bucket</span> <span class="o">=</span> <span class="n">S3CreateBucketOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_create_bucket&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -649,7 +649,7 @@
 <p>To delete an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3DeleteBucketOperator" title="airflow.providers.amazon.aws.operators.s3.S3DeleteBucketOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3DeleteBucketOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_bucket</span> <span class="o">=</span> <span class="n">S3DeleteBucketOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_delete_bucket&#39;</span><span class="p">,</span> <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span> <span class="n">force_delete</span><span class="o">=</span><span class="kc">True</span>
 <span class="p">)</span>
@@ -662,7 +662,7 @@
 <p>To set the tags for an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3PutBucketTaggingOperator" title="airflow.providers.amazon.aws.operators.s3.S3PutBucketTaggingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3PutBucketTaggingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">put_tagging</span> <span class="o">=</span> <span class="n">S3PutBucketTaggingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_put_bucket_tagging&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -678,7 +678,7 @@
 <p>To get the tag set associated with an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3GetBucketTaggingOperator" title="airflow.providers.amazon.aws.operators.s3.S3GetBucketTaggingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3GetBucketTaggingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_tagging</span> <span class="o">=</span> <span class="n">S3GetBucketTaggingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_get_bucket_tagging&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -692,7 +692,7 @@
 <p>To delete the tags of an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3DeleteBucketTaggingOperator" title="airflow.providers.amazon.aws.operators.s3.S3DeleteBucketTaggingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3DeleteBucketTaggingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_tagging</span> <span class="o">=</span> <span class="n">S3DeleteBucketTaggingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_delete_bucket_tagging&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sagemaker.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sagemaker.html
index f2dff65d23..3a7d96ccbf 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sagemaker.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sagemaker.html
@@ -640,7 +640,7 @@ generate the models artifact in s3, create the model,
 training, Sagemaker Model, batch transform job and
 then delete the model.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="s2">&quot;sample_sagemaker_dag&quot;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/salesforce_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/salesforce_to_s3.html
index efa25a71f9..cbef3b2616 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/salesforce_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/salesforce_to_s3.html
@@ -605,7 +605,7 @@ are initially written to a local, temporary directory and then uploaded to an S3
 <p>The following example demonstrates a use case of extracting customer data from a Salesforce
 instance and upload to a &quot;landing&quot; bucket in S3.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">upload_salesforce_data_to_s3_landing</span> <span class="o">=</span> <span class="n">SalesforceToS3Operator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;upload_salesforce_data_to_s3&quot;</span><span class="p">,</span>
         <span class="n">salesforce_query</span><span class="o">=</span><span class="s2">&quot;SELECT Id, Name, Company, Phone, Email, LastModifiedDate, IsActive FROM Customers&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sns.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sns.html
index 2182a45a8a..f0fb5251ba 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sns.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sns.html
@@ -610,7 +610,7 @@ messages (SMS).</p>
 <p>To publish a message to an Amazon SNS Topic you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sns/index.html#airflow.providers.amazon.aws.operators.sns.SnsPublishOperator" title="airflow.providers.amazon.aws.operators.sns.SnsPublishOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SnsPublishOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sns.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sns.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sns.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_sns.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">publish</span> <span class="o">=</span> <span class="n">SnsPublishOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;publish_message&#39;</span><span class="p">,</span>
     <span class="n">target_arn</span><span class="o">=</span><span class="n">SNS_TOPIC_ARN</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sqs_publish.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sqs_publish.html
index 365e49fc62..52e83851ba 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sqs_publish.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/sqs_publish.html
@@ -627,7 +627,7 @@ to publish a message to Amazon Simple Queue Service (SQS).</p>
 <p>In the following example, the task &quot;publish_to_queue&quot; publishes a message containing
 the task instance and the execution date to a queue named <code class="docutils literal notranslate"><span class="pre">Airflow-Example-Queue</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sqs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_sqs.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
     <span class="c1"># Using a task-decorated function to create an SQS queue</span>
     <span class="n">create_queue</span> <span class="o">=</span> <span class="n">create_queue_fn</span><span class="p">()</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/dynamodb_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/dynamodb_to_s3.html
index ae50633944..379f01125b 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/dynamodb_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/dynamodb_to_s3.html
@@ -628,7 +628,7 @@ records that satisfy the criteria.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/dynamodb_to_s3/index.html#airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBToS3Operator" title="airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DynamoDBToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">backup_db</span> <span class="o">=</span> <span class="n">DynamoDBToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;backup_db&#39;</span><span class="p">,</span>
     <span class="n">dynamodb_table_name</span><span class="o">=</span><span class="n">TABLE_NAME</span><span class="p">,</span>
@@ -642,7 +642,7 @@ records that satisfy the criteria.</p>
 <p>To parallelize the replication, users can create multiple DynamoDBToS3Operator tasks using the
 <code class="docutils literal notranslate"><span class="pre">TotalSegments</span></code> parameter.  For instance to replicate with parallelism of 2, create two tasks:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Segmenting allows the transfer to be parallelized into {segment} number of parallel tasks.</span>
 <span class="n">backup_db_segment_1</span> <span class="o">=</span> <span class="n">DynamoDBToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;backup-1&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/glacier_to_gcs.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/glacier_to_gcs.html
index 7bf3c39054..76cdc0a38d 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/glacier_to_gcs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/glacier_to_gcs.html
@@ -610,7 +610,7 @@ Transferring big files may not work well.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/glacier_to_gcs/index.html#airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator" title="airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierToGCSOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/imap_attachment_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/imap_attachment_to_s3.html
index d655dcd01f..640587ef41 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/imap_attachment_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/imap_attachment_to_s3.html
@@ -619,7 +619,7 @@ protocol from a mail server to an Amazon S3 Bucket.</p>
 <div class="section" id="imap-attachment-to-amazon-s3">
 <span id="howto-operator-imapattachmenttos3operator"></span><h2>Imap Attachment To Amazon S3<a class="headerlink" href="#imap-attachment-to-amazon-s3" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_imap_attachment_to_s3</span> <span class="o">=</span> <span class="n">ImapAttachmentToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;transfer_imap_attachment_to_s3&#39;</span><span class="p">,</span>
     <span class="n">imap_attachment_name</span><span class="o">=</span><span class="n">IMAP_ATTACHMENT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/redshift_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/redshift_to_s3.html
index 5893fedbcb..47a31b4c2f 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/redshift_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/redshift_to_s3.html
@@ -623,7 +623,7 @@ Service (S3) file.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/redshift_to_s3/index.html#airflow.providers.amazon.aws.transfers.redshift_to_s3.RedshiftToS3Operator" title="airflow.providers.amazon.aws.transfers.redshift_to_s3.RedshiftToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_redshift_to_s3</span> <span class="o">=</span> <span class="n">RedshiftToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;transfer_redshift_to_s3&#39;</span><span class="p">,</span>
     <span class="n">s3_bucket</span><span class="o">=</span><span class="n">S3_BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/s3_to_redshift.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/s3_to_redshift.html
index 68a3f8d8fc..48e6c33a9e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/s3_to_redshift.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/s3_to_redshift.html
@@ -623,7 +623,7 @@ Amazon Redshift table.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_redshift/index.html#airflow.providers.amazon.aws.transfers.s3_to_redshift.S3ToRedshiftOperator" title="airflow.providers.amazon.aws.transfers.s3_to_redshift.S3ToRedshiftOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToRedshiftOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_s3_to_redshift</span> <span class="o">=</span> <span class="n">S3ToRedshiftOperator</span><span class="p">(</span>
     <span class="n">s3_bucket</span><span class="o">=</span><span class="n">S3_BUCKET_NAME</span><span class="p">,</span>
     <span class="n">s3_key</span><span class="o">=</span><span class="n">S3_KEY</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/s3_to_sftp.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/s3_to_sftp.html
index 24975ef6b8..44e35ba661 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/s3_to_sftp.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/s3_to_sftp.html
@@ -624,7 +624,7 @@ For more information about the service visits <a class="reference external" href
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_sftp/index.html#airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator" title="airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToSFTPOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_s3_to_sftp_job</span> <span class="o">=</span> <span class="n">S3ToSFTPOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_s3_to_sftp_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_path</span><span class="o">=</span><span class="s2">&quot;sftp_path&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/sftp_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/sftp_to_s3.html
index 8744483228..ec4d569591 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/sftp_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.2.0/operators/transfer/sftp_to_s3.html
@@ -623,7 +623,7 @@ For more information about the service visits <a class="reference external" href
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/sftp_to_s3/index.html#airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator" title="airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SFTPToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.2.0/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_sftp_to_s3_job</span> <span class="o">=</span> <span class="n">SFTPToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_sftp_to_s3_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_path</span><span class="o">=</span><span class="s2">&quot;/tmp/sftp_path&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/athena.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/athena.html
index 562c4d211b..f4f8f51c5d 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/athena.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/athena.html
@@ -630,7 +630,7 @@ created in an S3 bucket and populated with SAMPLE_DATA.  The example waits for t
 to complete and then drops the created table and deletes the sample CSV file in the S3
 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_athena.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_athena.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
     <span class="c1"># Using a task-decorated function to create a CSV file in S3</span>
     <span class="n">add_sample_data_to_s3</span> <span class="o">=</span> <span class="n">add_sample_data_to_s3</span><span class="p">()</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/batch.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/batch.html
index 1f2a6e1341..1cfaabdbc7 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/batch.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/batch.html
@@ -620,7 +620,7 @@ infrastructure.</p>
 <p>To wait on the state of an AWS Batch Job until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/batch/index.html#airflow.providers.amazon.aws.sensors.batch.BatchSensor" title="airflow.providers.amazon.aws.sensors.batch.BatchSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">BatchSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_batch.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_batch.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_batch_job</span> <span class="o">=</span> <span class="n">BatchSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_batch_job&#39;</span><span class="p">,</span>
     <span class="n">job_id</span><span class="o">=</span><span class="n">JOB_ID</span><span class="p">,</span>
@@ -634,7 +634,7 @@ use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sen
 <p>To submit a new AWS Batch Job and monitor it until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/batch/index.html#airflow.providers.amazon.aws.operators.batch.BatchOperator" title="airflow.providers.amazon.aws.operators.batch.BatchOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BatchOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_batch.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_batch.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_batch_job</span> <span class="o">=</span> <span class="n">BatchOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;submit_batch_job&#39;</span><span class="p">,</span>
     <span class="n">job_name</span><span class="o">=</span><span class="n">JOB_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/cloudformation.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/cloudformation.html
index 6671309f3c..dff790548e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/cloudformation.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/cloudformation.html
@@ -622,7 +622,7 @@ create and delete a collection of resources together as a single unit (a stack).
 <p>To create a new AWS CloudFormation stack use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/cloud_formation/index.html#airflow.providers.amazon.aws.operators.cloud_formation.CloudFormationCreateStackOperator" title="airflow.providers.amazon.aws.operators.cloud_formation.CloudFormationCreateStackOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudFormationCreateStackOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_cloudformation.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_cloudformation.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_stack</span> <span class="o">=</span> <span class="n">CloudFormationCreateStackOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_stack&#39;</span><span class="p">,</span>
     <span class="n">stack_name</span><span class="o">=</span><span class="n">CLOUDFORMATION_STACK_NAME</span><span class="p">,</span>
@@ -637,7 +637,7 @@ create and delete a collection of resources together as a single unit (a stack).
 <p>To wait on the state of an AWS CloudFormation stack creation until it reaches a terminal state you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/cloud_formation/index.html#airflow.providers.amazon.aws.sensors.cloud_formation.CloudFormationCreateStackSensor" title="airflow.providers.amazon.aws.sensors.cloud_formation.CloudFormationCreateStackSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudFormationCreateStackSensor</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_cloudformation.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_cloudformation.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_stack_create</span> <span class="o">=</span> <span class="n">CloudFormationCreateStackSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_stack_creation&#39;</span><span class="p">,</span> <span class="n">stack_name</span><span class="o">=</span><span class="n">CLOUDFORMATION_STACK_NAME</span>
 <span class="p">)</span>
@@ -650,7 +650,7 @@ create and delete a collection of resources together as a single unit (a stack).
 <p>To delete an AWS CloudFormation stack you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/cloud_formation/index.html#airflow.providers.amazon.aws.operators.cloud_formation.CloudFormationDeleteStackOperator" title="airflow.providers.amazon.aws.operators.cloud_formation.CloudFormationDeleteStackOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudFormationDeleteStackOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_cloudformation.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_cloudformation.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_stack</span> <span class="o">=</span> <span class="n">CloudFormationDeleteStackOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_stack&#39;</span><span class="p">,</span> <span class="n">stack_name</span><span class="o">=</span><span class="n">CLOUDFORMATION_STACK_NAME</span>
 <span class="p">)</span>
@@ -663,7 +663,7 @@ create and delete a collection of resources together as a single unit (a stack).
 <p>To wait on the state of an AWS CloudFormation stack deletion until it reaches a terminal state you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/cloud_formation/index.html#airflow.providers.amazon.aws.sensors.cloud_formation.CloudFormationDeleteStackSensor" title="airflow.providers.amazon.aws.sensors.cloud_formation.CloudFormationDeleteStackSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudFormationDeleteStackSensor</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_cloudformation.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_cloudformation.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_stack_delete</span> <span class="o">=</span> <span class="n">CloudFormationDeleteStackSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_stack_deletion&#39;</span><span class="p">,</span> <span class="n">trigger_rule</span><span class="o">=</span><span class="s1">&#39;all_done&#39;</span><span class="p">,</span> <span class="n">stack_name</span><span class="o">=</span><span class="n">CLOUDFORMATION_STACK_NAME</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/datasync.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/datasync.html
index 30f40b911d..dce872054d 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/datasync.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/datasync.html
@@ -622,13 +622,13 @@ and an <em>AWS DataSync Task</em> (identified by a TaskArn on AWS).</p>
 <h3>Environment variables<a class="headerlink" href="#environment-variables" title="Permalink to this headline">¶</a></h3>
 <p>These examples rely on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">TASK_ARN</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;TASK_ARN&quot;</span><span class="p">,</span> <span class="s2">&quot;my_aws_datasync_task_arn&quot;</span><span class="p">)</span>
 </pre></div>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SOURCE_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;SOURCE_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;smb://hostname/directory/&quot;</span><span class="p">)</span>
 
 <span class="n">DESTINATION_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;DESTINATION_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://mybucket/prefix&quot;</span><span class="p">)</span>
@@ -641,7 +641,7 @@ and an <em>AWS DataSync Task</em> (identified by a TaskArn on AWS).</p>
 <p>The <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/datasync/index.html#airflow.providers.amazon.aws.operators.datasync.DataSyncOperator" title="airflow.providers.amazon.aws.operators.datasync.DataSyncOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DataSyncOperator</span></code></a> can execute a specific
 TaskArn by specifying the <code class="docutils literal notranslate"><span class="pre">task_arn</span></code> parameter. This is useful when you know the TaskArn you want to execute.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task_1</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task_1&quot;</span><span class="p">,</span> <span class="n">task_arn</span><span class="o">=</span><span class="n">TASK_ARN</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -657,7 +657,7 @@ can iterate all DataSync Tasks for their source and destination LocationArns. Th
 each LocationArn to see if its URIs match the desired source / destination URI.</p>
 <p>To perform a search based on the Location URIs, define the task as follows</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_1.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_1.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_datasync_1.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task_2</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task_2&quot;</span><span class="p">,</span>
         <span class="n">source_location_uri</span><span class="o">=</span><span class="n">SOURCE_LOCATION_URI</span><span class="p">,</span>
@@ -685,7 +685,7 @@ Finally, delete it.</p>
 <h3>Environment variables<a class="headerlink" href="#id2" title="Permalink to this headline">¶</a></h3>
 <p>This example relies on the following variables, which can be passed via OS environment variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_datasync_2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SOURCE_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;SOURCE_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;smb://hostname/directory/&quot;</span><span class="p">)</span>
 
 <span class="n">DESTINATION_LOCATION_URI</span> <span class="o">=</span> <span class="n">getenv</span><span class="p">(</span><span class="s2">&quot;DESTINATION_LOCATION_URI&quot;</span><span class="p">,</span> <span class="s2">&quot;s3://mybucket/prefix&quot;</span><span class="p">)</span>
@@ -723,7 +723,7 @@ as before but with some extra arguments.</p>
 and/or Locations if no suitable existing Task was found. If these are left to their default value (None)
 then no create will be attempted.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync_2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync_2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_datasync_2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">datasync_task</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_task&quot;</span><span class="p">,</span>
         <span class="n">source_location_uri</span><span class="o">=</span><span class="n">SOURCE_LOCATION_URI</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/dms.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/dms.html
index a89bd851a8..23f93f60bd 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/dms.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/dms.html
@@ -646,7 +646,7 @@ to be completed, and then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new replication task, start it, wait for it to be completed and then delete it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_task</span> <span class="o">=</span> <span class="n">DmsCreateTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_task&#39;</span><span class="p">,</span>
         <span class="n">replication_task_id</span><span class="o">=</span><span class="n">REPLICATION_TASK_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/ecs.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/ecs.html
index 1ec85155e2..312bde4c6e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/ecs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/ecs.html
@@ -635,7 +635,7 @@ scale containerized applications.</p>
 <li><p>If you have integrated external resources in your ECS Cluster, for example using ECS Anywhere, and want to run your containers on those external resources, set the parameter to EXTERNAL.</p></li>
 </ul>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">hello_world</span> <span class="o">=</span> <span class="n">EcsOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;hello_world&quot;</span><span class="p">,</span>
         <span class="n">cluster</span><span class="o">=</span><span class="n">os</span><span class="o">.</span><span class="n">environ</span><span class="o">.</span><span class="n">get</span><span class="p">(</span><span class="s2">&quot;CLUSTER_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;existing_cluster_name&quot;</span><span class="p">),</span>
@@ -665,7 +665,7 @@ scale containerized applications.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">hello_world</span> <span class="o">=</span> <span class="n">EcsOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;hello_world&quot;</span><span class="p">,</span>
         <span class="n">cluster</span><span class="o">=</span><span class="n">os</span><span class="o">.</span><span class="n">environ</span><span class="o">.</span><span class="n">get</span><span class="p">(</span><span class="s2">&quot;CLUSTER_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;existing_cluster_name&quot;</span><span class="p">),</span>
@@ -704,7 +704,7 @@ scale containerized applications.</p>
 <h3>CloudWatch Logging<a class="headerlink" href="#cloudwatch-logging" title="Permalink to this headline">¶</a></h3>
 <p>To stream logs to AWS CloudWatch, you need to define these parameters. Using the example Operators above, we would add these additional parameters to enable logging to CloudWatch. You will need to ensure that you have the appropriate level of permissions (see next section)</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>        <span class="n">awslogs_group</span><span class="o">=</span><span class="s2">&quot;/ecs/hello-world&quot;</span><span class="p">,</span>
         <span class="n">awslogs_region</span><span class="o">=</span><span class="s2">&quot;aws-region&quot;</span><span class="p">,</span>
         <span class="n">awslogs_stream_prefix</span><span class="o">=</span><span class="s2">&quot;ecs/hello-world-container&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/eks.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/eks.html
index f568b10568..d5ce65ba12 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/eks.html
@@ -609,7 +609,7 @@ and management of containerized applications.</p>
 <p>To check the state of an Amazon EKS Cluster until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/eks/index.html#airflow.providers.amazon.aws.sensors.eks.EksClusterStateSensor" title="airflow.providers.amazon.aws.sensors.eks.EksClusterStateSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksClusterStateSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_cluster</span> <span class="o">=</span> <span class="n">EksClusterStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -629,7 +629,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS Cluster control plane without attaching compute service.</span>
 <span class="n">create_cluster</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster&#39;</span><span class="p">,</span>
@@ -647,7 +647,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To delete an existing Amazon EKS Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteClusterOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_cluster</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -661,7 +661,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># An Amazon EKS cluster can not be deleted with attached resources such as nodegroups or Fargate profiles.</span>
 <span class="c1"># Setting the `force` to `True` will delete any attached resources before deleting the cluster.</span>
 <span class="n">delete_all</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span>
@@ -681,7 +681,7 @@ attempt to delete any attached resources first.</p>
 <p>To check the state of an Amazon EKS managed node group until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/eks/index.html#airflow.providers.amazon.aws.sensors.eks.EksNodegroupStateSensor" title="airflow.providers.amazon.aws.sensors.eks.EksNodegroupStateSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksNodegroupStateSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_nodegroup</span> <span class="o">=</span> <span class="n">EksNodegroupStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -703,7 +703,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateNodegroupOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -720,7 +720,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To delete an existing Amazon EKS Managed Nodegroup you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteNodegroupOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_nodegroup</span> <span class="o">=</span> <span class="n">EksDeleteNodegroupOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -744,7 +744,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS cluster control plane and an EKS nodegroup compute platform in one step.</span>
 <span class="n">create_cluster_and_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_nodegroup&#39;</span><span class="p">,</span>
@@ -775,7 +775,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS cluster control plane and an AWS Fargate compute platform in one step.</span>
 <span class="n">create_cluster_and_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_fargate_profile&#39;</span><span class="p">,</span>
@@ -799,7 +799,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To check the state of an AWS Fargate profile until it reaches the target state or another terminal
 state you can use <code class="xref py py-class docutils literal notranslate"><span class="pre">EksFargateProfileSensor</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_fargate_profile</span> <span class="o">=</span> <span class="n">EksFargateProfileStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -821,7 +821,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateFargateProfileOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -838,7 +838,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 <p>To delete an existing AWS Fargate Profile you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteFargateProfileOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_fargate_profile</span> <span class="o">=</span> <span class="n">EksDeleteFargateProfileOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -855,7 +855,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksPodOperator" title="airflow.providers.amazon.aws.operators.eks.EksPodOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksPodOperator</span></code></a>.</p>
 <p>Note: An Amazon EKS Cluster with underlying compute infrastructure is required.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_pod</span> <span class="o">=</span> <span class="n">EksPodOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;run_pod&quot;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/emr.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/emr.html
index a9efd1f1e0..a97f5eac4e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/emr.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/emr.html
@@ -633,7 +633,7 @@ create a new EMR job flow.  The cluster will be terminated automatically after f
 <h3>JobFlow configuration<a class="headerlink" href="#jobflow-configuration" title="Permalink to this headline">¶</a></h3>
 <p>To create a job flow on EMR, you need to specify the configuration for the EMR cluster:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SPARK_STEPS</span> <span class="o">=</span> <span class="p">[</span>
     <span class="p">{</span>
         <span class="s1">&#39;Name&#39;</span><span class="p">:</span> <span class="s1">&#39;calculate_pi&#39;</span><span class="p">,</span>
@@ -686,7 +686,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <h3>Create the Job Flow<a class="headerlink" href="#create-the-job-flow" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are creating a new job flow using the configuration as explained above.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_flow_creator</span> <span class="o">=</span> <span class="n">EmrCreateJobFlowOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_job_flow&#39;</span><span class="p">,</span>
     <span class="n">job_flow_overrides</span><span class="o">=</span><span class="n">JOB_FLOW_OVERRIDES</span><span class="p">,</span>
@@ -701,7 +701,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To add Steps to an existing EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/emr/index.html#airflow.providers.amazon.aws.operators.emr.EmrAddStepsOperator" title="airflow.providers.amazon.aws.operators.emr.EmrAddStepsOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrAddStepsOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">step_adder</span> <span class="o">=</span> <span class="n">EmrAddStepsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;add_steps&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -716,7 +716,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To terminate an EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/emr/index.html#airflow.providers.amazon.aws.operators.emr.EmrTerminateJobFlowOperator" title="airflow.providers.amazon.aws.operators.emr.EmrTerminateJobFlowOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrTerminateJobFlowOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">cluster_remover</span> <span class="o">=</span> <span class="n">EmrTerminateJobFlowOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;remove_cluster&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -740,7 +740,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To monitor the state of an EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/emr/index.html#airflow.providers.amazon.aws.sensors.emr.EmrJobFlowSensor" title="airflow.providers.amazon.aws.sensors.emr.EmrJobFlowSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrJobFlowSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_sensor</span> <span class="o">=</span> <span class="n">EmrJobFlowSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;check_job_flow&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">job_flow_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -754,7 +754,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To monitor the state of a Step running an existing EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/emr/index.html#airflow.providers.amazon.aws.sensors.emr.EmrStepSensor" title="airflow.providers.amazon.aws.sensors.emr.EmrStepSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrStepSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">step_checker</span> <span class="o">=</span> <span class="n">EmrStepSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;watch_step&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/emr_eks.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/emr_eks.html
index 6266c602c1..af4d1fd4e7 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/emr_eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/emr_eks.html
@@ -639,7 +639,7 @@ and <code class="docutils literal notranslate"><span class="pre">monitoringConfi
 Refer to the <a class="reference external" href="https://docs.aws.amazon.com/emr/latest/EMR-on-EKS-DevelopmentGuide/emr-eks-jobs-CLI.html#emr-eks-jobs-parameters">EMR on EKS guide</a>
 for more details on job configuration.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">JOB_DRIVER_ARG</span> <span class="o">=</span> <span class="p">{</span>
     <span class="s2">&quot;sparkSubmitJobDriver&quot;</span><span class="p">:</span> <span class="p">{</span>
         <span class="s2">&quot;entryPoint&quot;</span><span class="p">:</span> <span class="s2">&quot;local:///usr/lib/spark/examples/src/main/python/pi.py&quot;</span><span class="p">,</span>
@@ -671,7 +671,7 @@ can store them in a connection or provide them in the DAG. Your AWS region shoul
 in the <code class="docutils literal notranslate"><span class="pre">aws_default</span></code> connection as <code class="docutils literal notranslate"><span class="pre">{&quot;region_name&quot;:</span> <span class="pre">&quot;us-east-1&quot;}</span></code> or a custom connection name
 that gets passed to the operator with the <code class="docutils literal notranslate"><span class="pre">aws_conn_id</span></code> parameter.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_starter</span> <span class="o">=</span> <span class="n">EmrContainerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_job&quot;</span><span class="p">,</span>
     <span class="n">virtual_cluster_id</span><span class="o">=</span><span class="n">VIRTUAL_CLUSTER_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/glacier.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/glacier.html
index df53b8721d..b940fb4e06 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/glacier.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/glacier.html
@@ -604,7 +604,7 @@ The operation returns dictionary of information related to the initiated job lik
 <code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierCreateJobOperator</span></code></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_glacier_job</span> <span class="o">=</span> <span class="n">GlacierCreateJobOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_glacier_job&quot;</span><span class="p">,</span> <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">)</span>
 <span class="n">JOB_ID</span> <span class="o">=</span> <span class="s1">&#39;{{ task_instance.xcom_pull(&quot;create_glacier_job&quot;)[&quot;jobId&quot;] }}&#39;</span>
 </pre></div>
@@ -629,7 +629,7 @@ Which means that every next request will be sent every 20 minutes.</p>
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glacier/index.html#airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor" title="airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierJobOperationSensor</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/glue.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/glue.html
index f8207bf54d..094274df04 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/glue.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/glue.html
@@ -621,7 +621,7 @@ your data and putting it to use in minutes instead of months.</p>
 To create a new AWS Glue Crawler or run an existing one you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/glue_crawler/index.html#airflow.providers.amazon.aws.operators.glue_crawler.GlueCrawlerOperator" title="airflow.providers.amazon.aws.operators.glue_crawler.GlueCrawlerOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueCrawlerOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">crawl_s3</span> <span class="o">=</span> <span class="n">GlueCrawlerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;crawl_s3&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="n">GLUE_CRAWLER_CONFIG</span><span class="p">,</span>
@@ -639,7 +639,7 @@ policy. See the References section below for a link to more details.</p>
 <p>To wait on the state of an AWS Glue Crawler execution until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glue_crawler/index.html#airflow.providers.amazon.aws.sensors.glue_crawler.GlueCrawlerSensor" title="airflow.providers.amazon.aws.sensors.glue_crawler.GlueCrawlerSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueCrawlerSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_crawl</span> <span class="o">=</span> <span class="n">GlueCrawlerSensor</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_crawl&#39;</span><span class="p">,</span> <span class="n">crawler_name</span><span class="o">=</span><span class="n">GLUE_CRAWLER_NAME</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -649,7 +649,7 @@ use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sen
 <span id="howto-operator-gluejoboperator"></span><h3>AWS Glue Job Operator<a class="headerlink" href="#aws-glue-job-operator" title="Permalink to this headline">¶</a></h3>
 <p>To submit a new AWS Glue Job you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/glue/index.html#airflow.providers.amazon.aws.operators.glue.GlueJobOperator" title="airflow.providers.amazon.aws.operators.glue.GlueJobOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueJobOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_name</span> <span class="o">=</span> <span class="s1">&#39;example_glue_job&#39;</span>
 <span class="n">submit_glue_job</span> <span class="o">=</span> <span class="n">GlueJobOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;submit_glue_job&#39;</span><span class="p">,</span>
@@ -671,7 +671,7 @@ policies to provide access to the output location for result data.</p>
 <p>To wait on the state of an AWS Glue Job until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glue/index.html#airflow.providers.amazon.aws.sensors.glue.GlueJobSensor" title="airflow.providers.amazon.aws.sensors.glue.GlueJobSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueJobSensor</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_job</span> <span class="o">=</span> <span class="n">GlueJobSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_job&#39;</span><span class="p">,</span>
     <span class="n">job_name</span><span class="o">=</span><span class="n">job_name</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/lambda.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/lambda.html
index 7be8862087..500114811e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/lambda.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/lambda.html
@@ -609,7 +609,7 @@ and only pay for what you use.</p>
 <p>To publish a message to an Amazon SNS Topic you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/aws_lambda/index.html#airflow.providers.amazon.aws.operators.aws_lambda.AwsLambdaInvokeFunctionOperator" title="airflow.providers.amazon.aws.operators.aws_lambda.AwsLambdaInvokeFunctionOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AwsLambdaInvokeFunctionOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_lambda.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_lambda.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_lambda.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_lambda.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">invoke_lambda_function</span> <span class="o">=</span> <span class="n">AwsLambdaInvokeFunctionOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;setup__invoke_lambda_function&#39;</span><span class="p">,</span>
     <span class="n">function_name</span><span class="o">=</span><span class="n">LAMBDA_FUNCTION_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/rds.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/rds.html
index dd65e3b3ba..16a3bffb72 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/rds.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/rds.html
@@ -611,7 +611,7 @@
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCreateDBSnapshotOperator</span></code>.
 The source DB instance must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> or <code class="docutils literal notranslate"><span class="pre">storage-optimization</span></code> state.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;rds_snapshots&#39;</span><span class="p">,</span> <span class="n">start_date</span><span class="o">=</span><span class="n">datetime</span><span class="p">(</span><span class="mi">2021</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">),</span> <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span [...]
 <span class="p">)</span> <span class="k">as</span> <span class="n">dag</span><span class="p">:</span>
@@ -653,7 +653,7 @@ The source DB instance must be in the <code class="docutils literal notranslate"
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCopyDBSnapshotOperator</span></code>.
 The source DB snapshot must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> state.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">copy_snapshot</span> <span class="o">=</span> <span class="n">RdsCopyDbSnapshotOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;copy_snapshot&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -675,7 +675,7 @@ The source DB snapshot must be in the <code class="docutils literal notranslate"
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSDeleteDBSnapshotOperator</span></code>.
 The DB snapshot must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> state to be deleted.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_snapshot</span> <span class="o">=</span> <span class="n">RdsDeleteDbSnapshotOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_snapshot&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -696,7 +696,7 @@ The DB snapshot must be in the <code class="docutils literal notranslate"><span
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSStartExportTaskOperator</span></code>.
 The provided IAM role must have access to the S3 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">start_export</span> <span class="o">=</span> <span class="n">RdsStartExportTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;start_export&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
@@ -720,7 +720,7 @@ The provided IAM role must have access to the S3 bucket.</p>
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCancelExportTaskOperator</span></code>.
 Any data that has already been written to the S3 bucket isn’t removed.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">cancel_export</span> <span class="o">=</span> <span class="n">RdsCancelExportTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;cancel_export&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
@@ -742,7 +742,7 @@ To obtain an ARN with SNS, you must create a topic in Amazon SNS and subscribe t
 RDS event notification is only available for unencrypted SNS topics.
 If you specify an encrypted SNS topic, event notifications are not sent for the topic.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_subscription</span> <span class="o">=</span> <span class="n">RdsCreateEventSubscriptionOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_subscription&#39;</span><span class="p">,</span>
         <span class="n">subscription_name</span><span class="o">=</span><span class="s1">&#39;my-topic-subscription&#39;</span><span class="p">,</span>
@@ -764,7 +764,7 @@ If you specify an encrypted SNS topic, event notifications are not sent for the
 <p>To delete event subscription you can use
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSDeleteEventSubscriptionOperator</span></code></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_subscription</span> <span class="o">=</span> <span class="n">RdsDeleteEventSubscriptionOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_subscription&#39;</span><span class="p">,</span>
         <span class="n">subscription_name</span><span class="o">=</span><span class="s1">&#39;my-topic-subscription&#39;</span><span class="p">,</span>
@@ -792,7 +792,7 @@ If you specify an encrypted SNS topic, event notifications are not sent for the
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/rds/index.html#airflow.providers.amazon.aws.sensors.rds.RdsSnapshotExistenceSensor" title="airflow.providers.amazon.aws.sensors.rds.RdsSnapshotExistenceSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RdsSnapshotExistenceSensor</span></code></a>.
 By default, the sensor waits for the existence of a snapshot with status <code class="docutils literal notranslate"><span class="pre">available</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">snapshot_sensor</span> <span class="o">=</span> <span class="n">RdsSnapshotExistenceSensor</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;snapshot_sensor&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -811,7 +811,7 @@ By default, sensor waits existence of snapshot with status <code class="docutils
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/rds/index.html#airflow.providers.amazon.aws.sensors.rds.RdsExportTaskExistenceSensor" title="airflow.providers.amazon.aws.sensors.rds.RdsExportTaskExistenceSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RdsExportTaskExistenceSensor</span></code></a>.
 By default, the sensor waits for the existence of an export task with status <code class="docutils literal notranslate"><span class="pre">available</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">export_sensor</span> <span class="o">=</span> <span class="n">RdsExportTaskExistenceSensor</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;export_sensor&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
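The same mechanical rewrite repeats for every "[source]" button in this diff: a relative Sphinx viewcode link (`../_modules/<module>.html`) becomes a GitHub link to the matching `.py` file, pinned to the provider release tag `providers-amazon/3.3.0`. A minimal sketch of that transformation follows; this is illustrative only, not the actual script from the pull request referenced in the commit message, and the regex and function name are assumptions:

```python
import re

# Matches a Sphinx viewcode href of the form ../_modules/<module path>.html
VIEWCODE_HREF = re.compile(r'href="\.\./_modules/(?P<path>[^"]+)\.html"')

def rewrite_source_link(html: str, tag: str) -> str:
    """Rewrite relative viewcode hrefs to tag-pinned GitHub links (sketch)."""
    return VIEWCODE_HREF.sub(
        lambda m: (
            f'href="https://github.com/apache/airflow/tree/{tag}/'
            f'{m.group("path")}.py" target="_blank"'
        ),
        html,
    )

old = ('<a href="../_modules/airflow/providers/amazon/aws/'
       'example_dags/example_rds.html">')
print(rewrite_source_link(old, "providers-amazon/3.3.0"))
```

Pinning to the release tag rather than `main` keeps the link pointing at the code the documentation was actually built from, which is why each archived docs version gets its own tag in the URL.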
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_cluster.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_cluster.html
index 61c0b582d5..0b205eb8fa 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_cluster.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_cluster.html
@@ -624,7 +624,7 @@ business and customers.</p>
 <p>To check the state of an Amazon Redshift Cluster until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/redshift_cluster/index.html#airflow.providers.amazon.aws.sensors.redshift_cluster.RedshiftClusterSensor" title="airflow.providers.amazon.aws.sensors.redshift_cluster.RedshiftClusterSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftClusterSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_wait_cluster_available</span> <span class="o">=</span> <span class="n">RedshiftClusterSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;sensor_redshift_cluster_available&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
@@ -641,7 +641,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To resume a ‘paused’ Amazon Redshift Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_cluster/index.html#module-airflow.providers.amazon.aws.operators.redshift_cluster" title="airflow.providers.amazon.aws.operators.redshift_cluster"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftResumeClusterOperator</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_resume_cluster</span> <span class="o">=</span> <span class="n">RedshiftResumeClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_resume_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
@@ -655,7 +655,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To pause an ‘available’ Amazon Redshift Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_cluster/index.html#module-airflow.providers.amazon.aws.operators.redshift_cluster" title="airflow.providers.amazon.aws.operators.redshift_cluster"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftPauseClusterOperator</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_pause_cluster</span> <span class="o">=</span> <span class="n">RedshiftPauseClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_pause_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_data.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_data.html
index 91a648d384..e7d8cfaca9 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_data.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_data.html
@@ -622,7 +622,7 @@ statements against an Amazon Redshift cluster.</p>
 <p>This is a basic example DAG for using <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_data/index.html#module-airflow.providers.amazon.aws.operators.redshift_data" title="airflow.providers.amazon.aws.operators.redshift_data"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftDataOperator</span></code></a>
 to execute statements against an Amazon Redshift cluster.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_query</span> <span class="o">=</span> <span class="n">RedshiftDataOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_query&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_sql.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_sql.html
index 178f0fdfa6..dcd0a15251 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_sql.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/redshift_sql.html
@@ -623,7 +623,7 @@ business and customers.</p>
 <div class="section" id="execute-a-sql-query">
 <h3>Execute a SQL query<a class="headerlink" href="#execute-a-sql-query" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_redshift_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_select_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_all_table_data&#39;</span><span class="p">,</span> <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;&quot;&quot;CREATE TABLE more_fruit AS SELECT * FROM fruit;&quot;&quot;&quot;</span>
 <span class="p">)</span>
@@ -636,7 +636,7 @@ business and customers.</p>
 <p>RedshiftSQLOperator supports the <code class="docutils literal notranslate"><span class="pre">parameters</span></code> attribute, which allows parameters to be passed into SQL statements dynamically.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_redshift_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_select_filtered_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_filtered_table_data&#39;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;&quot;&quot;CREATE TABLE filtered_fruit AS SELECT * FROM fruit WHERE color = &#39;{{ params.color }}&#39;;&quot;&quot;&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/s3.html
index 41cae1428f..b26ccfde02 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/s3.html
@@ -635,7 +635,7 @@
 <p>To create an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3CreateBucketOperator" title="airflow.providers.amazon.aws.operators.s3.S3CreateBucketOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3CreateBucketOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_bucket</span> <span class="o">=</span> <span class="n">S3CreateBucketOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_create_bucket&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -649,7 +649,7 @@
 <p>To delete an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3DeleteBucketOperator" title="airflow.providers.amazon.aws.operators.s3.S3DeleteBucketOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3DeleteBucketOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_bucket</span> <span class="o">=</span> <span class="n">S3DeleteBucketOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_delete_bucket&#39;</span><span class="p">,</span> <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span> <span class="n">force_delete</span><span class="o">=</span><span class="kc">True</span>
 <span class="p">)</span>
@@ -662,7 +662,7 @@
 <p>To set the tags for an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3PutBucketTaggingOperator" title="airflow.providers.amazon.aws.operators.s3.S3PutBucketTaggingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3PutBucketTaggingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">put_tagging</span> <span class="o">=</span> <span class="n">S3PutBucketTaggingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_put_bucket_tagging&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -678,7 +678,7 @@
 <p>To get the tag set associated with an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3GetBucketTaggingOperator" title="airflow.providers.amazon.aws.operators.s3.S3GetBucketTaggingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3GetBucketTaggingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_tagging</span> <span class="o">=</span> <span class="n">S3GetBucketTaggingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_get_bucket_tagging&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -692,7 +692,7 @@
 <p>To delete the tags of an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3DeleteBucketTaggingOperator" title="airflow.providers.amazon.aws.operators.s3.S3DeleteBucketTaggingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3DeleteBucketTaggingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3_bucket.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_bucket.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_s3_bucket.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_tagging</span> <span class="o">=</span> <span class="n">S3DeleteBucketTaggingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_delete_bucket_tagging&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sagemaker.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sagemaker.html
index d045dcb447..d233c6a7ac 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sagemaker.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sagemaker.html
@@ -640,7 +640,7 @@ generate the models artifact in s3, create the model,
 training, Sagemaker Model, batch transform job and
 then delete the model.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="s2">&quot;sample_sagemaker_dag&quot;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sns.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sns.html
index 5197c17bdf..0319d6c2f9 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sns.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sns.html
@@ -610,7 +610,7 @@ messages (SMS).</p>
 <p>To publish a message to an Amazon SNS Topic you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sns/index.html#airflow.providers.amazon.aws.operators.sns.SnsPublishOperator" title="airflow.providers.amazon.aws.operators.sns.SnsPublishOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SnsPublishOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sns.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sns.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sns.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_sns.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">publish</span> <span class="o">=</span> <span class="n">SnsPublishOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;publish_message&#39;</span><span class="p">,</span>
     <span class="n">target_arn</span><span class="o">=</span><span class="n">SNS_TOPIC_ARN</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sqs_publish.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sqs_publish.html
index bb5bb7bbd2..dbcb139cb2 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sqs_publish.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/sqs_publish.html
@@ -627,7 +627,7 @@ to publish a message to Amazon Simple Queue Service (SQS).</p>
 <p>In the following example, the task “publish_to_queue” publishes a message containing
 the task instance and the execution date to a queue named <code class="docutils literal notranslate"><span class="pre">Airflow-Example-Queue</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sqs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_sqs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
     <span class="c1"># Using a task-decorated function to create an SQS queue</span>
     <span class="n">create_queue</span> <span class="o">=</span> <span class="n">create_queue_fn</span><span class="p">()</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/dynamodb_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/dynamodb_to_s3.html
index c3620e8a16..1298506d67 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/dynamodb_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/dynamodb_to_s3.html
@@ -628,7 +628,7 @@ records that satisfy the criteria.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/dynamodb_to_s3/index.html#airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBToS3Operator" title="airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DynamoDBToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">backup_db</span> <span class="o">=</span> <span class="n">DynamoDBToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;backup_db&#39;</span><span class="p">,</span>
     <span class="n">dynamodb_table_name</span><span class="o">=</span><span class="n">TABLE_NAME</span><span class="p">,</span>
@@ -642,7 +642,7 @@ records that satisfy the criteria.</p>
 <p>To parallelize the replication, users can create multiple DynamoDBToS3Operator tasks using the
 <code class="docutils literal notranslate"><span class="pre">TotalSegments</span></code> parameter. For instance, to replicate with a parallelism of 2, create two tasks:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Segmenting allows the transfer to be parallelized into {segment} number of parallel tasks.</span>
 <span class="n">backup_db_segment_1</span> <span class="o">=</span> <span class="n">DynamoDBToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;backup-1&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/ftp_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/ftp_to_s3.html
index 39048b32e4..36d4dc597c 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/ftp_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/ftp_to_s3.html
@@ -622,7 +622,7 @@
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/ftp_to_s3/index.html#airflow.providers.amazon.aws.transfers.ftp_to_s3.FTPToS3Operator" title="airflow.providers.amazon.aws.transfers.ftp_to_s3.FTPToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">FTPToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_ftp_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_ftp_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">ftp_to_s3_task</span> <span class="o">=</span> <span class="n">FTPToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;ftp_to_s3_task&quot;</span><span class="p">,</span>
     <span class="n">ftp_path</span><span class="o">=</span><span class="s2">&quot;/tmp/ftp_path&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/glacier_to_gcs.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/glacier_to_gcs.html
index 504f1561a9..8c0c862de9 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/glacier_to_gcs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/glacier_to_gcs.html
@@ -610,7 +610,7 @@ Transferring big files may not work well.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/glacier_to_gcs/index.html#airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator" title="airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierToGCSOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/google_api_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/google_api_to_s3.html
index 5faac032f9..6d30a75987 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/google_api_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/google_api_to_s3.html
@@ -620,7 +620,7 @@ on Amazon S3.</p>
 <span id="howto-operator-googleapitos3transfer"></span><h2>Google Sheets to Amazon S3<a class="headerlink" href="#google-sheets-to-amazon-s3" title="Permalink to this headline">¶</a></h2>
 <p>This example loads data from Google Sheets and saves it to an Amazon S3 file.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_sheets_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_sheets_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_sheets_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_google_api_sheets_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_google_sheets_values_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;google_sheet_data_to_s3&#39;</span><span class="p">,</span>
     <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;sheets&#39;</span><span class="p">,</span>
@@ -645,7 +645,7 @@ tasks to retrieve specific information about YouTube videos.</p>
 (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_AFTER</span></code>, <code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_BEFORE</span></code>) on a YouTube channel (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_CHANNEL_ID</span></code>)
 saves the response in Amazon S3 and also pushes the data to xcom.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_video_ids_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;video_ids_to_s3&#39;</span><span class="p">,</span>
     <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
@@ -670,7 +670,7 @@ saves the response in Amazon S3 and also pushes the data to xcom.</p>
 <p>It passes over the YouTube IDs to the next request which then gets the
 information (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_FIELDS</span></code>) for the requested videos and saves them in Amazon S3 (<code class="docutils literal notranslate"><span class="pre">S3_BUCKET_NAME</span></code>).</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_video_data_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;video_data_to_s3&#39;</span><span class="p">,</span>
     <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/hive_to_dynamodb.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/hive_to_dynamodb.html
index 78b320f49e..fa5142a02e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/hive_to_dynamodb.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/hive_to_dynamodb.html
@@ -625,7 +625,7 @@ to use as filtering criteria.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/hive_to_dynamodb/index.html#airflow.providers.amazon.aws.transfers.hive_to_dynamodb.HiveToDynamoDBOperator" title="airflow.providers.amazon.aws.transfers.hive_to_dynamodb.HiveToDynamoDBOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">HiveToDynamoDBOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_hive_to_dynamodb.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_hive_to_dynamodb.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_hive_to_dynamodb.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_hive_to_dynamodb.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">backup_to_dynamodb</span> <span class="o">=</span> <span class="n">HiveToDynamoDBOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;backup_to_dynamodb&#39;</span><span class="p">,</span>
     <span class="n">hiveserver2_conn_id</span><span class="o">=</span><span class="n">HIVE_CONNECTION_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/imap_attachment_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/imap_attachment_to_s3.html
index f9142a4275..de5003fbac 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/imap_attachment_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/imap_attachment_to_s3.html
@@ -619,7 +619,7 @@ protocol from a mail server to an Amazon S3 Bucket.</p>
 <div class="section" id="imap-attachment-to-amazon-s3">
 <span id="howto-operator-imapattachmenttos3operator"></span><h2>Imap Attachment To Amazon S3<a class="headerlink" href="#imap-attachment-to-amazon-s3" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_imap_attachment_to_s3</span> <span class="o">=</span> <span class="n">ImapAttachmentToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;transfer_imap_attachment_to_s3&#39;</span><span class="p">,</span>
     <span class="n">imap_attachment_name</span><span class="o">=</span><span class="n">IMAP_ATTACHMENT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/local_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/local_to_s3.html
index 4bd1590ccd..bc39564a4b 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/local_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/local_to_s3.html
@@ -623,7 +623,7 @@ to an Amazon Simple Storage Service (S3) file.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/local_to_s3/index.html#airflow.providers.amazon.aws.transfers.local_to_s3.LocalFilesystemToS3Operator" title="airflow.providers.amazon.aws.transfers.local_to_s3.LocalFilesystemToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">LocalFilesystemToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_local_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_local_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_local_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_local_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_local_to_s3_job</span> <span class="o">=</span> <span class="n">LocalFilesystemToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_local_to_s3_job&quot;</span><span class="p">,</span>
     <span class="n">filename</span><span class="o">=</span><span class="s2">&quot;relative/path/to/file.csv&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/mongo_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/mongo_to_s3.html
index a58cc0ed67..652f436ae8 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/mongo_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/mongo_to_s3.html
@@ -623,7 +623,7 @@ In order to select the data you want to copy, you need to use the <code class="d
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/mongo_to_s3/index.html#airflow.providers.amazon.aws.transfers.mongo_to_s3.MongoToS3Operator" title="airflow.providers.amazon.aws.transfers.mongo_to_s3.MongoToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">MongoToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_mongo_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_mongo_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_mongo_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_mongo_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_local_to_s3_job</span> <span class="o">=</span> <span class="n">MongoToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_mongo_to_s3_job&quot;</span><span class="p">,</span>
     <span class="n">mongo_collection</span><span class="o">=</span><span class="n">MONGO_COLLECTION</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/redshift_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/redshift_to_s3.html
index 1a7cf456d5..7a100101b4 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/redshift_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/redshift_to_s3.html
@@ -623,7 +623,7 @@ Service (S3) file.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/redshift_to_s3/index.html#airflow.providers.amazon.aws.transfers.redshift_to_s3.RedshiftToS3Operator" title="airflow.providers.amazon.aws.transfers.redshift_to_s3.RedshiftToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_redshift_to_s3</span> <span class="o">=</span> <span class="n">RedshiftToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;transfer_redshift_to_s3&#39;</span><span class="p">,</span>
     <span class="n">s3_bucket</span><span class="o">=</span><span class="n">S3_BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_ftp.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_ftp.html
index cfdf42bf1f..640973ce6f 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_ftp.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_ftp.html
@@ -623,7 +623,7 @@ using FTP protocol.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_ftp/index.html#airflow.providers.amazon.aws.transfers.s3_to_ftp.S3ToFTPOperator" title="airflow.providers.amazon.aws.transfers.s3_to_ftp.S3ToFTPOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToFTPOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_ftp.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_ftp.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_ftp.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_s3_to_ftp.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">s3_to_ftp_task</span> <span class="o">=</span> <span class="n">S3ToFTPOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;ftp_to_s3_task&quot;</span><span class="p">,</span>
     <span class="n">ftp_path</span><span class="o">=</span><span class="s2">&quot;/tmp/ftp_path&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_redshift.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_redshift.html
index d7499e983d..8e6dc8932b 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_redshift.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_redshift.html
@@ -623,7 +623,7 @@ Amazon Redshift table.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_redshift/index.html#airflow.providers.amazon.aws.transfers.s3_to_redshift.S3ToRedshiftOperator" title="airflow.providers.amazon.aws.transfers.s3_to_redshift.S3ToRedshiftOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToRedshiftOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_s3_to_redshift</span> <span class="o">=</span> <span class="n">S3ToRedshiftOperator</span><span class="p">(</span>
     <span class="n">s3_bucket</span><span class="o">=</span><span class="n">S3_BUCKET_NAME</span><span class="p">,</span>
     <span class="n">s3_key</span><span class="o">=</span><span class="n">S3_KEY</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_sftp.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_sftp.html
index cb81720c00..0c1003301a 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_sftp.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/s3_to_sftp.html
@@ -624,7 +624,7 @@ For more information about the service visits <a class="reference external" href
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_sftp/index.html#airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator" title="airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToSFTPOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_s3_to_sftp_job</span> <span class="o">=</span> <span class="n">S3ToSFTPOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_s3_to_sftp_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_path</span><span class="o">=</span><span class="s2">&quot;sftp_path&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/salesforce_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/salesforce_to_s3.html
index bff031ba98..369243e533 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/salesforce_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/salesforce_to_s3.html
@@ -622,7 +622,7 @@ to execute a Salesforce query to fetch data and upload to an Amazon S3 bucket.</
 <p>The following example demonstrates a use case of extracting account data from a Salesforce
 instance and upload to an Amazon S3 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">upload_salesforce_data_to_s3</span> <span class="o">=</span> <span class="n">SalesforceToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;upload_salesforce_to_s3&quot;</span><span class="p">,</span>
     <span class="n">salesforce_query</span><span class="o">=</span><span class="s2">&quot;SELECT AccountNumber, Name FROM Account&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/sftp_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/sftp_to_s3.html
index a3aef10ecb..8bdf12dd24 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/sftp_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/sftp_to_s3.html
@@ -623,7 +623,7 @@ For more information about the service visits <a class="reference external" href
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/sftp_to_s3/index.html#airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator" title="airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SFTPToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_sftp_to_s3_job</span> <span class="o">=</span> <span class="n">SFTPToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_sftp_to_s3_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_path</span><span class="o">=</span><span class="s2">&quot;/tmp/sftp_path&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/sql_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/sql_to_s3.html
index 0410b9e7fe..7ab479c36b 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/sql_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.3.0/operators/transfer/sql_to_s3.html
@@ -625,7 +625,7 @@ converts the SQL result to <a class="reference external" href="https://pandas.py
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/sql_to_s3/index.html#airflow.providers.amazon.aws.transfers.sql_to_s3.SqlToS3Operator" title="airflow.providers.amazon.aws.transfers.sql_to_s3.SqlToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SqlToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sql_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_sql_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sql_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.3.0/airflow/providers/amazon/aws/example_dags/example_sql_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_to_s3_task</span> <span class="o">=</span> <span class="n">SqlToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;sql_to_s3_task&quot;</span><span class="p">,</span>
     <span class="n">sql_conn_id</span><span class="o">=</span><span class="s2">&quot;mysql_default&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/athena.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/athena.html
index bacab88ff4..ec91e61847 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/athena.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/athena.html
@@ -627,7 +627,7 @@ to run a query in Amazon Athena.</p>
 an existing Amazon S3 bucket.  For more examples of how to use this operator, please
 see the <a class="reference external" href="https://github.com/apache/airflow/blob/main/airflow/providers/amazon/aws/example_dags/example_athena.py">Sample DAG</a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_athena.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_athena.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">read_table</span> <span class="o">=</span> <span class="n">AthenaOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;read_table&#39;</span><span class="p">,</span>
     <span class="n">query</span><span class="o">=</span><span class="n">QUERY_READ_TABLE</span><span class="p">,</span>
@@ -643,7 +643,7 @@ see the <a class="reference external" href="https://github.com/apache/airflow/bl
 <p>Use the <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/athena/index.html#airflow.providers.amazon.aws.sensors.athena.AthenaSensor" title="airflow.providers.amazon.aws.sensors.athena.AthenaSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">AthenaSensor</span></code></a>
 to wait for the results of a query in Amazon Athena.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_athena.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_athena.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_athena.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_query</span> <span class="o">=</span> <span class="n">AthenaSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;await_query&#39;</span><span class="p">,</span>
     <span class="n">query_execution_id</span><span class="o">=</span><span class="n">read_table</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/batch.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/batch.html
index 926b5e13f3..011001bb94 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/batch.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/batch.html
@@ -622,7 +622,7 @@ infrastructure.</p>
 <p>To wait on the state of an AWS Batch Job until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/batch/index.html#airflow.providers.amazon.aws.sensors.batch.BatchSensor" title="airflow.providers.amazon.aws.sensors.batch.BatchSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">BatchSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_batch.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_batch.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_batch_job</span> <span class="o">=</span> <span class="n">BatchSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_batch_job&#39;</span><span class="p">,</span>
     <span class="n">job_id</span><span class="o">=</span><span class="n">JOB_ID</span><span class="p">,</span>
@@ -636,7 +636,7 @@ use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sen
 <p>To submit a new AWS Batch Job and monitor it until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/batch/index.html#airflow.providers.amazon.aws.operators.batch.BatchOperator" title="airflow.providers.amazon.aws.operators.batch.BatchOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BatchOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_batch.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_batch.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_batch.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_batch_job</span> <span class="o">=</span> <span class="n">BatchOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;submit_batch_job&#39;</span><span class="p">,</span>
     <span class="n">job_name</span><span class="o">=</span><span class="n">JOB_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/cloudformation.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/cloudformation.html
index 34328eff5d..abdaf9ecc2 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/cloudformation.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/cloudformation.html
@@ -624,7 +624,7 @@ create and delete a collection of resources together as a single unit (a stack).
 <p>To create a new AWS CloudFormation stack use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/cloud_formation/index.html#airflow.providers.amazon.aws.operators.cloud_formation.CloudFormationCreateStackOperator" title="airflow.providers.amazon.aws.operators.cloud_formation.CloudFormationCreateStackOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudFormationCreateStackOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_cloudformation.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_cloudformation.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_stack</span> <span class="o">=</span> <span class="n">CloudFormationCreateStackOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_stack&#39;</span><span class="p">,</span>
     <span class="n">stack_name</span><span class="o">=</span><span class="n">CLOUDFORMATION_STACK_NAME</span><span class="p">,</span>
@@ -639,7 +639,7 @@ create and delete a collection of resources together as a single unit (a stack).
 <p>To wait on the state of an AWS CloudFormation stack creation until it reaches a terminal state you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/cloud_formation/index.html#airflow.providers.amazon.aws.sensors.cloud_formation.CloudFormationCreateStackSensor" title="airflow.providers.amazon.aws.sensors.cloud_formation.CloudFormationCreateStackSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudFormationCreateStackSensor</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_cloudformation.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_cloudformation.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_stack_create</span> <span class="o">=</span> <span class="n">CloudFormationCreateStackSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_stack_creation&#39;</span><span class="p">,</span> <span class="n">stack_name</span><span class="o">=</span><span class="n">CLOUDFORMATION_STACK_NAME</span>
 <span class="p">)</span>
@@ -652,7 +652,7 @@ create and delete a collection of resources together as a single unit (a stack).
 <p>To delete an AWS CloudFormation stack you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/cloud_formation/index.html#airflow.providers.amazon.aws.operators.cloud_formation.CloudFormationDeleteStackOperator" title="airflow.providers.amazon.aws.operators.cloud_formation.CloudFormationDeleteStackOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudFormationDeleteStackOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_cloudformation.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_cloudformation.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_stack</span> <span class="o">=</span> <span class="n">CloudFormationDeleteStackOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_stack&#39;</span><span class="p">,</span> <span class="n">stack_name</span><span class="o">=</span><span class="n">CLOUDFORMATION_STACK_NAME</span>
 <span class="p">)</span>
@@ -665,7 +665,7 @@ create and delete a collection of resources together as a single unit (a stack).
 <p>To wait on the state of an AWS CloudFormation stack deletion until it reaches a terminal state you can use
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/cloud_formation/index.html#airflow.providers.amazon.aws.sensors.cloud_formation.CloudFormationDeleteStackSensor" title="airflow.providers.amazon.aws.sensors.cloud_formation.CloudFormationDeleteStackSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudFormationDeleteStackSensor</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_cloudformation.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_cloudformation.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_cloudformation.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_stack_delete</span> <span class="o">=</span> <span class="n">CloudFormationDeleteStackSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_stack_deletion&#39;</span><span class="p">,</span> <span class="n">trigger_rule</span><span class="o">=</span><span class="s1">&#39;all_done&#39;</span><span class="p">,</span> <span class="n">stack_name</span><span class="o">=</span><span class="n">CLOUDFORMATION_STACK_NAME</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/datasync.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/datasync.html
index 72c349dfd6..987287b263 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/datasync.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/datasync.html
@@ -637,7 +637,7 @@ for more details.</p>
 <h3>Execute a task<a class="headerlink" href="#execute-a-task" title="Permalink to this headline">¶</a></h3>
 <p>To execute a specific task, you can pass the <code class="docutils literal notranslate"><span class="pre">task_arn</span></code> to the operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_datasync.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Execute a specific task</span>
 <span class="n">datasync_specific_task</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_specific_task&quot;</span><span class="p">,</span> <span class="n">task_arn</span><span class="o">=</span><span class="n">TASK_ARN</span><span class="p">)</span>
 </pre></div>
@@ -651,7 +651,7 @@ If one task is found, this one will be executed.
 If more than one task is found, the operator will raise an Exception. To avoid this, you can set
 <code class="docutils literal notranslate"><span class="pre">allow_random_task_choice</span></code> to <code class="docutils literal notranslate"><span class="pre">True</span></code> to randomly choose from candidate tasks.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_datasync.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Search and execute a task</span>
 <span class="n">datasync_search_task</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_search_task&quot;</span><span class="p">,</span>
@@ -672,7 +672,7 @@ existing Task was found. If these are left to their default value (None) then no
 <p>Also, because <code class="docutils literal notranslate"><span class="pre">delete_task_after_execution</span></code> is set to <code class="docutils literal notranslate"><span class="pre">True</span></code>, the task will be deleted
 from AWS DataSync after it completes successfully.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_datasync.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_datasync.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_datasync.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create a task (the task does not exist)</span>
 <span class="n">datasync_create_task</span> <span class="o">=</span> <span class="n">DataSyncOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;datasync_create_task&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/dms.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/dms.html
index 6a4a48cd2c..3859e725b0 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/dms.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/dms.html
@@ -648,7 +648,7 @@ to be completed, and then delete it.</p>
 <h3>Defining tasks<a class="headerlink" href="#defining-tasks" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we create a new replication task, start it, wait for it to be completed and then delete it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_dms_full_load_task.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_task</span> <span class="o">=</span> <span class="n">DmsCreateTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_task&#39;</span><span class="p">,</span>
         <span class="n">replication_task_id</span><span class="o">=</span><span class="n">REPLICATION_TASK_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/ec2.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/ec2.html
index aefe9bfc71..00608edb78 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/ec2.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/ec2.html
@@ -623,7 +623,7 @@ computing capacity—literally, servers in Amazon's data centers—that you use
 <p>To start an Amazon EC2 instance you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/ec2/index.html#airflow.providers.amazon.aws.operators.ec2.EC2StartInstanceOperator" title="airflow.providers.amazon.aws.operators.ec2.EC2StartInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EC2StartInstanceOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ec2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_ec2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_instance</span> <span class="o">=</span> <span class="n">EC2StartInstanceOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;ec2_start_instance&quot;</span><span class="p">,</span>
     <span class="n">instance_id</span><span class="o">=</span><span class="n">INSTANCE_ID</span><span class="p">,</span>
@@ -637,7 +637,7 @@ computing capacity—literally, servers in Amazon's data centers—that you use
 <p>To stop an Amazon EC2 instance you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/ec2/index.html#airflow.providers.amazon.aws.operators.ec2.EC2StopInstanceOperator" title="airflow.providers.amazon.aws.operators.ec2.EC2StopInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EC2StopInstanceOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ec2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_ec2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">stop_instance</span> <span class="o">=</span> <span class="n">EC2StopInstanceOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;ec2_stop_instance&quot;</span><span class="p">,</span>
     <span class="n">instance_id</span><span class="o">=</span><span class="n">INSTANCE_ID</span><span class="p">,</span>
@@ -654,7 +654,7 @@ computing capacity—literally, servers in Amazon's data centers—that you use
 <p>To check the state of an Amazon EC2 instance and wait until it reaches the target state you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/ec2/index.html#airflow.providers.amazon.aws.sensors.ec2.EC2InstanceStateSensor" title="airflow.providers.amazon.aws.sensors.ec2.EC2InstanceStateSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EC2InstanceStateSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ec2.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ec2.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_ec2.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">instance_state</span> <span class="o">=</span> <span class="n">EC2InstanceStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;ec2_instance_state&quot;</span><span class="p">,</span>
     <span class="n">instance_id</span><span class="o">=</span><span class="n">INSTANCE_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/ecs.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/ecs.html
index e960b39510..8ef1eb61d2 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/ecs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/ecs.html
@@ -637,7 +637,7 @@ scale containerized applications.</p>
 <li><p>If you have integrated external resources in your ECS Cluster, for example using ECS Anywhere, and want to run your containers on those external resources, set the parameter to EXTERNAL.</p></li>
 </ul>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_ecs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">hello_world</span> <span class="o">=</span> <span class="n">EcsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;hello_world&quot;</span><span class="p">,</span>
     <span class="n">cluster</span><span class="o">=</span><span class="n">os</span><span class="o">.</span><span class="n">environ</span><span class="o">.</span><span class="n">get</span><span class="p">(</span><span class="s2">&quot;CLUSTER_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;existing_cluster_name&quot;</span><span class="p">),</span>
@@ -667,7 +667,7 @@ scale containerized applications.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">hello_world</span> <span class="o">=</span> <span class="n">EcsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;hello_world&quot;</span><span class="p">,</span>
     <span class="n">cluster</span><span class="o">=</span><span class="n">os</span><span class="o">.</span><span class="n">environ</span><span class="o">.</span><span class="n">get</span><span class="p">(</span><span class="s2">&quot;CLUSTER_NAME&quot;</span><span class="p">,</span> <span class="s2">&quot;existing_cluster_name&quot;</span><span class="p">),</span>
@@ -706,7 +706,7 @@ scale containerized applications.</p>
 <h3>CloudWatch Logging<a class="headerlink" href="#cloudwatch-logging" title="Permalink to this headline">¶</a></h3>
 <p>To stream logs to AWS CloudWatch, you need to define these parameters. Using the example Operators above, we would add these additional parameters to enable logging to CloudWatch. You will need to ensure that you have the appropriate level of permissions (see next section)</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_ecs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ecs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_ecs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">awslogs_group</span><span class="o">=</span><span class="s2">&quot;/ecs/hello-world&quot;</span><span class="p">,</span>
     <span class="n">awslogs_region</span><span class="o">=</span><span class="s2">&quot;aws-region&quot;</span><span class="p">,</span>
     <span class="n">awslogs_stream_prefix</span><span class="o">=</span><span class="s2">&quot;ecs/hello-world-container&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/eks.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/eks.html
index 9780b215a3..83bdde9ff6 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/eks.html
@@ -611,7 +611,7 @@ and management of containerized applications.</p>
 <p>To check the state of an Amazon EKS Cluster until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/eks/index.html#airflow.providers.amazon.aws.sensors.eks.EksClusterStateSensor" title="airflow.providers.amazon.aws.sensors.eks.EksClusterStateSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksClusterStateSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_cluster</span> <span class="o">=</span> <span class="n">EksClusterStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -631,7 +631,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS Cluster control plane without attaching compute service.</span>
 <span class="n">create_cluster</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster&#39;</span><span class="p">,</span>
@@ -649,7 +649,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To delete an existing Amazon EKS Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteClusterOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteClusterOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_cluster</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -663,7 +663,7 @@ attempt to delete any attached resources first.</p>
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># An Amazon EKS cluster can not be deleted with attached resources such as nodegroups or Fargate profiles.</span>
 <span class="c1"># Setting the `force` to `True` will delete any attached resources before deleting the cluster.</span>
 <span class="n">delete_all</span> <span class="o">=</span> <span class="n">EksDeleteClusterOperator</span><span class="p">(</span>
@@ -683,7 +683,7 @@ attempt to delete any attached resources first.</p>
 <p>To check the state of an Amazon EKS managed node group until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/eks/index.html#airflow.providers.amazon.aws.sensors.eks.EksNodegroupStateSensor" title="airflow.providers.amazon.aws.sensors.eks.EksNodegroupStateSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksNodegroupStateSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_nodegroup</span> <span class="o">=</span> <span class="n">EksNodegroupStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -705,7 +705,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateNodegroupOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -722,7 +722,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To delete an existing Amazon EKS Managed Nodegroup you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteNodegroupOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteNodegroupOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_nodegroup</span> <span class="o">=</span> <span class="n">EksDeleteNodegroupOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_nodegroup&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -746,7 +746,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroup_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS cluster control plane and an EKS nodegroup compute platform in one step.</span>
 <span class="n">create_cluster_and_nodegroup</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_nodegroup&#39;</span><span class="p">,</span>
@@ -777,7 +777,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_in_one_step.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Create an Amazon EKS cluster control plane and an AWS Fargate compute platform in one step.</span>
 <span class="n">create_cluster_and_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_cluster_and_fargate_profile&#39;</span><span class="p">,</span>
@@ -801,7 +801,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To check the state of an AWS Fargate profile until it reaches the target state or another terminal
 state you can use <code class="xref py py-class docutils literal notranslate"><span class="pre">EksFargateProfileSensor</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_create_fargate_profile</span> <span class="o">=</span> <span class="n">EksFargateProfileStateSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_create_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -823,7 +823,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 </dd>
 </dl>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_fargate_profile</span> <span class="o">=</span> <span class="n">EksCreateFargateProfileOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_eks_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -840,7 +840,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 <p>To delete an existing AWS Fargate Profile you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator" title="airflow.providers.amazon.aws.operators.eks.EksDeleteFargateProfileOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksDeleteFargateProfileOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_fargate_profile.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_fargate_profile</span> <span class="o">=</span> <span class="n">EksDeleteFargateProfileOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_eks_fargate_profile&#39;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
@@ -857,7 +857,7 @@ state you can use <code class="xref py py-class docutils literal notranslate"><s
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/eks/index.html#airflow.providers.amazon.aws.operators.eks.EksPodOperator" title="airflow.providers.amazon.aws.operators.eks.EksPodOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EksPodOperator</span></code></a>.</p>
 <p>Note: An Amazon EKS Cluster with underlying compute infrastructure is required.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_eks_with_nodegroups.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_pod</span> <span class="o">=</span> <span class="n">EksPodOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;run_pod&quot;</span><span class="p">,</span>
     <span class="n">cluster_name</span><span class="o">=</span><span class="n">CLUSTER_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/emr.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/emr.html
index 7b53ede742..0153b660fa 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/emr.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/emr.html
@@ -635,7 +635,7 @@ create a new EMR job flow.  The cluster will be terminated automatically after f
 <h3>JobFlow configuration<a class="headerlink" href="#jobflow-configuration" title="Permalink to this headline">¶</a></h3>
 <p>To create a job flow on EMR, you need to specify the configuration for the EMR cluster:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SPARK_STEPS</span> <span class="o">=</span> <span class="p">[</span>
     <span class="p">{</span>
         <span class="s1">&#39;Name&#39;</span><span class="p">:</span> <span class="s1">&#39;calculate_pi&#39;</span><span class="p">,</span>
@@ -688,7 +688,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <h3>Create the Job Flow<a class="headerlink" href="#create-the-job-flow" title="Permalink to this headline">¶</a></h3>
 <p>In the following code we are creating a new job flow using the configuration as explained above.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_flow_creator</span> <span class="o">=</span> <span class="n">EmrCreateJobFlowOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_job_flow&#39;</span><span class="p">,</span>
     <span class="n">job_flow_overrides</span><span class="o">=</span><span class="n">JOB_FLOW_OVERRIDES</span><span class="p">,</span>
@@ -703,7 +703,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To add Steps to an existing EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/emr/index.html#airflow.providers.amazon.aws.operators.emr.EmrAddStepsOperator" title="airflow.providers.amazon.aws.operators.emr.EmrAddStepsOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrAddStepsOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">step_adder</span> <span class="o">=</span> <span class="n">EmrAddStepsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;add_steps&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -718,7 +718,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To terminate an EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/emr/index.html#airflow.providers.amazon.aws.operators.emr.EmrTerminateJobFlowOperator" title="airflow.providers.amazon.aws.operators.emr.EmrTerminateJobFlowOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrTerminateJobFlowOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">cluster_remover</span> <span class="o">=</span> <span class="n">EmrTerminateJobFlowOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;remove_cluster&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -742,7 +742,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To monitor the state of an EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/emr/index.html#airflow.providers.amazon.aws.sensors.emr.EmrJobFlowSensor" title="airflow.providers.amazon.aws.sensors.emr.EmrJobFlowSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrJobFlowSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_automatic_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_sensor</span> <span class="o">=</span> <span class="n">EmrJobFlowSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;check_job_flow&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">job_flow_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
@@ -756,7 +756,7 @@ you may not see the cluster in the EMR Management Console - you can change this
 <p>To monitor the state of a Step running an existing EMR Job Flow you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/emr/index.html#airflow.providers.amazon.aws.sensors.emr.EmrStepSensor" title="airflow.providers.amazon.aws.sensors.emr.EmrStepSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">EmrStepSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_emr_job_flow_manual_steps.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">step_checker</span> <span class="o">=</span> <span class="n">EmrStepSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;watch_step&#39;</span><span class="p">,</span>
     <span class="n">job_flow_id</span><span class="o">=</span><span class="n">cluster_creator</span><span class="o">.</span><span class="n">output</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/emr_eks.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/emr_eks.html
index b5a55dc173..1174f173f1 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/emr_eks.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/emr_eks.html
@@ -641,7 +641,7 @@ and <code class="docutils literal notranslate"><span class="pre">monitoringConfi
 Refer to the <a class="reference external" href="https://docs.aws.amazon.com/emr/latest/EMR-on-EKS-DevelopmentGuide/emr-eks-jobs-CLI.html#emr-eks-jobs-parameters">EMR on EKS guide</a>
 for more details on job configuration.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">JOB_DRIVER_ARG</span> <span class="o">=</span> <span class="p">{</span>
     <span class="s2">&quot;sparkSubmitJobDriver&quot;</span><span class="p">:</span> <span class="p">{</span>
         <span class="s2">&quot;entryPoint&quot;</span><span class="p">:</span> <span class="s2">&quot;local:///usr/lib/spark/examples/src/main/python/pi.py&quot;</span><span class="p">,</span>
@@ -673,7 +673,7 @@ can store them in a connection or provide them in the DAG. Your AWS region shoul
 in the <code class="docutils literal notranslate"><span class="pre">aws_default</span></code> connection as <code class="docutils literal notranslate"><span class="pre">{&quot;region_name&quot;:</span> <span class="pre">&quot;us-east-1&quot;}</span></code> or a custom connection name
 that gets passed to the operator with the <code class="docutils literal notranslate"><span class="pre">aws_conn_id</span></code> parameter.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_emr_eks_job.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_starter</span> <span class="o">=</span> <span class="n">EmrContainerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_job&quot;</span><span class="p">,</span>
     <span class="n">virtual_cluster_id</span><span class="o">=</span><span class="n">VIRTUAL_CLUSTER_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/glacier.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/glacier.html
index 1e8a97799d..f07a3b730c 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/glacier.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/glacier.html
@@ -620,7 +620,7 @@
 use <code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierCreateJobOperator</span></code></p>
 <p>This Operator returns a dictionary of information related to the initiated job such as <em>jobId</em>, which is required for subsequent tasks.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_glacier_job</span> <span class="o">=</span> <span class="n">GlacierCreateJobOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_glacier_job&quot;</span><span class="p">,</span> <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">)</span>
 <span class="n">JOB_ID</span> <span class="o">=</span> <span class="s1">&#39;{{ task_instance.xcom_pull(&quot;create_glacier_job&quot;)[&quot;jobId&quot;] }}&#39;</span>
 </pre></div>
@@ -632,7 +632,7 @@ use <code class="xref py py-class docutils literal notranslate"><span class="pre
 <p>To wait on the status of an Amazon Glacier Job to reach a terminal state
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glacier/index.html#airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor" title="airflow.providers.amazon.aws.sensors.glacier.GlacierJobOperationSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierJobOperationSensor</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_operation_complete</span> <span class="o">=</span> <span class="n">GlacierJobOperationSensor</span><span class="p">(</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
     <span class="n">job_id</span><span class="o">=</span><span class="n">JOB_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/glue.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/glue.html
index de4abdcf2b..0b06e42799 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/glue.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/glue.html
@@ -623,7 +623,7 @@ your data and putting it to use in minutes instead of months.</p>
 To create a new AWS Glue Crawler or run an existing one you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/glue_crawler/index.html#airflow.providers.amazon.aws.operators.glue_crawler.GlueCrawlerOperator" title="airflow.providers.amazon.aws.operators.glue_crawler.GlueCrawlerOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueCrawlerOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">crawl_s3</span> <span class="o">=</span> <span class="n">GlueCrawlerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;crawl_s3&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="n">GLUE_CRAWLER_CONFIG</span><span class="p">,</span>
@@ -641,7 +641,7 @@ policy. See the References section below for a link to more details.</p>
 <p>To wait on the state of an AWS Glue Crawler execution until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glue_crawler/index.html#airflow.providers.amazon.aws.sensors.glue_crawler.GlueCrawlerSensor" title="airflow.providers.amazon.aws.sensors.glue_crawler.GlueCrawlerSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueCrawlerSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_crawl</span> <span class="o">=</span> <span class="n">GlueCrawlerSensor</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_crawl&#39;</span><span class="p">,</span> <span class="n">crawler_name</span><span class="o">=</span><span class="n">GLUE_CRAWLER_NAME</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -651,7 +651,7 @@ use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sen
 <span id="howto-operator-gluejoboperator"></span><h3>AWS Glue Job Operator<a class="headerlink" href="#aws-glue-job-operator" title="Permalink to this headline">¶</a></h3>
 <p>To submit a new AWS Glue Job you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/glue/index.html#airflow.providers.amazon.aws.operators.glue.GlueJobOperator" title="airflow.providers.amazon.aws.operators.glue.GlueJobOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueJobOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_name</span> <span class="o">=</span> <span class="s1">&#39;example_glue_job&#39;</span>
 <span class="n">submit_glue_job</span> <span class="o">=</span> <span class="n">GlueJobOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;submit_glue_job&#39;</span><span class="p">,</span>
@@ -673,7 +673,7 @@ policies to provide access to the output location for result data.</p>
 <p>To wait on the state of an AWS Glue Job until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/glue/index.html#airflow.providers.amazon.aws.sensors.glue.GlueJobSensor" title="airflow.providers.amazon.aws.sensors.glue.GlueJobSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlueJobSensor</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_glue.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glue.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_glue.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_job</span> <span class="o">=</span> <span class="n">GlueJobSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_job&#39;</span><span class="p">,</span>
     <span class="n">job_name</span><span class="o">=</span><span class="n">job_name</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/lambda.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/lambda.html
index 12ac09dab8..c984c5d9f2 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/lambda.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/lambda.html
@@ -611,7 +611,7 @@ and only pay for what you use.</p>
 <p>To invoke an AWS Lambda function you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/aws_lambda/index.html#airflow.providers.amazon.aws.operators.aws_lambda.AwsLambdaInvokeFunctionOperator" title="airflow.providers.amazon.aws.operators.aws_lambda.AwsLambdaInvokeFunctionOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AwsLambdaInvokeFunctionOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_lambda.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_lambda.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_lambda.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_lambda.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">invoke_lambda_function</span> <span class="o">=</span> <span class="n">AwsLambdaInvokeFunctionOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;setup__invoke_lambda_function&#39;</span><span class="p">,</span>
     <span class="n">function_name</span><span class="o">=</span><span class="n">LAMBDA_FUNCTION_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/quicksight.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/quicksight.html
index 496f9a5a16..ed8f62c184 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/quicksight.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/quicksight.html
@@ -628,7 +628,7 @@
 <p>The QuickSightCreateIngestionOperator creates and starts a new SPICE ingestion for a dataset.
 The operator also refreshes existing SPICE datasets.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_quicksight.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_quicksight.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_quicksight.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_quicksight.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">quicksight_create_ingestion_no_waiting</span> <span class="o">=</span> <span class="n">QuickSightCreateIngestionOperator</span><span class="p">(</span>
     <span class="n">data_set_id</span><span class="o">=</span><span class="n">DATA_SET_ID</span><span class="p">,</span>
     <span class="n">ingestion_id</span><span class="o">=</span><span class="n">INGESTION_NO_WAITING_ID</span><span class="p">,</span>
@@ -643,7 +643,7 @@ The operator also refreshes existing SPICE datasets</p>
 <span id="howto-sensor-quicksightsensor"></span><h3>Amazon QuickSight Sensor<a class="headerlink" href="#amazon-quicksight-sensor" title="Permalink to this headline">¶</a></h3>
 <p>The QuickSightSensor waits for an Amazon QuickSight CreateIngestion until it reaches a terminal state.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_quicksight.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_quicksight.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_quicksight.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_quicksight.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">quicksight_job_status</span> <span class="o">=</span> <span class="n">QuickSightSensor</span><span class="p">(</span>
     <span class="n">data_set_id</span><span class="o">=</span><span class="n">DATA_SET_ID</span><span class="p">,</span>
     <span class="n">ingestion_id</span><span class="o">=</span><span class="n">INGESTION_NO_WAITING_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/rds.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/rds.html
index 07c30ec82c..adff29d356 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/rds.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/rds.html
@@ -613,7 +613,7 @@
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCreateDBSnapshotOperator</span></code>.
 The source DB instance must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> or <code class="docutils literal notranslate"><span class="pre">storage-optimization</span></code> state.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;rds_snapshots&#39;</span><span class="p">,</span> <span class="n">start_date</span><span class="o">=</span><span class="n">datetime</span><span class="p">(</span><span class="mi">2021</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">),</span> <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span [...]
 <span class="p">)</span> <span class="k">as</span> <span class="n">dag</span><span class="p">:</span>
@@ -655,7 +655,7 @@ The source DB instance must be in the <code class="docutils literal notranslate"
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCopyDBSnapshotOperator</span></code>.
 The source DB snapshot must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> state.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">copy_snapshot</span> <span class="o">=</span> <span class="n">RdsCopyDbSnapshotOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;copy_snapshot&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -677,7 +677,7 @@ The source DB snapshot must be in the <code class="docutils literal notranslate"
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSDeleteDBSnapshotOperator</span></code>.
 The DB snapshot must be in the <code class="docutils literal notranslate"><span class="pre">available</span></code> state to be deleted.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_snapshot</span> <span class="o">=</span> <span class="n">RdsDeleteDbSnapshotOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_snapshot&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -698,7 +698,7 @@ The DB snapshot must be in the <code class="docutils literal notranslate"><span
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSStartExportTaskOperator</span></code>.
 The provided IAM role must have access to the S3 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">start_export</span> <span class="o">=</span> <span class="n">RdsStartExportTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;start_export&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
@@ -722,7 +722,7 @@ The provided IAM role must have access to the S3 bucket.</p>
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSCancelExportTaskOperator</span></code>.
 Any data that has already been written to the S3 bucket isn't removed.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">cancel_export</span> <span class="o">=</span> <span class="n">RdsCancelExportTaskOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;cancel_export&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
@@ -744,7 +744,7 @@ To obtain an ARN with SNS, you must create a topic in Amazon SNS and subscribe t
 RDS event notification is only available for unencrypted SNS topics.
 If you specify an encrypted SNS topic, event notifications are not sent for the topic.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">create_subscription</span> <span class="o">=</span> <span class="n">RdsCreateEventSubscriptionOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_subscription&#39;</span><span class="p">,</span>
         <span class="n">subscription_name</span><span class="o">=</span><span class="s1">&#39;my-topic-subscription&#39;</span><span class="p">,</span>
@@ -766,7 +766,7 @@ If you specify an encrypted SNS topic, event notifications are not sent for the
 <p>To delete an event subscription you can use
 <code class="xref py py-class docutils literal notranslate"><span class="pre">RDSDeleteEventSubscriptionOperator</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">delete_subscription</span> <span class="o">=</span> <span class="n">RdsDeleteEventSubscriptionOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_subscription&#39;</span><span class="p">,</span>
         <span class="n">subscription_name</span><span class="o">=</span><span class="s1">&#39;my-topic-subscription&#39;</span><span class="p">,</span>
@@ -794,7 +794,7 @@ If you specify an encrypted SNS topic, event notifications are not sent for the
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/rds/index.html#airflow.providers.amazon.aws.sensors.rds.RdsSnapshotExistenceSensor" title="airflow.providers.amazon.aws.sensors.rds.RdsSnapshotExistenceSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RdsSnapshotExistenceSensor</span></code></a>.
 By default, the sensor waits for the existence of a snapshot with status <code class="docutils literal notranslate"><span class="pre">available</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">snapshot_sensor</span> <span class="o">=</span> <span class="n">RdsSnapshotExistenceSensor</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;snapshot_sensor&#39;</span><span class="p">,</span>
         <span class="n">db_type</span><span class="o">=</span><span class="s1">&#39;instance&#39;</span><span class="p">,</span>
@@ -813,7 +813,7 @@ By default, sensor waits existence of snapshot with status <code class="docutils
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/rds/index.html#airflow.providers.amazon.aws.sensors.rds.RdsExportTaskExistenceSensor" title="airflow.providers.amazon.aws.sensors.rds.RdsExportTaskExistenceSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RdsExportTaskExistenceSensor</span></code></a>.
 By default, the sensor waits for the existence of an export task with status <code class="docutils literal notranslate"><span class="pre">available</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_rds.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_rds.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_rds.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">export_sensor</span> <span class="o">=</span> <span class="n">RdsExportTaskExistenceSensor</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;export_sensor&#39;</span><span class="p">,</span>
         <span class="n">export_task_identifier</span><span class="o">=</span><span class="s1">&#39;export-auth-db-snap-{{ ds }}&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_cluster.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_cluster.html
index 0ec4bfb7ca..dc1ec76401 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_cluster.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_cluster.html
@@ -626,7 +626,7 @@ business and customers.</p>
 <p>To create an Amazon Redshift Cluster with the specified parameters you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_cluster/index.html#airflow.providers.amazon.aws.operators.redshift_cluster.RedshiftCreateClusterOperator" title="airflow.providers.amazon.aws.operators.redshift_cluster.RedshiftCreateClusterOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftCreateClusterOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_create_cluster</span> <span class="o">=</span> <span class="n">RedshiftCreateClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;redshift_create_cluster&quot;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
@@ -644,7 +644,7 @@ business and customers.</p>
 <p>To check the state of an Amazon Redshift Cluster until it reaches the target state or another terminal
 state you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/redshift_cluster/index.html#airflow.providers.amazon.aws.sensors.redshift_cluster.RedshiftClusterSensor" title="airflow.providers.amazon.aws.sensors.redshift_cluster.RedshiftClusterSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftClusterSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_wait_cluster_available</span> <span class="o">=</span> <span class="n">RedshiftClusterSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;sensor_redshift_cluster_available&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
@@ -661,7 +661,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To resume a 'paused' Amazon Redshift Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_cluster/index.html#module-airflow.providers.amazon.aws.operators.redshift_cluster" title="airflow.providers.amazon.aws.operators.redshift_cluster"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftResumeClusterOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_resume_cluster</span> <span class="o">=</span> <span class="n">RedshiftResumeClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_resume_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
@@ -675,7 +675,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To pause an 'available' Amazon Redshift Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_cluster/index.html#module-airflow.providers.amazon.aws.operators.redshift_cluster" title="airflow.providers.amazon.aws.operators.redshift_cluster"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftPauseClusterOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_pause_cluster</span> <span class="o">=</span> <span class="n">RedshiftPauseClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_pause_cluster&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
@@ -689,7 +689,7 @@ state you can use <a class="reference internal" href="../_api/airflow/providers/
 <p>To delete an Amazon Redshift Cluster you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_cluster/index.html#module-airflow.providers.amazon.aws.operators.redshift_cluster" title="airflow.providers.amazon.aws.operators.redshift_cluster"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftDeleteClusterOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_redshift_cluster.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_delete_cluster</span> <span class="o">=</span> <span class="n">RedshiftDeleteClusterOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;delete_cluster&quot;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_data.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_data.html
index a525d76042..e9002e58bc 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_data.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_data.html
@@ -624,7 +624,7 @@ statements against an Amazon Redshift cluster.</p>
 <p>This is a basic example DAG for using <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/redshift_data/index.html#module-airflow.providers.amazon.aws.operators.redshift_data" title="airflow.providers.amazon.aws.operators.redshift_data"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftDataOperator</span></code></a>
 to execute statements against an Amazon Redshift cluster.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_redshift_data_execute_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_query</span> <span class="o">=</span> <span class="n">RedshiftDataOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;redshift_query&#39;</span><span class="p">,</span>
     <span class="n">cluster_identifier</span><span class="o">=</span><span class="n">REDSHIFT_CLUSTER_IDENTIFIER</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_sql.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_sql.html
index ead4369062..6133c96bb5 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_sql.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/redshift_sql.html
@@ -625,7 +625,7 @@ business and customers.</p>
 <div class="section" id="execute-a-sql-query">
 <h3>Execute a SQL query<a class="headerlink" href="#execute-a-sql-query" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_redshift_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_select_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_all_table_data&#39;</span><span class="p">,</span> <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;&quot;&quot;CREATE TABLE more_fruit AS SELECT * FROM fruit;&quot;&quot;&quot;</span>
 <span class="p">)</span>
@@ -638,7 +638,7 @@ business and customers.</p>
 <p>RedshiftSQLOperator supports the <code class="docutils literal notranslate"><span class="pre">parameters</span></code> attribute which allows us to dynamically pass
 parameters into SQL statements.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_redshift_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_select_filtered_data</span> <span class="o">=</span> <span class="n">RedshiftSQLOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;task_get_filtered_table_data&#39;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="s2">&quot;&quot;&quot;CREATE TABLE filtered_fruit AS SELECT * FROM fruit WHERE color = &#39;{{ params.color }}&#39;;&quot;&quot;&quot;</span><span class="p">,</span>
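As an aside from the diff above: the `{{ params.color }}` placeholder in that hunk is rendered by Airflow's Jinja templating before the SQL reaches Redshift. The stand-alone sketch below approximates that substitution in plain Python purely for illustration — `render_sql` is a hypothetical helper, not part of Airflow's API, and it only handles the simple `{{ params.<name> }}` form used in the example DAG.

```python
def render_sql(sql: str, params: dict) -> str:
    """Minimal stand-in for Jinja templating of a SQL string.

    Replaces each "{{ params.<name> }}" placeholder with the
    corresponding value from the params mapping.
    """
    for name, value in params.items():
        sql = sql.replace("{{ params.%s }}" % name, str(value))
    return sql


# Same statement as in the example DAG, rendered with params={'color': 'Red'}
query = render_sql(
    "CREATE TABLE filtered_fruit AS SELECT * FROM fruit "
    "WHERE color = '{{ params.color }}';",
    {"color": "Red"},
)
```

In real Airflow, the full Jinja2 engine does this work and also exposes macros such as `{{ ds }}`; this sketch only illustrates why the operator receives a templated string rather than a finished statement.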
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/s3.html
index 11bd932a04..31fe8fda23 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/s3.html
@@ -636,7 +636,7 @@
 <p>To create an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3CreateBucketOperator" title="airflow.providers.amazon.aws.operators.s3.S3CreateBucketOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3CreateBucketOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_bucket</span> <span class="o">=</span> <span class="n">S3CreateBucketOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_create_bucket&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -650,7 +650,7 @@
 <p>To delete an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3DeleteBucketOperator" title="airflow.providers.amazon.aws.operators.s3.S3DeleteBucketOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3DeleteBucketOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_bucket</span> <span class="o">=</span> <span class="n">S3DeleteBucketOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_delete_bucket&#39;</span><span class="p">,</span> <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span> <span class="n">force_delete</span><span class="o">=</span><span class="kc">True</span>
 <span class="p">)</span>
@@ -663,7 +663,7 @@
 <p>To set the tags for an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3PutBucketTaggingOperator" title="airflow.providers.amazon.aws.operators.s3.S3PutBucketTaggingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3PutBucketTaggingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">put_tagging</span> <span class="o">=</span> <span class="n">S3PutBucketTaggingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_put_bucket_tagging&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -679,7 +679,7 @@
 <p>To get the tag set associated with an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3GetBucketTaggingOperator" title="airflow.providers.amazon.aws.operators.s3.S3GetBucketTaggingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3GetBucketTaggingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_tagging</span> <span class="o">=</span> <span class="n">S3GetBucketTaggingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_get_bucket_tagging&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -693,7 +693,7 @@
 <p>To delete the tags of an Amazon S3 bucket you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3DeleteBucketTaggingOperator" title="airflow.providers.amazon.aws.operators.s3.S3DeleteBucketTaggingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3DeleteBucketTaggingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_tagging</span> <span class="o">=</span> <span class="n">S3DeleteBucketTaggingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;s3_delete_bucket_tagging&#39;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -713,7 +713,7 @@ API if <code class="docutils literal notranslate"><span class="pre">wildcard_mat
 Please keep in mind, especially when used to check a large volume of keys, that it makes one API call per key.</p>
 <p>To check one file:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Check if a file exists</span>
 <span class="n">sensor_one_key</span> <span class="o">=</span> <span class="n">S3KeySensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_sensor_one_key&quot;</span><span class="p">,</span>
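Aside: the wildcard matching mentioned at the top of this hunk follows Unix shell glob rules. The stand-alone snippet below illustrates that rule with the standard library's `fnmatch` — the key names are invented for the example, and this is only an illustration of the matching semantics, not the sensor's actual listing logic.

```python
from fnmatch import fnmatch

# Hypothetical object keys, as a bucket listing might return them.
keys = ["data/2022/file1.csv", "data/2022/file2.csv", "logs/run.txt"]

# Unix-style glob matching: '*' and '?' wildcards, as in shell globs.
matched = [k for k in keys if fnmatch(k, "data/*/*.csv")]
```

Because each matched key is then checked individually, a broad wildcard over a large bucket translates into many API calls — the caveat the documentation text raises.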
@@ -725,7 +725,7 @@ Please keep in mind, especially when used to check a large volume of keys, that
 </div>
 <p>To check multiple files:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Check if both files exist</span>
 <span class="n">sensor_two_keys</span> <span class="o">=</span> <span class="n">S3KeySensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_sensor_two_keys&quot;</span><span class="p">,</span>
@@ -748,7 +748,7 @@ multiple files can match one key. The list of matched S3 object attributes conta
 </pre></div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">check_fn</span><span class="p">(</span><span class="n">files</span><span class="p">:</span> <span class="n">List</span><span class="p">)</span> <span class="o">-&gt;</span> <span class="nb">bool</span><span class="p">:</span>
     <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd">    Example of custom check: check if all files are bigger than 1kB</span>
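The hunk above shows only the signature and docstring of the custom `check_fn`. A plausible completion of that check (all matched objects bigger than 1 kB) can be exercised without any AWS access, since it only inspects the `Size` attribute of the object dicts the sensor passes in — the body below is a sketch consistent with the docstring, not necessarily the exact code in the example DAG.

```python
from typing import List


def check_fn(files: List) -> bool:
    """Example of custom check: check if all files are bigger than 1kB."""
    # Each entry is a dict of S3 object attributes; 'Size' is in bytes.
    return all(f.get("Size", 0) > 1024 for f in files)


# Objects shaped like the sensor's matched-attribute dicts.
assert check_fn([{"Size": 2048}, {"Size": 4096}])
assert not check_fn([{"Size": 2048}, {"Size": 512}])
```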
@@ -766,7 +766,7 @@ multiple files can match one key. The list of matched S3 object attributes conta
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Check if a file exists and match a certain pattern defined in check_fn</span>
 <span class="n">sensor_key_with_function</span> <span class="o">=</span> <span class="n">S3KeySensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_sensor_key_function&quot;</span><span class="p">,</span>
@@ -786,7 +786,7 @@ the inactivity period has passed with no increase in the number of objects you c
 Note, this sensor will not behave correctly in reschedule mode,
 as the state of the listed objects in the Amazon S3 bucket will be lost between rescheduled invocations.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sensor_keys_unchanged</span> <span class="o">=</span> <span class="n">S3KeysUnchangedSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_sensor_one_key_size&quot;</span><span class="p">,</span>
     <span class="n">bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME_2</span><span class="p">,</span>
@@ -802,7 +802,7 @@ as the state of the listed objects in the Amazon S3 bucket will be lost between
 <p>To create a new (or replace) Amazon S3 object you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3CreateObjectOperator" title="airflow.providers.amazon.aws.operators.s3.S3CreateObjectOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3CreateObjectOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_object</span> <span class="o">=</span> <span class="n">S3CreateObjectOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_create_object&quot;</span><span class="p">,</span>
     <span class="n">s3_bucket</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -821,7 +821,7 @@ as the state of the listed objects in the Amazon S3 bucket will be lost between
 See <a class="reference external" href="https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-prefixes.html">here</a>
 for more information about Amazon S3 prefixes.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">list_prefixes</span> <span class="o">=</span> <span class="n">S3ListPrefixesOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_list_prefix_operator&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -838,7 +838,7 @@ for more information about Amazon S3 prefixes.</p>
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3ListOperator" title="airflow.providers.amazon.aws.operators.s3.S3ListOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ListOperator</span></code></a>.
 You can specify a <code class="docutils literal notranslate"><span class="pre">prefix</span></code> to filter the objects whose names begin with that prefix.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">list_keys</span> <span class="o">=</span> <span class="n">S3ListOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_list_operator&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -854,7 +854,7 @@ You can specify a <code class="docutils literal notranslate"><span class="pre">p
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3CopyObjectOperator" title="airflow.providers.amazon.aws.operators.s3.S3CopyObjectOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3CopyObjectOperator</span></code></a>.
 The Amazon S3 connection used here needs to have access to both source and destination bucket/key.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">copy_object</span> <span class="o">=</span> <span class="n">S3CopyObjectOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_copy_object&quot;</span><span class="p">,</span>
     <span class="n">source_bucket_name</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -871,7 +871,7 @@ The Amazon S3 connection used here needs to have access to both source and desti
 <p>To delete one or multiple Amazon S3 objects you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/s3/index.html#airflow.providers.amazon.aws.operators.s3.S3DeleteObjectsOperator" title="airflow.providers.amazon.aws.operators.s3.S3DeleteObjectsOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3DeleteObjectsOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_objects</span> <span class="o">=</span> <span class="n">S3DeleteObjectsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_delete_objects&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">BUCKET_NAME_2</span><span class="p">,</span>
@@ -888,7 +888,7 @@ The Amazon S3 connection used here needs to have access to both source and desti
 You can also apply an optional <a class="reference external" href="https://docs.aws.amazon.com/AmazonS3/latest/userguide/s3-glacier-select-sql-reference-select.html">Amazon S3 Select expression</a>
 to select the data you want to retrieve from <code class="docutils literal notranslate"><span class="pre">source_s3_key</span></code> using <code class="docutils literal notranslate"><span class="pre">select_expression</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transforms_file</span> <span class="o">=</span> <span class="n">S3FileTransformOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;s3_file_transform&quot;</span><span class="p">,</span>
     <span class="n">source_s3_key</span><span class="o">=</span><span class="sa">f</span><span class="s1">&#39;s3://</span><span class="si">{</span><span class="n">BUCKET_NAME</span><span class="si">}</span><span class="s1">/</span><span class="si">{</span><span class="n">KEY</span><span class="si">}</span><span class="s1">&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sagemaker.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sagemaker.html
index d980e7c37f..19da4c7245 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sagemaker.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sagemaker.html
@@ -625,7 +625,7 @@ production-ready hosted environment.</p>
 <p>To create an Amazon Sagemaker processing job to sanitize your dataset you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sagemaker/index.html#airflow.providers.amazon.aws.operators.sagemaker.SageMakerProcessingOperator" title="airflow.providers.amazon.aws.operators.sagemaker.SageMakerProcessingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerProcessingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">preprocess_raw_data</span> <span class="o">=</span> <span class="n">SageMakerProcessingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;preprocess_raw_data&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="n">SAGEMAKER_PROCESSING_JOB_CONFIG</span><span class="p">,</span>
@@ -640,7 +640,7 @@ production-ready hosted environment.</p>
 <p>To create an Amazon Sagemaker training job you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sagemaker/index.html#airflow.providers.amazon.aws.operators.sagemaker.SageMakerTrainingOperator" title="airflow.providers.amazon.aws.operators.sagemaker.SageMakerTrainingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerTrainingOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">train_model</span> <span class="o">=</span> <span class="n">SageMakerTrainingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;train_model&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="n">TRAINING_CONFIG</span><span class="p">,</span>
@@ -657,7 +657,7 @@ production-ready hosted environment.</p>
 <p>To create an Amazon Sagemaker model you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sagemaker/index.html#airflow.providers.amazon.aws.operators.sagemaker.SageMakerModelOperator" title="airflow.providers.amazon.aws.operators.sagemaker.SageMakerModelOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerModelOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_model</span> <span class="o">=</span> <span class="n">SageMakerModelOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;create_model&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="n">MODEL_CONFIG</span><span class="p">,</span>
@@ -672,7 +672,7 @@ production-ready hosted environment.</p>
 <p>To start a hyperparameter tuning job for an Amazon Sagemaker model you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sagemaker/index.html#airflow.providers.amazon.aws.operators.sagemaker.SageMakerTuningOperator" title="airflow.providers.amazon.aws.operators.sagemaker.SageMakerTuningOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerTuningOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">tune_model</span> <span class="o">=</span> <span class="n">SageMakerTuningOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;tune_model&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="n">TUNING_CONFIG</span><span class="p">,</span>
@@ -689,7 +689,7 @@ production-ready hosted environment.</p>
 <p>To delete an Amazon Sagemaker model you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sagemaker/index.html#airflow.providers.amazon.aws.operators.sagemaker.SageMakerDeleteModelOperator" title="airflow.providers.amazon.aws.operators.sagemaker.SageMakerDeleteModelOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerDeleteModelOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_model</span> <span class="o">=</span> <span class="n">SageMakerDeleteModelOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_model&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="p">{</span><span class="s1">&#39;ModelName&#39;</span><span class="p">:</span> <span class="n">MODEL_NAME</span><span class="p">},</span>
@@ -704,7 +704,7 @@ production-ready hosted environment.</p>
 <p>To create an Amazon Sagemaker transform job you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sagemaker/index.html#airflow.providers.amazon.aws.operators.sagemaker.SageMakerTransformOperator" title="airflow.providers.amazon.aws.operators.sagemaker.SageMakerTransformOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerTransformOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">test_model</span> <span class="o">=</span> <span class="n">SageMakerTransformOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;test_model&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="n">TRANSFORM_CONFIG</span><span class="p">,</span>
@@ -721,7 +721,7 @@ production-ready hosted environment.</p>
 <p>To create an Amazon Sagemaker endpoint config job you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sagemaker/index.html#airflow.providers.amazon.aws.operators.sagemaker.SageMakerEndpointConfigOperator" title="airflow.providers.amazon.aws.operators.sagemaker.SageMakerEndpointConfigOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerEndpointConfigOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">configure_endpoint</span> <span class="o">=</span> <span class="n">SageMakerEndpointConfigOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;configure_endpoint&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="n">ENDPOINT_CONFIG_CONFIG</span><span class="p">,</span>
@@ -736,7 +736,7 @@ production-ready hosted environment.</p>
 <p>To create an Amazon Sagemaker endpoint you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sagemaker/index.html#airflow.providers.amazon.aws.operators.sagemaker.SageMakerEndpointOperator" title="airflow.providers.amazon.aws.operators.sagemaker.SageMakerEndpointOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerEndpointOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">deploy_endpoint</span> <span class="o">=</span> <span class="n">SageMakerEndpointOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;deploy_endpoint&#39;</span><span class="p">,</span>
     <span class="n">config</span><span class="o">=</span><span class="n">DEPLOY_ENDPOINT_CONFIG</span><span class="p">,</span>
@@ -756,7 +756,7 @@ production-ready hosted environment.</p>
 <p>To check the state of an Amazon Sagemaker training job until it reaches a terminal state
 you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/sagemaker/index.html#airflow.providers.amazon.aws.sensors.sagemaker.SageMakerTrainingSensor" title="airflow.providers.amazon.aws.sensors.sagemaker.SageMakerTrainingSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerTrainingSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_training</span> <span class="o">=</span> <span class="n">SageMakerTrainingSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;await_training&#39;</span><span class="p">,</span>
     <span class="n">job_name</span><span class="o">=</span><span class="n">TRAINING_JOB_NAME</span><span class="p">,</span>
@@ -770,7 +770,7 @@ you can use <a class="reference internal" href="../_api/airflow/providers/amazon
 <p>To check the state of an Amazon Sagemaker transform job until it reaches a terminal state
 you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sagemaker/index.html#airflow.providers.amazon.aws.operators.sagemaker.SageMakerTransformOperator" title="airflow.providers.amazon.aws.operators.sagemaker.SageMakerTransformOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerTransformOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_transform</span> <span class="o">=</span> <span class="n">SageMakerTransformSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;await_transform&#39;</span><span class="p">,</span>
     <span class="n">job_name</span><span class="o">=</span><span class="n">TRANSFORM_JOB_NAME</span><span class="p">,</span>
@@ -784,7 +784,7 @@ you can use <a class="reference internal" href="../_api/airflow/providers/amazon
 <p>To check the state of an Amazon Sagemaker hyperparameter tuning job until it reaches a terminal state
 you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/sagemaker/index.html#airflow.providers.amazon.aws.sensors.sagemaker.SageMakerTuningSensor" title="airflow.providers.amazon.aws.sensors.sagemaker.SageMakerTuningSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerTuningSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_tune</span> <span class="o">=</span> <span class="n">SageMakerTuningSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;await_tuning&#39;</span><span class="p">,</span>
     <span class="n">job_name</span><span class="o">=</span><span class="n">TUNING_JOB_NAME</span><span class="p">,</span>
@@ -798,7 +798,7 @@ you can use <a class="reference internal" href="../_api/airflow/providers/amazon
 <p>To check the state of an Amazon Sagemaker hyperparameter tuning job until it reaches a terminal state
 you can use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/sagemaker/index.html#airflow.providers.amazon.aws.sensors.sagemaker.SageMakerEndpointSensor" title="airflow.providers.amazon.aws.sensors.sagemaker.SageMakerEndpointSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">SageMakerEndpointSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sagemaker_endpoint.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">await_endpoint</span> <span class="o">=</span> <span class="n">SageMakerEndpointSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;await_endpoint&#39;</span><span class="p">,</span>
     <span class="n">endpoint_name</span><span class="o">=</span><span class="n">ENDPOINT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sns.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sns.html
index ff16473df9..8442aae823 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sns.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sns.html
@@ -612,7 +612,7 @@ messages (SMS).</p>
 <p>To publish a message to an Amazon SNS Topic you can use
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sns/index.html#airflow.providers.amazon.aws.operators.sns.SnsPublishOperator" title="airflow.providers.amazon.aws.operators.sns.SnsPublishOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SnsPublishOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sns.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sns.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sns.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sns.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">publish</span> <span class="o">=</span> <span class="n">SnsPublishOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;publish_message&#39;</span><span class="p">,</span>
     <span class="n">target_arn</span><span class="o">=</span><span class="n">SNS_TOPIC_ARN</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sqs.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sqs.html
index 0bf192b034..2db2136f09 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sqs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/sqs.html
@@ -626,7 +626,7 @@ or requiring other services to be available.</p>
 <p>In the following example, the task &quot;publish_to_queue&quot; publishes a message containing
 the task instance and the execution date to a queue with a default name of <code class="docutils literal notranslate"><span class="pre">Airflow-Example-Queue</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sqs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sqs.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">publish_to_queue</span> <span class="o">=</span> <span class="n">SqsPublishOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;publish_to_queue&#39;</span><span class="p">,</span>
         <span class="n">sqs_queue</span><span class="o">=</span><span class="n">create_queue</span><span class="p">,</span>
@@ -641,7 +641,7 @@ the task instance and the execution date to a queue with a default name of <code
 <p>To read messages from an Amazon SQS queue until exhausted use the
 <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/sqs/index.html#airflow.providers.amazon.aws.operators.sqs.SqsPublishOperator" title="airflow.providers.amazon.aws.operators.sqs.SqsPublishOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SqsPublishOperator</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_sqs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sqs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sqs.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">read_from_queue</span> <span class="o">=</span> <span class="n">SqsSensor</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;read_from_queue&#39;</span><span class="p">,</span>
         <span class="n">sqs_queue</span><span class="o">=</span><span class="n">create_queue</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/step_functions.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/step_functions.html
index 227f3f0b6c..0b68df3e81 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/step_functions.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/step_functions.html
@@ -621,7 +621,7 @@ machines to execute the steps of your application in a reliable and scalable fas
 <p>To start a new AWS Step Functions State Machine execution
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/step_function/index.html#airflow.providers.amazon.aws.operators.step_function.StepFunctionStartExecutionOperator" title="airflow.providers.amazon.aws.operators.step_function.StepFunctionStartExecutionOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">StepFunctionStartExecutionOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_step_functions.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_step_functions.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_step_functions.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_step_functions.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_execution</span> <span class="o">=</span> <span class="n">StepFunctionStartExecutionOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;start_execution&#39;</span><span class="p">,</span> <span class="n">state_machine_arn</span><span class="o">=</span><span class="n">STEP_FUNCTIONS_STATE_MACHINE_ARN</span>
 <span class="p">)</span>
@@ -634,7 +634,7 @@ use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/ope
 <p>To wait on the state of an AWS Step Function State Machine execution until it reaches a terminal state you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sensors/step_function/index.html#airflow.providers.amazon.aws.sensors.step_function.StepFunctionExecutionSensor" title="airflow.providers.amazon.aws.sensors.step_function.StepFunctionExecutionSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">StepFunctionExecutionSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_step_functions.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_step_functions.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_step_functions.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_step_functions.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_execution</span> <span class="o">=</span> <span class="n">StepFunctionExecutionSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;wait_for_execution&#39;</span><span class="p">,</span> <span class="n">execution_arn</span><span class="o">=</span><span class="n">start_execution</span><span class="o">.</span><span class="n">output</span>
 <span class="p">)</span>
@@ -647,7 +647,7 @@ use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/sen
 <p>To fetch the output from an AWS Step Function State Machine execution you can
 use <a class="reference internal" href="../_api/airflow/providers/amazon/aws/operators/step_function/index.html#airflow.providers.amazon.aws.operators.step_function.StepFunctionGetExecutionOutputOperator" title="airflow.providers.amazon.aws.operators.step_function.StepFunctionGetExecutionOutputOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">StepFunctionGetExecutionOutputOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_step_functions.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/amazon/aws/example_dags/example_step_functions.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_step_functions.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_step_functions.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_execution_output</span> <span class="o">=</span> <span class="n">StepFunctionGetExecutionOutputOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;get_execution_output&#39;</span><span class="p">,</span> <span class="n">execution_arn</span><span class="o">=</span><span class="n">start_execution</span><span class="o">.</span><span class="n">output</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/dynamodb_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/dynamodb_to_s3.html
index f79d69d968..22a0648d89 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/dynamodb_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/dynamodb_to_s3.html
@@ -630,7 +630,7 @@ records that satisfy the criteria.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/dynamodb_to_s3/index.html#airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBToS3Operator" title="airflow.providers.amazon.aws.transfers.dynamodb_to_s3.DynamoDBToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DynamoDBToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">backup_db</span> <span class="o">=</span> <span class="n">DynamoDBToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;backup_db&#39;</span><span class="p">,</span>
     <span class="n">dynamodb_table_name</span><span class="o">=</span><span class="n">TABLE_NAME</span><span class="p">,</span>
@@ -644,7 +644,7 @@ records that satisfy the criteria.</p>
 <p>To parallelize the replication, users can create multiple DynamoDBToS3Operator tasks using the
 <code class="docutils literal notranslate"><span class="pre">TotalSegments</span></code> parameter.  For instance to replicate with parallelism of 2, create two tasks:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_dynamodb_to_s3_segmented.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Segmenting allows the transfer to be parallelized into {segment} number of parallel tasks.</span>
 <span class="n">backup_db_segment_1</span> <span class="o">=</span> <span class="n">DynamoDBToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;backup-1&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/ftp_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/ftp_to_s3.html
index 2e19e4d23e..95b56aae34 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/ftp_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/ftp_to_s3.html
@@ -624,7 +624,7 @@
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/ftp_to_s3/index.html#airflow.providers.amazon.aws.transfers.ftp_to_s3.FTPToS3Operator" title="airflow.providers.amazon.aws.transfers.ftp_to_s3.FTPToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">FTPToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_ftp_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_ftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_ftp_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">ftp_to_s3_task</span> <span class="o">=</span> <span class="n">FTPToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;ftp_to_s3_task&quot;</span><span class="p">,</span>
     <span class="n">ftp_path</span><span class="o">=</span><span class="s2">&quot;/tmp/ftp_path&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/glacier_to_gcs.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/glacier_to_gcs.html
index b864426f70..192587c3d8 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/glacier_to_gcs.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/glacier_to_gcs.html
@@ -621,7 +621,7 @@
 <p>To transfer data from an Amazon Glacier vault to Google Cloud Storage.
 use <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/glacier_to_gcs/index.html#airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator" title="airflow.providers.amazon.aws.transfers.glacier_to_gcs.GlacierToGCSOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GlacierToGCSOperator</span></code></a></p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_glacier_to_gcs.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">transfer_archive_to_gcs</span> <span class="o">=</span> <span class="n">GlacierToGCSOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;transfer_archive_to_gcs&quot;</span><span class="p">,</span>
     <span class="n">vault_name</span><span class="o">=</span><span class="n">VAULT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/google_api_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/google_api_to_s3.html
index 3bed98e9c4..c583017995 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/google_api_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/google_api_to_s3.html
@@ -622,7 +622,7 @@ on Amazon S3.</p>
 <span id="howto-operator-googleapitos3transfer"></span><h2>Google Sheets to Amazon S3<a class="headerlink" href="#google-sheets-to-amazon-s3" title="Permalink to this headline">¶</a></h2>
 <p>This example loads data from Google Sheets and save it to an Amazon S3 file.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_sheets_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_sheets_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_sheets_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_google_api_sheets_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_google_sheets_values_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;google_sheet_data_to_s3&#39;</span><span class="p">,</span>
     <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;sheets&#39;</span><span class="p">,</span>
@@ -647,7 +647,7 @@ tasks to retrieve specific information about YouTube videos.</p>
 (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_AFTER</span></code>, <code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_PUBLISHED_BEFORE</span></code>) on a YouTube channel (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_CHANNEL_ID</span></code>)
 saves the response in Amazon S3 and also pushes the data to xcom.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_video_ids_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;video_ids_to_s3&#39;</span><span class="p">,</span>
     <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
@@ -672,7 +672,7 @@ saves the response in Amazon S3 and also pushes the data to xcom.</p>
 <p>It passes over the YouTube IDs to the next request which then gets the
 information (<code class="docutils literal notranslate"><span class="pre">YOUTUBE_VIDEO_FIELDS</span></code>) for the requested videos and saves them in Amazon S3 (<code class="docutils literal notranslate"><span class="pre">S3_BUCKET_NAME</span></code>).</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_google_api_youtube_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_video_data_to_s3</span> <span class="o">=</span> <span class="n">GoogleApiToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;video_data_to_s3&#39;</span><span class="p">,</span>
     <span class="n">google_api_service_name</span><span class="o">=</span><span class="s1">&#39;youtube&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/hive_to_dynamodb.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/hive_to_dynamodb.html
index 0cd853f768..e93f37f86e 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/hive_to_dynamodb.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/hive_to_dynamodb.html
@@ -627,7 +627,7 @@ to use as filtering criteria.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/hive_to_dynamodb/index.html#airflow.providers.amazon.aws.transfers.hive_to_dynamodb.HiveToDynamoDBOperator" title="airflow.providers.amazon.aws.transfers.hive_to_dynamodb.HiveToDynamoDBOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">HiveToDynamoDBOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_hive_to_dynamodb.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_hive_to_dynamodb.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_hive_to_dynamodb.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_hive_to_dynamodb.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">backup_to_dynamodb</span> <span class="o">=</span> <span class="n">HiveToDynamoDBOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;backup_to_dynamodb&#39;</span><span class="p">,</span>
     <span class="n">hiveserver2_conn_id</span><span class="o">=</span><span class="n">HIVE_CONNECTION_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/imap_attachment_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/imap_attachment_to_s3.html
index 50bbe61017..1009b3ef9c 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/imap_attachment_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/imap_attachment_to_s3.html
@@ -621,7 +621,7 @@ protocol from a mail server to an Amazon S3 Bucket.</p>
 <div class="section" id="imap-attachment-to-amazon-s3">
 <span id="howto-operator-imapattachmenttos3operator"></span><h2>Imap Attachment To Amazon S3<a class="headerlink" href="#imap-attachment-to-amazon-s3" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_imap_attachment_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_imap_attachment_to_s3</span> <span class="o">=</span> <span class="n">ImapAttachmentToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;transfer_imap_attachment_to_s3&#39;</span><span class="p">,</span>
     <span class="n">imap_attachment_name</span><span class="o">=</span><span class="n">IMAP_ATTACHMENT_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/local_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/local_to_s3.html
index f6cbed1e95..3bb2dc577f 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/local_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/local_to_s3.html
@@ -625,7 +625,7 @@ to an Amazon Simple Storage Service (S3) file.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/local_to_s3/index.html#airflow.providers.amazon.aws.transfers.local_to_s3.LocalFilesystemToS3Operator" title="airflow.providers.amazon.aws.transfers.local_to_s3.LocalFilesystemToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">LocalFilesystemToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_local_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_local_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_local_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_local_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_local_to_s3_job</span> <span class="o">=</span> <span class="n">LocalFilesystemToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_local_to_s3_job&quot;</span><span class="p">,</span>
     <span class="n">filename</span><span class="o">=</span><span class="s2">&quot;relative/path/to/file.csv&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/mongo_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/mongo_to_s3.html
index 9dbd05f83a..acf94e06a9 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/mongo_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/mongo_to_s3.html
@@ -625,7 +625,7 @@ In order to select the data you want to copy, you need to use the <code class="d
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/mongo_to_s3/index.html#airflow.providers.amazon.aws.transfers.mongo_to_s3.MongoToS3Operator" title="airflow.providers.amazon.aws.transfers.mongo_to_s3.MongoToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">MongoToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_mongo_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_mongo_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_mongo_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_mongo_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_local_to_s3_job</span> <span class="o">=</span> <span class="n">MongoToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_mongo_to_s3_job&quot;</span><span class="p">,</span>
     <span class="n">mongo_collection</span><span class="o">=</span><span class="n">MONGO_COLLECTION</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/redshift_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/redshift_to_s3.html
index efe8fa45ed..a9504f00a7 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/redshift_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/redshift_to_s3.html
@@ -625,7 +625,7 @@ Service (S3) file.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/redshift_to_s3/index.html#airflow.providers.amazon.aws.transfers.redshift_to_s3.RedshiftToS3Operator" title="airflow.providers.amazon.aws.transfers.redshift_to_s3.RedshiftToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">RedshiftToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_redshift_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_redshift_to_s3</span> <span class="o">=</span> <span class="n">RedshiftToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;transfer_redshift_to_s3&#39;</span><span class="p">,</span>
     <span class="n">s3_bucket</span><span class="o">=</span><span class="n">S3_BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_ftp.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_ftp.html
index 0341aadb9a..75b20f524b 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_ftp.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_ftp.html
@@ -625,7 +625,7 @@ using FTP protocol.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_ftp/index.html#airflow.providers.amazon.aws.transfers.s3_to_ftp.S3ToFTPOperator" title="airflow.providers.amazon.aws.transfers.s3_to_ftp.S3ToFTPOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToFTPOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_ftp.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_ftp.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_ftp.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3_to_ftp.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">s3_to_ftp_task</span> <span class="o">=</span> <span class="n">S3ToFTPOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;ftp_to_s3_task&quot;</span><span class="p">,</span>
     <span class="n">ftp_path</span><span class="o">=</span><span class="s2">&quot;/tmp/ftp_path&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_redshift.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_redshift.html
index fe2cdae942..79081cdba7 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_redshift.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_redshift.html
@@ -625,7 +625,7 @@ Amazon Redshift table.</p>
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_redshift/index.html#airflow.providers.amazon.aws.transfers.s3_to_redshift.S3ToRedshiftOperator" title="airflow.providers.amazon.aws.transfers.s3_to_redshift.S3ToRedshiftOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToRedshiftOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3_to_redshift.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">task_transfer_s3_to_redshift</span> <span class="o">=</span> <span class="n">S3ToRedshiftOperator</span><span class="p">(</span>
     <span class="n">s3_bucket</span><span class="o">=</span><span class="n">S3_BUCKET_NAME</span><span class="p">,</span>
     <span class="n">s3_key</span><span class="o">=</span><span class="n">S3_KEY</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_sftp.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_sftp.html
index 08cf45d6c6..06169acd12 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_sftp.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/s3_to_sftp.html
@@ -626,7 +626,7 @@ For more information about the service visits <a class="reference external" href
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/s3_to_sftp/index.html#airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator" title="airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">S3ToSFTPOperator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_s3_to_sftp.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_s3_to_sftp_job</span> <span class="o">=</span> <span class="n">S3ToSFTPOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_s3_to_sftp_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_path</span><span class="o">=</span><span class="s2">&quot;sftp_path&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/salesforce_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/salesforce_to_s3.html
index e98c8816f6..ffce89ddc8 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/salesforce_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/salesforce_to_s3.html
@@ -624,7 +624,7 @@ to execute a Salesforce query to fetch data and upload to an Amazon S3 bucket.</
 <p>The following example demonstrates a use case of extracting account data from a Salesforce
 instance and upload to an Amazon S3 bucket.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_salesforce_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">upload_salesforce_data_to_s3</span> <span class="o">=</span> <span class="n">SalesforceToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;upload_salesforce_to_s3&quot;</span><span class="p">,</span>
     <span class="n">salesforce_query</span><span class="o">=</span><span class="s2">&quot;SELECT AccountNumber, Name FROM Account&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/sftp_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/sftp_to_s3.html
index cad9bfd83a..cca5e1c70c 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/sftp_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/sftp_to_s3.html
@@ -625,7 +625,7 @@ For more information about the service visits <a class="reference external" href
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/sftp_to_s3/index.html#airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator" title="airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SFTPToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sftp_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_sftp_to_s3_job</span> <span class="o">=</span> <span class="n">SFTPToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_sftp_to_s3_job&quot;</span><span class="p">,</span>
     <span class="n">sftp_path</span><span class="o">=</span><span class="s2">&quot;/tmp/sftp_path&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/sql_to_s3.html b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/sql_to_s3.html
index 3198ac5501..4dd743bf5c 100644
--- a/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/sql_to_s3.html
+++ b/docs-archive/apache-airflow-providers-amazon/3.4.0/operators/transfer/sql_to_s3.html
@@ -627,7 +627,7 @@ converts the SQL result to <a class="reference external" href="https://pandas.py
 <a class="reference internal" href="../../_api/airflow/providers/amazon/aws/transfers/sql_to_s3/index.html#airflow.providers.amazon.aws.transfers.sql_to_s3.SqlToS3Operator" title="airflow.providers.amazon.aws.transfers.sql_to_s3.SqlToS3Operator"><code class="xref py py-class docutils literal notranslate"><span class="pre">SqlToS3Operator</span></code></a></p>
 <p>Example usage:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sql_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/amazon/aws/example_dags/example_sql_to_s3.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/amazon/aws/example_dags/example_sql_to_s3.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-amazon/3.4.0/airflow/providers/amazon/aws/example_dags/example_sql_to_s3.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_to_s3_task</span> <span class="o">=</span> <span class="n">SqlToS3Operator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;sql_to_s3_task&quot;</span><span class="p">,</span>
     <span class="n">sql_conn_id</span><span class="o">=</span><span class="s2">&quot;mysql_default&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-beam/1.0.0/operators.html b/docs-archive/apache-airflow-providers-apache-beam/1.0.0/operators.html
index 7015bdd3d1..9c19144048 100644
--- a/docs-archive/apache-airflow-providers-apache-beam/1.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-beam/1.0.0/operators.html
@@ -602,7 +602,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-directrunner">
 <h2>Python Pipelines with DirectRunner<a class="headerlink" href="#python-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="s1">&#39;apache_beam.examples.wordcount&#39;</span><span class="p">,</span>
@@ -615,7 +615,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="n">GCS_PYTHON</span><span class="p">,</span>
@@ -632,7 +632,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-dataflowrunner">
 <h2>Python Pipelines with DataflowRunner<a class="headerlink" href="#python-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -654,7 +654,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -699,7 +699,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-directrunner">
 <h2>Java Pipelines with DirectRunner<a class="headerlink" href="#java-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_direct_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DIRECT_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -725,7 +725,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-dataflowrunner">
 <h2>Java Pipelines with DataflowRunner<a class="headerlink" href="#java-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_dataflow_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DATAFLOW_RUNNER_BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-beam/1.0.1/operators.html b/docs-archive/apache-airflow-providers-apache-beam/1.0.1/operators.html
index 7015bdd3d1..0790aec3ca 100644
--- a/docs-archive/apache-airflow-providers-apache-beam/1.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-beam/1.0.1/operators.html
@@ -602,7 +602,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-directrunner">
 <h2>Python Pipelines with DirectRunner<a class="headerlink" href="#python-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="s1">&#39;apache_beam.examples.wordcount&#39;</span><span class="p">,</span>
@@ -615,7 +615,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="n">GCS_PYTHON</span><span class="p">,</span>
@@ -632,7 +632,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-dataflowrunner">
 <h2>Python Pipelines with DataflowRunner<a class="headerlink" href="#python-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -654,7 +654,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -699,7 +699,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-directrunner">
 <h2>Java Pipelines with DirectRunner<a class="headerlink" href="#java-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_direct_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DIRECT_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -725,7 +725,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-dataflowrunner">
 <h2>Java Pipelines with DataflowRunner<a class="headerlink" href="#java-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/1.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_dataflow_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DATAFLOW_RUNNER_BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-beam/2.0.0/operators.html b/docs-archive/apache-airflow-providers-apache-beam/2.0.0/operators.html
index efde990dea..11475e4f87 100644
--- a/docs-archive/apache-airflow-providers-apache-beam/2.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-beam/2.0.0/operators.html
@@ -602,7 +602,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-directrunner">
 <h2>Python Pipelines with DirectRunner<a class="headerlink" href="#python-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/2.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="s1">&#39;apache_beam.examples.wordcount&#39;</span><span class="p">,</span>
@@ -615,7 +615,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/2.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="n">GCS_PYTHON</span><span class="p">,</span>
@@ -632,7 +632,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-dataflowrunner">
 <h2>Python Pipelines with DataflowRunner<a class="headerlink" href="#python-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/2.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -654,7 +654,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/2.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -699,7 +699,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-directrunner">
 <h2>Java Pipelines with DirectRunner<a class="headerlink" href="#java-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/2.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_direct_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DIRECT_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -725,7 +725,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-dataflowrunner">
 <h2>Java Pipelines with DataflowRunner<a class="headerlink" href="#java-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/2.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_dataflow_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DATAFLOW_RUNNER_BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-beam/3.0.0/operators.html b/docs-archive/apache-airflow-providers-apache-beam/3.0.0/operators.html
index 1f44f2e62c..dffc8eb9dd 100644
--- a/docs-archive/apache-airflow-providers-apache-beam/3.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-beam/3.0.0/operators.html
@@ -601,7 +601,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-directrunner">
 <h2>Python Pipelines with DirectRunner<a class="headerlink" href="#python-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="s1">&#39;apache_beam.examples.wordcount&#39;</span><span class="p">,</span>
@@ -614,7 +614,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="n">GCS_PYTHON</span><span class="p">,</span>
@@ -631,7 +631,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-dataflowrunner">
 <h2>Python Pipelines with DataflowRunner<a class="headerlink" href="#python-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -653,7 +653,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -698,7 +698,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-directrunner">
 <h2>Java Pipelines with DirectRunner<a class="headerlink" href="#java-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_direct_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DIRECT_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -724,7 +724,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-dataflowrunner">
 <h2>Java Pipelines with DataflowRunner<a class="headerlink" href="#java-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_dataflow_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DATAFLOW_RUNNER_BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-beam/3.0.1/operators.html b/docs-archive/apache-airflow-providers-apache-beam/3.0.1/operators.html
index 100b6ded38..4e504b20ff 100644
--- a/docs-archive/apache-airflow-providers-apache-beam/3.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-beam/3.0.1/operators.html
@@ -603,7 +603,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-directrunner">
 <h2>Python Pipelines with DirectRunner<a class="headerlink" href="#python-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="s1">&#39;apache_beam.examples.wordcount&#39;</span><span class="p">,</span>
@@ -616,7 +616,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="n">GCS_PYTHON</span><span class="p">,</span>
@@ -633,7 +633,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-dataflowrunner">
 <h2>Python Pipelines with DataflowRunner<a class="headerlink" href="#python-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -655,7 +655,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -700,7 +700,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-directrunner">
 <h2>Java Pipelines with DirectRunner<a class="headerlink" href="#java-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_direct_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DIRECT_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -726,7 +726,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-dataflowrunner">
 <h2>Java Pipelines with DataflowRunner<a class="headerlink" href="#java-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.0.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_dataflow_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DATAFLOW_RUNNER_BUCKET_NAME</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-beam/3.1.0/operators.html b/docs-archive/apache-airflow-providers-apache-beam/3.1.0/operators.html
index 36316cc7fd..c9e6110307 100644
--- a/docs-archive/apache-airflow-providers-apache-beam/3.1.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-beam/3.1.0/operators.html
@@ -606,7 +606,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-directrunner">
 <h2>Python Pipelines with DirectRunner<a class="headerlink" href="#python-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.1.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="s1">&#39;apache_beam.examples.wordcount&#39;</span><span class="p">,</span>
@@ -619,7 +619,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.1.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="n">GCS_PYTHON</span><span class="p">,</span>
@@ -636,7 +636,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-dataflowrunner">
 <h2>Python Pipelines with DataflowRunner<a class="headerlink" href="#python-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.1.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -658,7 +658,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.1.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -703,7 +703,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-directrunner">
 <h2>Java Pipelines with DirectRunner<a class="headerlink" href="#java-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.1.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_direct_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DIRECT_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -729,7 +729,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-dataflowrunner">
 <h2>Java Pipelines with DataflowRunner<a class="headerlink" href="#java-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.1.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_dataflow_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DATAFLOW_RUNNER_BUCKET_NAME</span><span class="p">,</span>
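
Every hunk in this diff applies the same mechanical substitution: the relative Sphinx viewcode href (`_modules/….html`) is replaced by a versioned GitHub URL built from the provider release tag and the original module path. A minimal sketch of such a rewrite is below; the function name, signature, and regex are illustrative assumptions, not the actual scripts (those live in the apache/airflow pull request referenced in the commit message):

```python
import re

# Matches a Sphinx viewcode link of the form produced in these docs:
#   href="_modules/<module/path>.html"><span class="viewcode-link">[source]</span>
# The pattern is an assumption based on the "-" lines in the hunks above.
VIEWCODE_RE = re.compile(
    r'href="_modules/(?P<path>[^"]+)\.html">'
    r'<span class="viewcode-link">\[source\]</span>'
)


def rewrite_source_links(html: str, tag: str) -> str:
    """Replace Sphinx viewcode hrefs with versioned GitHub links.

    ``tag`` is a provider release tag such as ``providers-apache-beam/3.2.0``,
    so the rewritten link points at the source as it was for that release.
    """
    def repl(m: re.Match) -> str:
        url = (
            "https://github.com/apache/airflow/tree/"
            f"{tag}/{m.group('path')}.py"
        )
        # Open the link in a new tab, since it now leaves the docs site.
        return (
            f'href="{url}" target="_blank">'
            '<span class="viewcode-link">[source]</span>'
        )

    return VIEWCODE_RE.sub(repl, html)
```

Running this over an archived `operators.html` with the matching tag yields exactly the kind of `-`/`+` pairs shown in the hunks above, one substitution per example block.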
diff --git a/docs-archive/apache-airflow-providers-apache-beam/3.2.0/operators.html b/docs-archive/apache-airflow-providers-apache-beam/3.2.0/operators.html
index 9c460c4419..68e68821eb 100644
--- a/docs-archive/apache-airflow-providers-apache-beam/3.2.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-beam/3.2.0/operators.html
@@ -608,7 +608,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-directrunner">
 <h2>Python Pipelines with DirectRunner<a class="headerlink" href="#python-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="s1">&#39;apache_beam.examples.wordcount&#39;</span><span class="p">,</span>
@@ -621,7 +621,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="n">GCS_PYTHON</span><span class="p">,</span>
@@ -638,7 +638,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-dataflowrunner">
 <h2>Python Pipelines with DataflowRunner<a class="headerlink" href="#python-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -660,7 +660,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -709,7 +709,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-directrunner">
 <h2>Java Pipelines with DirectRunner<a class="headerlink" href="#java-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_direct_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DIRECT_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -735,7 +735,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-dataflowrunner">
 <h2>Java Pipelines with DataflowRunner<a class="headerlink" href="#java-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_dataflow_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DATAFLOW_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -777,7 +777,7 @@ init the module and install dependencies with <code class="docutils literal notr
 <div class="section" id="go-pipelines-with-directrunner">
 <h2>Go Pipelines with DirectRunner<a class="headerlink" href="#go-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">go_file</span><span class="o">=</span><span class="s1">&#39;files/apache_beam/examples/wordcount.go&#39;</span><span class="p">,</span>
@@ -786,7 +786,7 @@ init the module and install dependencies with <code class="docutils literal notr
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">go_file</span><span class="o">=</span><span class="n">GCS_GO</span><span class="p">,</span>
@@ -799,7 +799,7 @@ init the module and install dependencies with <code class="docutils literal notr
 <div class="section" id="go-pipelines-with-dataflowrunner">
 <h2>Go Pipelines with DataflowRunner<a class="headerlink" href="#go-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -818,7 +818,7 @@ init the module and install dependencies with <code class="docutils literal notr
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-beam/3.2.1/operators.html b/docs-archive/apache-airflow-providers-apache-beam/3.2.1/operators.html
index 33f1e9c825..d4326b510c 100644
--- a/docs-archive/apache-airflow-providers-apache-beam/3.2.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-beam/3.2.1/operators.html
@@ -608,7 +608,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-directrunner">
 <h2>Python Pipelines with DirectRunner<a class="headerlink" href="#python-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="s1">&#39;apache_beam.examples.wordcount&#39;</span><span class="p">,</span>
@@ -621,7 +621,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="n">GCS_PYTHON</span><span class="p">,</span>
@@ -638,7 +638,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-dataflowrunner">
 <h2>Python Pipelines with DataflowRunner<a class="headerlink" href="#python-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -660,7 +660,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -709,7 +709,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-directrunner">
 <h2>Java Pipelines with DirectRunner<a class="headerlink" href="#java-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_direct_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DIRECT_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -735,7 +735,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-dataflowrunner">
 <h2>Java Pipelines with DataflowRunner<a class="headerlink" href="#java-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_dataflow_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DATAFLOW_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -777,7 +777,7 @@ init the module and install dependencies with <code class="docutils literal notr
 <div class="section" id="go-pipelines-with-directrunner">
 <h2>Go Pipelines with DirectRunner<a class="headerlink" href="#go-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">go_file</span><span class="o">=</span><span class="s1">&#39;files/apache_beam/examples/wordcount.go&#39;</span><span class="p">,</span>
@@ -786,7 +786,7 @@ init the module and install dependencies with <code class="docutils literal notr
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">go_file</span><span class="o">=</span><span class="n">GCS_GO</span><span class="p">,</span>
@@ -799,7 +799,7 @@ init the module and install dependencies with <code class="docutils literal notr
 <div class="section" id="go-pipelines-with-dataflowrunner">
 <h2>Go Pipelines with DataflowRunner<a class="headerlink" href="#go-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -818,7 +818,7 @@ init the module and install dependencies with <code class="docutils literal notr
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.2.1/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-beam/3.3.0/operators.html b/docs-archive/apache-airflow-providers-apache-beam/3.3.0/operators.html
index ed765dbc2c..61b54541d2 100644
--- a/docs-archive/apache-airflow-providers-apache-beam/3.3.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-beam/3.3.0/operators.html
@@ -608,7 +608,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-directrunner">
 <h2>Python Pipelines with DirectRunner<a class="headerlink" href="#python-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.3.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="s1">&#39;apache_beam.examples.wordcount&#39;</span><span class="p">,</span>
@@ -621,7 +621,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.3.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="n">GCS_PYTHON</span><span class="p">,</span>
@@ -638,7 +638,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-dataflowrunner">
 <h2>Python Pipelines with DataflowRunner<a class="headerlink" href="#python-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.3.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -660,7 +660,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.3.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -709,7 +709,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-directrunner">
 <h2>Java Pipelines with DirectRunner<a class="headerlink" href="#java-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.3.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_direct_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DIRECT_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -735,7 +735,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-dataflowrunner">
 <h2>Java Pipelines with DataflowRunner<a class="headerlink" href="#java-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.3.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_dataflow_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DATAFLOW_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -777,7 +777,7 @@ init the module and install dependencies with <code class="docutils literal notr
 <div class="section" id="go-pipelines-with-directrunner">
 <h2>Go Pipelines with DirectRunner<a class="headerlink" href="#go-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.3.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">go_file</span><span class="o">=</span><span class="s1">&#39;files/apache_beam/examples/wordcount.go&#39;</span><span class="p">,</span>
@@ -786,7 +786,7 @@ init the module and install dependencies with <code class="docutils literal notr
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.3.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">go_file</span><span class="o">=</span><span class="n">GCS_GO</span><span class="p">,</span>
@@ -799,7 +799,7 @@ init the module and install dependencies with <code class="docutils literal notr
 <div class="section" id="go-pipelines-with-dataflowrunner">
 <h2>Go Pipelines with DataflowRunner<a class="headerlink" href="#go-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.3.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -818,7 +818,7 @@ init the module and install dependencies with <code class="docutils literal notr
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.3.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-beam/3.4.0/operators.html b/docs-archive/apache-airflow-providers-apache-beam/3.4.0/operators.html
index 381c7f036b..6b0f640b99 100644
--- a/docs-archive/apache-airflow-providers-apache-beam/3.4.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-beam/3.4.0/operators.html
@@ -610,7 +610,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-directrunner">
 <h2>Python Pipelines with DirectRunner<a class="headerlink" href="#python-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.4.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="s1">&#39;apache_beam.examples.wordcount&#39;</span><span class="p">,</span>
@@ -623,7 +623,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.4.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">py_file</span><span class="o">=</span><span class="n">GCS_PYTHON</span><span class="p">,</span>
@@ -640,7 +640,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 <div class="section" id="python-pipelines-with-dataflowrunner">
 <h2>Python Pipelines with DataflowRunner<a class="headerlink" href="#python-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.4.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -662,7 +662,7 @@ recommend avoiding unless the Dataflow job requires it.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.4.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_python_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunPythonPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_python_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -711,7 +711,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-directrunner">
 <h2>Java Pipelines with DirectRunner<a class="headerlink" href="#java-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.4.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_direct_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DIRECT_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -737,7 +737,7 @@ has the ability to download or available on the local filesystem (provide the ab
 <div class="section" id="java-pipelines-with-dataflowrunner">
 <h2>Java Pipelines with DataflowRunner<a class="headerlink" href="#java-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.4.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jar_to_local_dataflow_runner</span> <span class="o">=</span> <span class="n">GCSToLocalFilesystemOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;jar_to_local_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">GCS_JAR_DATAFLOW_RUNNER_BUCKET_NAME</span><span class="p">,</span>
@@ -779,7 +779,7 @@ init the module and install dependencies with <code class="docutils literal notr
 <div class="section" id="go-pipelines-with-directrunner">
 <h2>Go Pipelines with DirectRunner<a class="headerlink" href="#go-pipelines-with-directrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.4.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_local_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_local_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">go_file</span><span class="o">=</span><span class="s1">&#39;files/apache_beam/examples/wordcount.go&#39;</span><span class="p">,</span>
@@ -788,7 +788,7 @@ init the module and install dependencies with <code class="docutils literal notr
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.4.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_direct_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_direct_runner&quot;</span><span class="p">,</span>
     <span class="n">go_file</span><span class="o">=</span><span class="n">GCS_GO</span><span class="p">,</span>
@@ -801,7 +801,7 @@ init the module and install dependencies with <code class="docutils literal notr
 <div class="section" id="go-pipelines-with-dataflowrunner">
 <h2>Go Pipelines with DataflowRunner<a class="headerlink" href="#go-pipelines-with-dataflowrunner" title="Permalink to this headline">¶</a></h2>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.4.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_pipeline_dataflow_runner</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_pipeline_dataflow_runner&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
@@ -820,7 +820,7 @@ init the module and install dependencies with <code class="docutils literal notr
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/beam/example_dags/example_beam.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/beam/example_dags/example_beam.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-beam/3.4.0/airflow/providers/apache/beam/example_dags/example_beam.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">start_go_job_dataflow_runner_async</span> <span class="o">=</span> <span class="n">BeamRunGoPipelineOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;start_go_job_dataflow_runner_async&quot;</span><span class="p">,</span>
     <span class="n">runner</span><span class="o">=</span><span class="s2">&quot;DataflowRunner&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-cassandra/1.0.0/operators.html b/docs-archive/apache-airflow-providers-apache-cassandra/1.0.0/operators.html
index 4bf26b7c41..7895d53883 100644
--- a/docs-archive/apache-airflow-providers-apache-cassandra/1.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-cassandra/1.0.0/operators.html
@@ -588,7 +588,7 @@
 <p>The <a class="reference internal" href="_api/airflow/providers/apache/cassandra/sensors/table/index.html#airflow.providers.apache.cassandra.sensors.table.CassandraTableSensor" title="airflow.providers.apache.cassandra.sensors.table.CassandraTableSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">CassandraTableSensor</span></code></a> operator is used to check for the existence of a table in a Cassandra cluster.</p>
 <p>Use the <code class="docutils literal notranslate"><span class="pre">table</span></code> parameter to poke until the provided table is found. Use dot notation to target a specific keyspace.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/1.0.0/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">table_sensor</span> <span class="o">=</span> <span class="n">CassandraTableSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;cassandra_table_sensor&quot;</span><span class="p">,</span>
     <span class="n">cassandra_conn_id</span><span class="o">=</span><span class="s2">&quot;cassandra_default&quot;</span><span class="p">,</span>
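The dot notation the docs describe for the `table` parameter can be sketched as a small parser (a simplification: the default keyspace here is a placeholder, whereas the real sensor resolves it from the Cassandra connection):

```python
def split_table(table: str, default_keyspace: str = "default_keyspace") -> tuple:
    # "mykeyspace.mytable" -> ("mykeyspace", "mytable");
    # a bare "mytable" falls back to the default keyspace.
    if "." in table:
        keyspace, _, name = table.partition(".")
        return keyspace, name
    return default_keyspace, table
```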
@@ -604,7 +604,7 @@
 <p>Use the <code class="docutils literal notranslate"><span class="pre">table</span></code> parameter to specify the keyspace and table for the record. Use dot notation to target a specific keyspace.</p>
 <p>Use the <code class="docutils literal notranslate"><span class="pre">keys</span></code> parameter to poke until the provided record is found. The existence of a record is identified using key-value pairs. In the given example, we are looking for value <code class="docutils literal notranslate"><span class="pre">v1</span></code> in column <code class="docutils literal notranslate"><span class="pre">p1</span></code> and <code class="docutils literal notranslate"><span class="pre">v2</s [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/1.0.0/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">record_sensor</span> <span class="o">=</span> <span class="n">CassandraRecordSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;cassandra_record_sensor&quot;</span><span class="p">,</span>
     <span class="n">cassandra_conn_id</span><span class="o">=</span><span class="s2">&quot;cassandra_default&quot;</span><span class="p">,</span>
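The key-value matching the record sensor performs can be illustrated with a plain dict check (a simplification: the real sensor turns these pairs into the WHERE clause of a Cassandra query rather than filtering rows in Python):

```python
def record_matches(row: dict, keys: dict) -> bool:
    # A row satisfies the sensor when every (column, value) pair in
    # `keys` appears in it with exactly that value.
    return all(row.get(column) == value for column, value in keys.items())


# Matches the docs' example: value v1 in column p1 and v2 in column p2.
print(record_matches({"p1": "v1", "p2": "v2", "extra": 1},
                     {"p1": "v1", "p2": "v2"}))
```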
diff --git a/docs-archive/apache-airflow-providers-apache-cassandra/1.0.1/operators.html b/docs-archive/apache-airflow-providers-apache-cassandra/1.0.1/operators.html
index 27f7eabb11..c9a26c7ff7 100644
--- a/docs-archive/apache-airflow-providers-apache-cassandra/1.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-cassandra/1.0.1/operators.html
@@ -598,7 +598,7 @@
 <p>The <a class="reference internal" href="_api/airflow/providers/apache/cassandra/sensors/table/index.html#airflow.providers.apache.cassandra.sensors.table.CassandraTableSensor" title="airflow.providers.apache.cassandra.sensors.table.CassandraTableSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">CassandraTableSensor</span></code></a> operator is used to check for the existence of a table in a Cassandra cluster.</p>
 <p>Use the <code class="docutils literal notranslate"><span class="pre">table</span></code> parameter to poke until the provided table is found. Use dot notation to target a specific keyspace.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/1.0.1/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">table_sensor</span> <span class="o">=</span> <span class="n">CassandraTableSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;cassandra_table_sensor&quot;</span><span class="p">,</span>
     <span class="n">cassandra_conn_id</span><span class="o">=</span><span class="s2">&quot;cassandra_default&quot;</span><span class="p">,</span>
@@ -614,7 +614,7 @@
 <p>Use the <code class="docutils literal notranslate"><span class="pre">table</span></code> parameter to specify the keyspace and table for the record. Use dot notation to target a specific keyspace.</p>
 <p>Use the <code class="docutils literal notranslate"><span class="pre">keys</span></code> parameter to poke until the provided record is found. The existence of a record is identified using key-value pairs. In the given example, we are looking for value <code class="docutils literal notranslate"><span class="pre">v1</span></code> in column <code class="docutils literal notranslate"><span class="pre">p1</span></code> and <code class="docutils literal notranslate"><span class="pre">v2</s [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/1.0.1/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">record_sensor</span> <span class="o">=</span> <span class="n">CassandraRecordSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;cassandra_record_sensor&quot;</span><span class="p">,</span>
     <span class="n">cassandra_conn_id</span><span class="o">=</span><span class="s2">&quot;cassandra_default&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-cassandra/2.0.0/operators.html b/docs-archive/apache-airflow-providers-apache-cassandra/2.0.0/operators.html
index f3d047388b..cc31935875 100644
--- a/docs-archive/apache-airflow-providers-apache-cassandra/2.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-cassandra/2.0.0/operators.html
@@ -597,7 +597,7 @@
 <p>The <a class="reference internal" href="_api/airflow/providers/apache/cassandra/sensors/table/index.html#airflow.providers.apache.cassandra.sensors.table.CassandraTableSensor" title="airflow.providers.apache.cassandra.sensors.table.CassandraTableSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">CassandraTableSensor</span></code></a> operator is used to check for the existence of a table in a Cassandra cluster.</p>
 <p>Use the <code class="docutils literal notranslate"><span class="pre">table</span></code> parameter to poke until the provided table is found. Use dot notation to target a specific keyspace.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/2.0.0/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">table_sensor</span> <span class="o">=</span> <span class="n">CassandraTableSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;cassandra_table_sensor&quot;</span><span class="p">,</span>
     <span class="n">cassandra_conn_id</span><span class="o">=</span><span class="s2">&quot;cassandra_default&quot;</span><span class="p">,</span>
@@ -613,7 +613,7 @@
 <p>Use the <code class="docutils literal notranslate"><span class="pre">table</span></code> parameter to specify the keyspace and table for the record. Use dot notation to target a specific keyspace.</p>
 <p>Use the <code class="docutils literal notranslate"><span class="pre">keys</span></code> parameter to poke until the provided record is found. The existence of a record is identified using key-value pairs. In the given example, we are looking for value <code class="docutils literal notranslate"><span class="pre">v1</span></code> in column <code class="docutils literal notranslate"><span class="pre">p1</span></code> and <code class="docutils literal notranslate"><span class="pre">v2</s [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/2.0.0/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">record_sensor</span> <span class="o">=</span> <span class="n">CassandraRecordSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;cassandra_record_sensor&quot;</span><span class="p">,</span>
     <span class="n">cassandra_conn_id</span><span class="o">=</span><span class="s2">&quot;cassandra_default&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-cassandra/2.0.1/operators.html b/docs-archive/apache-airflow-providers-apache-cassandra/2.0.1/operators.html
index 3210929c93..9f80895995 100644
--- a/docs-archive/apache-airflow-providers-apache-cassandra/2.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-cassandra/2.0.1/operators.html
@@ -597,7 +597,7 @@
 <p>The <a class="reference internal" href="_api/airflow/providers/apache/cassandra/sensors/table/index.html#airflow.providers.apache.cassandra.sensors.table.CassandraTableSensor" title="airflow.providers.apache.cassandra.sensors.table.CassandraTableSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">CassandraTableSensor</span></code></a> operator is used to check for the existence of a table in a Cassandra cluster.</p>
 <p>Use the <code class="docutils literal notranslate"><span class="pre">table</span></code> parameter to poke until the provided table is found. Use dot notation to target a specific keyspace.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/2.0.1/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">table_sensor</span> <span class="o">=</span> <span class="n">CassandraTableSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;cassandra_table_sensor&quot;</span><span class="p">,</span>
     <span class="n">cassandra_conn_id</span><span class="o">=</span><span class="s2">&quot;cassandra_default&quot;</span><span class="p">,</span>
@@ -613,7 +613,7 @@
 <p>Use the <code class="docutils literal notranslate"><span class="pre">table</span></code> parameter to specify the keyspace and table for the record. Use dot notation to target a specific keyspace.</p>
 <p>Use the <code class="docutils literal notranslate"><span class="pre">keys</span></code> parameter to poke until the provided record is found. The existence of a record is identified using key-value pairs. In the given example, we are looking for value <code class="docutils literal notranslate"><span class="pre">v1</span></code> in column <code class="docutils literal notranslate"><span class="pre">p1</span></code> and <code class="docutils literal notranslate"><span class="pre">v2</s [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/2.0.1/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">record_sensor</span> <span class="o">=</span> <span class="n">CassandraRecordSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;cassandra_record_sensor&quot;</span><span class="p">,</span>
     <span class="n">cassandra_conn_id</span><span class="o">=</span><span class="s2">&quot;cassandra_default&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-cassandra/2.1.0/operators.html b/docs-archive/apache-airflow-providers-apache-cassandra/2.1.0/operators.html
index 71f27ff71c..6bad0e23db 100644
--- a/docs-archive/apache-airflow-providers-apache-cassandra/2.1.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-cassandra/2.1.0/operators.html
@@ -606,7 +606,7 @@
 <div class="section" id="example-use-of-these-sensors">
 <h3>Example use of these sensors<a class="headerlink" href="#example-use-of-these-sensors" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/2.1.0/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;example_cassandra_operator&#39;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-cassandra/2.1.1/operators.html b/docs-archive/apache-airflow-providers-apache-cassandra/2.1.1/operators.html
index 55aab546fc..c1e87a4dc8 100644
--- a/docs-archive/apache-airflow-providers-apache-cassandra/2.1.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-cassandra/2.1.1/operators.html
@@ -606,7 +606,7 @@
 <div class="section" id="example-use-of-these-sensors">
 <h3>Example use of these sensors<a class="headerlink" href="#example-use-of-these-sensors" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/2.1.1/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;example_cassandra_operator&#39;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-cassandra/2.1.2/operators.html b/docs-archive/apache-airflow-providers-apache-cassandra/2.1.2/operators.html
index 291cf6de53..6b958d7aa1 100644
--- a/docs-archive/apache-airflow-providers-apache-cassandra/2.1.2/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-cassandra/2.1.2/operators.html
@@ -606,7 +606,7 @@
 <div class="section" id="example-use-of-these-sensors">
 <h3>Example use of these sensors<a class="headerlink" href="#example-use-of-these-sensors" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/2.1.2/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;example_cassandra_operator&#39;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-cassandra/2.1.3/operators.html b/docs-archive/apache-airflow-providers-apache-cassandra/2.1.3/operators.html
index 1243317085..93e8d4e152 100644
--- a/docs-archive/apache-airflow-providers-apache-cassandra/2.1.3/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-cassandra/2.1.3/operators.html
@@ -608,7 +608,7 @@
 <div class="section" id="example-use-of-these-sensors">
 <h3>Example use of these sensors<a class="headerlink" href="#example-use-of-these-sensors" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-cassandra/2.1.3/airflow/providers/apache/cassandra/example_dags/example_cassandra_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">with</span> <span class="n">DAG</span><span class="p">(</span>
     <span class="n">dag_id</span><span class="o">=</span><span class="s1">&#39;example_cassandra_operator&#39;</span><span class="p">,</span>
     <span class="n">schedule_interval</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-drill/1.0.0/operators.html b/docs-archive/apache-airflow-providers-apache-drill/1.0.0/operators.html
index c86088b8ae..2be3afb6f8 100644
--- a/docs-archive/apache-airflow-providers-apache-drill/1.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-drill/1.0.0/operators.html
@@ -599,7 +599,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/drill/example_dags/example_drill_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/drill/example_dags/example_drill_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/drill/example_dags/example_drill_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-drill/1.0.0/airflow/providers/apache/drill/example_dags/example_drill_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_task</span> <span class="o">=</span> <span class="n">DrillOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;json_to_parquet_table&#39;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="s1">&#39;&#39;&#39;</span>
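The truncated `sql` argument above performs a JSON-to-Parquet conversion in Drill. A hedged sketch of the kind of statement such a task carries (table and source names are placeholders, not the ones from the truncated example):

```python
def drill_json_to_parquet_sql(target_table: str, source: str) -> str:
    # Drill's CTAS writes output in the session's `store.format`;
    # switch it to parquet, then create the table from the JSON source.
    return (
        "ALTER SESSION SET `store.format` = 'parquet';\n"
        f"CREATE TABLE {target_table} AS SELECT * FROM {source}"
    )


print(drill_json_to_parquet_sql("dfs.tmp.sample", "dfs.`/data/sample.json`"))
```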
diff --git a/docs-archive/apache-airflow-providers-apache-drill/1.0.1/operators.html b/docs-archive/apache-airflow-providers-apache-drill/1.0.1/operators.html
index fdcdf8e1b2..305e8ed476 100644
--- a/docs-archive/apache-airflow-providers-apache-drill/1.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-drill/1.0.1/operators.html
@@ -598,7 +598,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/drill/example_dags/example_drill_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/drill/example_dags/example_drill_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/drill/example_dags/example_drill_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-drill/1.0.1/airflow/providers/apache/drill/example_dags/example_drill_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_task</span> <span class="o">=</span> <span class="n">DrillOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;json_to_parquet_table&#39;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="s1">&#39;&#39;&#39;</span>
diff --git a/docs-archive/apache-airflow-providers-apache-drill/1.0.2/operators.html b/docs-archive/apache-airflow-providers-apache-drill/1.0.2/operators.html
index cd3140a994..0e4d42bdea 100644
--- a/docs-archive/apache-airflow-providers-apache-drill/1.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-drill/1.0.2/operators.html
@@ -598,7 +598,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/drill/example_dags/example_drill_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/drill/example_dags/example_drill_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/drill/example_dags/example_drill_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-drill/1.0.2/airflow/providers/apache/drill/example_dags/example_drill_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_task</span> <span class="o">=</span> <span class="n">DrillOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;json_to_parquet_table&#39;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="s1">&#39;&#39;&#39;</span>
diff --git a/docs-archive/apache-airflow-providers-apache-drill/1.0.3/operators.html b/docs-archive/apache-airflow-providers-apache-drill/1.0.3/operators.html
index 88034022b1..af1287de8c 100644
--- a/docs-archive/apache-airflow-providers-apache-drill/1.0.3/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-drill/1.0.3/operators.html
@@ -598,7 +598,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/drill/example_dags/example_drill_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/drill/example_dags/example_drill_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/drill/example_dags/example_drill_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-drill/1.0.3/airflow/providers/apache/drill/example_dags/example_drill_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_task</span> <span class="o">=</span> <span class="n">DrillOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;json_to_parquet_table&#39;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="s1">&#39;&#39;&#39;</span>
diff --git a/docs-archive/apache-airflow-providers-apache-drill/1.0.4/operators.html b/docs-archive/apache-airflow-providers-apache-drill/1.0.4/operators.html
index 53e1f166b7..855c100ff1 100644
--- a/docs-archive/apache-airflow-providers-apache-drill/1.0.4/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-drill/1.0.4/operators.html
@@ -600,7 +600,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/drill/example_dags/example_drill_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/drill/example_dags/example_drill_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/drill/example_dags/example_drill_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-drill/1.0.4/airflow/providers/apache/drill/example_dags/example_drill_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_task</span> <span class="o">=</span> <span class="n">DrillOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;json_to_parquet_table&#39;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="s1">&#39;&#39;&#39;</span>
diff --git a/docs-archive/apache-airflow-providers-apache-druid/2.1.0/operators.html b/docs-archive/apache-airflow-providers-apache-druid/2.1.0/operators.html
index a9223ae6cd..1bc0c39c3a 100644
--- a/docs-archive/apache-airflow-providers-apache-druid/2.1.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-druid/2.1.0/operators.html
@@ -589,7 +589,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/druid/example_dags/example_druid_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-druid/2.1.0/airflow/providers/apache/druid/example_dags/example_druid_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">DruidOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;spark_submit_job&#39;</span><span class="p">,</span> <span class="n">json_index_file</span><span class="o">=</span><span class="s1">&#39;json_index.json&#39;</span><span class="p">)</span>
 <span class="c1"># Example content of json_index.json:</span>
 <span class="n">JSON_INDEX_STR</span> <span class="o">=</span> <span class="s2">&quot;&quot;&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-apache-druid/2.2.0/operators.html b/docs-archive/apache-airflow-providers-apache-druid/2.2.0/operators.html
index c5d7a7bc2a..16dc233f8f 100644
--- a/docs-archive/apache-airflow-providers-apache-druid/2.2.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-druid/2.2.0/operators.html
@@ -592,7 +592,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/druid/example_dags/example_druid_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-druid/2.2.0/airflow/providers/apache/druid/example_dags/example_druid_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">DruidOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;spark_submit_job&#39;</span><span class="p">,</span> <span class="n">json_index_file</span><span class="o">=</span><span class="s1">&#39;json_index.json&#39;</span><span class="p">)</span>
 <span class="c1"># Example content of json_index.json:</span>
 <span class="n">JSON_INDEX_STR</span> <span class="o">=</span> <span class="s2">&quot;&quot;&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-apache-druid/2.3.0/operators.html b/docs-archive/apache-airflow-providers-apache-druid/2.3.0/operators.html
index 8e79391e68..8b03cc1674 100644
--- a/docs-archive/apache-airflow-providers-apache-druid/2.3.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-druid/2.3.0/operators.html
@@ -592,7 +592,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/druid/example_dags/example_druid_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-druid/2.3.0/airflow/providers/apache/druid/example_dags/example_druid_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">DruidOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;spark_submit_job&#39;</span><span class="p">,</span> <span class="n">json_index_file</span><span class="o">=</span><span class="s1">&#39;json_index.json&#39;</span><span class="p">)</span>
 <span class="c1"># Example content of json_index.json:</span>
 <span class="n">JSON_INDEX_STR</span> <span class="o">=</span> <span class="s2">&quot;&quot;&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-apache-druid/2.3.1/operators.html b/docs-archive/apache-airflow-providers-apache-druid/2.3.1/operators.html
index 7d7f3ed814..8f1e02906d 100644
--- a/docs-archive/apache-airflow-providers-apache-druid/2.3.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-druid/2.3.1/operators.html
@@ -592,7 +592,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/druid/example_dags/example_druid_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-druid/2.3.1/airflow/providers/apache/druid/example_dags/example_druid_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">DruidOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;spark_submit_job&#39;</span><span class="p">,</span> <span class="n">json_index_file</span><span class="o">=</span><span class="s1">&#39;json_index.json&#39;</span><span class="p">)</span>
 <span class="c1"># Example content of json_index.json:</span>
 <span class="n">JSON_INDEX_STR</span> <span class="o">=</span> <span class="s2">&quot;&quot;&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-apache-druid/2.3.2/operators.html b/docs-archive/apache-airflow-providers-apache-druid/2.3.2/operators.html
index 8865c63174..604238c41d 100644
--- a/docs-archive/apache-airflow-providers-apache-druid/2.3.2/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-druid/2.3.2/operators.html
@@ -592,7 +592,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/druid/example_dags/example_druid_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-druid/2.3.2/airflow/providers/apache/druid/example_dags/example_druid_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">DruidOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;spark_submit_job&#39;</span><span class="p">,</span> <span class="n">json_index_file</span><span class="o">=</span><span class="s1">&#39;json_index.json&#39;</span><span class="p">)</span>
 <span class="c1"># Example content of json_index.json:</span>
 <span class="n">JSON_INDEX_STR</span> <span class="o">=</span> <span class="s2">&quot;&quot;&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-apache-druid/2.3.3/operators.html b/docs-archive/apache-airflow-providers-apache-druid/2.3.3/operators.html
index fe2d63d566..098668a288 100644
--- a/docs-archive/apache-airflow-providers-apache-druid/2.3.3/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-druid/2.3.3/operators.html
@@ -594,7 +594,7 @@
 <div class="section" id="using-the-operator">
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/druid/example_dags/example_druid_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/druid/example_dags/example_druid_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-druid/2.3.3/airflow/providers/apache/druid/example_dags/example_druid_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">DruidOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;spark_submit_job&#39;</span><span class="p">,</span> <span class="n">json_index_file</span><span class="o">=</span><span class="s1">&#39;json_index.json&#39;</span><span class="p">)</span>
 <span class="c1"># Example content of json_index.json:</span>
 <span class="n">JSON_INDEX_STR</span> <span class="o">=</span> <span class="s2">&quot;&quot;&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-apache-hive/2.3.0/operators.html b/docs-archive/apache-airflow-providers-apache-hive/2.3.0/operators.html
index 633a142e42..7b605be4a4 100644
--- a/docs-archive/apache-airflow-providers-apache-hive/2.3.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-hive/2.3.0/operators.html
@@ -595,7 +595,7 @@ Structure can be projected onto data already in storage.</p>
 <h2>HiveOperator<a class="headerlink" href="#hiveoperator" title="Permalink to this headline">¶</a></h2>
 <p>This operator executes hql code or hive script in a specific Hive database.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/hive/example_dags/example_twitter_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/hive/example_dags/example_twitter_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/hive/example_dags/example_twitter_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-hive/2.3.0/airflow/providers/apache/hive/example_dags/example_twitter_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>        <span class="n">load_to_hive</span> <span class="o">=</span> <span class="n">HiveOperator</span><span class="p">(</span>
             <span class="n">task_id</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;load_</span><span class="si">{</span><span class="n">channel</span><span class="si">}</span><span class="s2">_to_hive&quot;</span><span class="p">,</span>
             <span class="n">hql</span><span class="o">=</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-apache-hive/2.3.1/operators.html b/docs-archive/apache-airflow-providers-apache-hive/2.3.1/operators.html
index 0adad4737f..47d5b0cb6a 100644
--- a/docs-archive/apache-airflow-providers-apache-hive/2.3.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-hive/2.3.1/operators.html
@@ -595,7 +595,7 @@ Structure can be projected onto data already in storage.</p>
 <h2>HiveOperator<a class="headerlink" href="#hiveoperator" title="Permalink to this headline">¶</a></h2>
 <p>This operator executes hql code or hive script in a specific Hive database.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/hive/example_dags/example_twitter_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/hive/example_dags/example_twitter_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/hive/example_dags/example_twitter_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-hive/2.3.1/airflow/providers/apache/hive/example_dags/example_twitter_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>        <span class="n">load_to_hive</span> <span class="o">=</span> <span class="n">HiveOperator</span><span class="p">(</span>
             <span class="n">task_id</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;load_</span><span class="si">{</span><span class="n">channel</span><span class="si">}</span><span class="s2">_to_hive&quot;</span><span class="p">,</span>
             <span class="n">hql</span><span class="o">=</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-apache-hive/2.3.2/operators.html b/docs-archive/apache-airflow-providers-apache-hive/2.3.2/operators.html
index 904c36285b..f937da5a2f 100644
--- a/docs-archive/apache-airflow-providers-apache-hive/2.3.2/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-hive/2.3.2/operators.html
@@ -595,7 +595,7 @@ Structure can be projected onto data already in storage.</p>
 <h2>HiveOperator<a class="headerlink" href="#hiveoperator" title="Permalink to this headline">¶</a></h2>
 <p>This operator executes hql code or hive script in a specific Hive database.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/hive/example_dags/example_twitter_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/hive/example_dags/example_twitter_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/hive/example_dags/example_twitter_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-hive/2.3.2/airflow/providers/apache/hive/example_dags/example_twitter_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>        <span class="n">load_to_hive</span> <span class="o">=</span> <span class="n">HiveOperator</span><span class="p">(</span>
             <span class="n">task_id</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;load_</span><span class="si">{</span><span class="n">channel</span><span class="si">}</span><span class="s2">_to_hive&quot;</span><span class="p">,</span>
             <span class="n">hql</span><span class="o">=</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-apache-hive/2.3.3/operators.html b/docs-archive/apache-airflow-providers-apache-hive/2.3.3/operators.html
index d6a7ad31a1..829591920a 100644
--- a/docs-archive/apache-airflow-providers-apache-hive/2.3.3/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-hive/2.3.3/operators.html
@@ -597,7 +597,7 @@ Structure can be projected onto data already in storage.</p>
 <h2>HiveOperator<a class="headerlink" href="#hiveoperator" title="Permalink to this headline">¶</a></h2>
 <p>This operator executes hql code or hive script in a specific Hive database.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/hive/example_dags/example_twitter_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/hive/example_dags/example_twitter_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/hive/example_dags/example_twitter_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-hive/2.3.3/airflow/providers/apache/hive/example_dags/example_twitter_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>        <span class="n">load_to_hive</span> <span class="o">=</span> <span class="n">HiveOperator</span><span class="p">(</span>
             <span class="n">task_id</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;load_</span><span class="si">{</span><span class="n">channel</span><span class="si">}</span><span class="s2">_to_hive&quot;</span><span class="p">,</span>
             <span class="n">hql</span><span class="o">=</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-apache-livy/2.2.0/operators.html b/docs-archive/apache-airflow-providers-apache-livy/2.2.0/operators.html
index b9ea9af164..d7cd8b2f22 100644
--- a/docs-archive/apache-airflow-providers-apache-livy/2.2.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-livy/2.2.0/operators.html
@@ -593,7 +593,7 @@ as well as Spark Context management, all via a simple REST interface or an RPC c
 <h2>LivyOperator<a class="headerlink" href="#livyoperator" title="Permalink to this headline">¶</a></h2>
 <p>This operator wraps the Apache Livy batch REST API, allowing to submit a Spark application to the underlying cluster.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/livy/example_dags/example_livy.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/livy/example_dags/example_livy.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/livy/example_dags/example_livy.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-livy/2.2.0/airflow/providers/apache/livy/example_dags/example_livy.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">livy_java_task</span> <span class="o">=</span> <span class="n">LivyOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;pi_java_task&#39;</span><span class="p">,</span>
         <span class="n">file</span><span class="o">=</span><span class="s1">&#39;/spark-examples.jar&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-livy/2.2.1/operators.html b/docs-archive/apache-airflow-providers-apache-livy/2.2.1/operators.html
index a8b13d2609..3025ff0e94 100644
--- a/docs-archive/apache-airflow-providers-apache-livy/2.2.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-livy/2.2.1/operators.html
@@ -593,7 +593,7 @@ as well as Spark Context management, all via a simple REST interface or an RPC c
 <h2>LivyOperator<a class="headerlink" href="#livyoperator" title="Permalink to this headline">¶</a></h2>
 <p>This operator wraps the Apache Livy batch REST API, allowing to submit a Spark application to the underlying cluster.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/livy/example_dags/example_livy.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/livy/example_dags/example_livy.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/livy/example_dags/example_livy.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-livy/2.2.1/airflow/providers/apache/livy/example_dags/example_livy.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">livy_java_task</span> <span class="o">=</span> <span class="n">LivyOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;pi_java_task&#39;</span><span class="p">,</span>
         <span class="n">file</span><span class="o">=</span><span class="s1">&#39;/spark-examples.jar&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-livy/2.2.2/operators.html b/docs-archive/apache-airflow-providers-apache-livy/2.2.2/operators.html
index 28e94453e1..e14ab950f8 100644
--- a/docs-archive/apache-airflow-providers-apache-livy/2.2.2/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-livy/2.2.2/operators.html
@@ -593,7 +593,7 @@ as well as Spark Context management, all via a simple REST interface or an RPC c
 <h2>LivyOperator<a class="headerlink" href="#livyoperator" title="Permalink to this headline">¶</a></h2>
 <p>This operator wraps the Apache Livy batch REST API, allowing to submit a Spark application to the underlying cluster.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/livy/example_dags/example_livy.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/livy/example_dags/example_livy.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/livy/example_dags/example_livy.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-livy/2.2.2/airflow/providers/apache/livy/example_dags/example_livy.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">livy_java_task</span> <span class="o">=</span> <span class="n">LivyOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;pi_java_task&#39;</span><span class="p">,</span>
         <span class="n">file</span><span class="o">=</span><span class="s1">&#39;/spark-examples.jar&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-livy/2.2.3/operators.html b/docs-archive/apache-airflow-providers-apache-livy/2.2.3/operators.html
index 30f3925ae3..26811f57cb 100644
--- a/docs-archive/apache-airflow-providers-apache-livy/2.2.3/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-livy/2.2.3/operators.html
@@ -595,7 +595,7 @@ as well as Spark Context management, all via a simple REST interface or an RPC c
 <h2>LivyOperator<a class="headerlink" href="#livyoperator" title="Permalink to this headline">¶</a></h2>
 <p>This operator wraps the Apache Livy batch REST API, allowing to submit a Spark application to the underlying cluster.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/livy/example_dags/example_livy.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/livy/example_dags/example_livy.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/livy/example_dags/example_livy.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-livy/2.2.3/airflow/providers/apache/livy/example_dags/example_livy.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">livy_java_task</span> <span class="o">=</span> <span class="n">LivyOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;pi_java_task&#39;</span><span class="p">,</span>
         <span class="n">file</span><span class="o">=</span><span class="s1">&#39;/spark-examples.jar&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-pig/2.0.2/operators.html b/docs-archive/apache-airflow-providers-apache-pig/2.0.2/operators.html
index 5bcb69a287..a08e7c6589 100644
--- a/docs-archive/apache-airflow-providers-apache-pig/2.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-pig/2.0.2/operators.html
@@ -591,7 +591,7 @@ for expressing data analysis programs, coupled with infrastructure for evaluatin
 Pig programs are amenable to substantial parallelization, which in turns enables them to handle very large data sets.</p>
 <p>use the PigOperator to execute a pig script</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/pig/example_dags/example_pig.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/pig/example_dags/example_pig.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/pig/example_dags/example_pig.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-pig/2.0.2/airflow/providers/apache/pig/example_dags/example_pig.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">run_this</span> <span class="o">=</span> <span class="n">PigOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;run_example_pig_script&quot;</span><span class="p">,</span>
     <span class="n">pig</span><span class="o">=</span><span class="s2">&quot;ls /;&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-pig/2.0.3/operators.html b/docs-archive/apache-airflow-providers-apache-pig/2.0.3/operators.html
index 930e8ae804..d6f20802b1 100644
--- a/docs-archive/apache-airflow-providers-apache-pig/2.0.3/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-pig/2.0.3/operators.html
@@ -591,7 +591,7 @@ for expressing data analysis programs, coupled with infrastructure for evaluatin
 Pig programs are amenable to substantial parallelization, which in turns enables them to handle very large data sets.</p>
 <p>use the PigOperator to execute a pig script</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/pig/example_dags/example_pig.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/pig/example_dags/example_pig.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/pig/example_dags/example_pig.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-pig/2.0.3/airflow/providers/apache/pig/example_dags/example_pig.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">run_this</span> <span class="o">=</span> <span class="n">PigOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;run_example_pig_script&quot;</span><span class="p">,</span>
     <span class="n">pig</span><span class="o">=</span><span class="s2">&quot;ls /;&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-pig/2.0.4/operators.html b/docs-archive/apache-airflow-providers-apache-pig/2.0.4/operators.html
index 83498c102a..9392767174 100644
--- a/docs-archive/apache-airflow-providers-apache-pig/2.0.4/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-pig/2.0.4/operators.html
@@ -593,7 +593,7 @@ for expressing data analysis programs, coupled with infrastructure for evaluatin
 Pig programs are amenable to substantial parallelization, which in turns enables them to handle very large data sets.</p>
 <p>use the PigOperator to execute a pig script</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/pig/example_dags/example_pig.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/pig/example_dags/example_pig.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/pig/example_dags/example_pig.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-pig/2.0.4/airflow/providers/apache/pig/example_dags/example_pig.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">run_this</span> <span class="o">=</span> <span class="n">PigOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;run_example_pig_script&quot;</span><span class="p">,</span>
     <span class="n">pig</span><span class="o">=</span><span class="s2">&quot;ls /;&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/1.0.0/operators.html b/docs-archive/apache-airflow-providers-apache-spark/1.0.0/operators.html
index 27485415ca..e8ea801b7d 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/1.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/1.0.0/operators.html
@@ -595,7 +595,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.0/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -633,7 +633,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.0/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -651,7 +651,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.0/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/1.0.1/operators.html b/docs-archive/apache-airflow-providers-apache-spark/1.0.1/operators.html
index 6ccb3feb62..ee7ae2114c 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/1.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/1.0.1/operators.html
@@ -605,7 +605,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.1/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -643,7 +643,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.1/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -661,7 +661,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.1/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/1.0.2/operators.html b/docs-archive/apache-airflow-providers-apache-spark/1.0.2/operators.html
index 6ccb3feb62..dfb620d90e 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/1.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/1.0.2/operators.html
@@ -605,7 +605,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.2/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -643,7 +643,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.2/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -661,7 +661,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.2/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/1.0.3/operators.html b/docs-archive/apache-airflow-providers-apache-spark/1.0.3/operators.html
index d2ed904b7a..60ba360ed4 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/1.0.3/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/1.0.3/operators.html
@@ -605,7 +605,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.3/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -643,7 +643,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.3/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -661,7 +661,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/1.0.3/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/2.0.0/operators.html b/docs-archive/apache-airflow-providers-apache-spark/2.0.0/operators.html
index b5a12451d6..d7a5d99079 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/2.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/2.0.0/operators.html
@@ -604,7 +604,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.0/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -642,7 +642,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.0/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -660,7 +660,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.0/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/2.0.1/operators.html b/docs-archive/apache-airflow-providers-apache-spark/2.0.1/operators.html
index 0c40fbe4b7..ac9cd37a83 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/2.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/2.0.1/operators.html
@@ -598,7 +598,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.1/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -636,7 +636,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.1/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -654,7 +654,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.1/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/2.0.2/operators.html b/docs-archive/apache-airflow-providers-apache-spark/2.0.2/operators.html
index 801fa33fc4..ef3e49ca92 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/2.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/2.0.2/operators.html
@@ -598,7 +598,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using the <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, it is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.2/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -636,7 +636,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.2/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -654,7 +654,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.2/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/2.0.3/operators.html b/docs-archive/apache-airflow-providers-apache-spark/2.0.3/operators.html
index 863f271069..03f2234024 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/2.0.3/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/2.0.3/operators.html
@@ -601,7 +601,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using the <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, it is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.3/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -639,7 +639,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.3/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -657,7 +657,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.0.3/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/2.1.0/operators.html b/docs-archive/apache-airflow-providers-apache-spark/2.1.0/operators.html
index 0717c3a347..67006cc44e 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/2.1.0/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/2.1.0/operators.html
@@ -601,7 +601,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using the <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, it is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.0/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -639,7 +639,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.0/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -657,7 +657,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.0/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/2.1.1/operators.html b/docs-archive/apache-airflow-providers-apache-spark/2.1.1/operators.html
index c0c6578387..48681a7f4b 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/2.1.1/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/2.1.1/operators.html
@@ -601,7 +601,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using the <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, it is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.1/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -639,7 +639,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.1/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -657,7 +657,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.1/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/2.1.2/operators.html b/docs-archive/apache-airflow-providers-apache-spark/2.1.2/operators.html
index d84bc96be9..5cbc5b7d9c 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/2.1.2/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/2.1.2/operators.html
@@ -601,7 +601,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using the <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, it is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.2/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -639,7 +639,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.2/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -657,7 +657,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.2/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-apache-spark/2.1.3/operators.html b/docs-archive/apache-airflow-providers-apache-spark/2.1.3/operators.html
index 4e0c4f4e54..eda4495b74 100644
--- a/docs-archive/apache-airflow-providers-apache-spark/2.1.3/operators.html
+++ b/docs-archive/apache-airflow-providers-apache-spark/2.1.3/operators.html
@@ -603,7 +603,7 @@
 <h3>Using the operator<a class="headerlink" href="#using-the-operator" title="Permalink to this headline">¶</a></h3>
 <p>Using the <code class="docutils literal notranslate"><span class="pre">cmd_type</span></code> parameter, it is possible to transfer data from Spark to a database (<code class="docutils literal notranslate"><span class="pre">spark_to_jdbc</span></code>) or from a database to Spark (<code class="docutils literal notranslate"><span class="pre">jdbc_to_spark</span></code>), which will write the table using the Spark command <code class="docutils literal notranslate"><span class="pre">saveAsTable [...]
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.3/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">jdbc_to_spark_job</span> <span class="o">=</span> <span class="n">SparkJDBCOperator</span><span class="p">(</span>
     <span class="n">cmd_type</span><span class="o">=</span><span class="s1">&#39;jdbc_to_spark&#39;</span><span class="p">,</span>
     <span class="n">jdbc_table</span><span class="o">=</span><span class="s2">&quot;foo&quot;</span><span class="p">,</span>
@@ -641,7 +641,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id1">
 <h3>Using the operator<a class="headerlink" href="#id1" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.3/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">sql_job</span> <span class="o">=</span> <span class="n">SparkSqlOperator</span><span class="p">(</span><span class="n">sql</span><span class="o">=</span><span class="s2">&quot;SELECT * FROM bar&quot;</span><span class="p">,</span> <span class="n">master</span><span class="o">=</span><span class="s2">&quot;local&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class=" [...]
 </pre></div>
 </div>
@@ -659,7 +659,7 @@ The operator will run the SQL query on Spark Hive metastore service, the <code c
 <div class="section" id="id3">
 <h3>Using the operator<a class="headerlink" href="#id3" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/apache/spark/example_dags/example_spark_dag.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/apache/spark/example_dags/example_spark_dag.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-apache-spark/2.1.3/airflow/providers/apache/spark/example_dags/example_spark_dag.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">submit_job</span> <span class="o">=</span> <span class="n">SparkSubmitOperator</span><span class="p">(</span>
     <span class="n">application</span><span class="o">=</span><span class="s2">&quot;$</span><span class="si">{SPARK_HOME}</span><span class="s2">/examples/src/main/python/pi.py&quot;</span><span class="p">,</span> <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;submit_job&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-arangodb/1.0.0/operators/index.html b/docs-archive/apache-airflow-providers-arangodb/1.0.0/operators/index.html
index b399d53712..f2635f84bb 100644
--- a/docs-archive/apache-airflow-providers-arangodb/1.0.0/operators/index.html
+++ b/docs-archive/apache-airflow-providers-arangodb/1.0.0/operators/index.html
@@ -605,7 +605,7 @@ AQL query in <a class="reference external" href="https://www.arangodb.com/">Aran
 further process the result using <strong>result_processor</strong> Callable as you like.</p>
 <p>An example of Listing all Documents in <strong>students</strong> collection can be implemented as following:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/arangodb/example_dags/example_arangodb.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/arangodb/example_dags/example_arangodb.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/arangodb/example_dags/example_arangodb.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-arangodb/1.0.0/airflow/providers/arangodb/example_dags/example_arangodb.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">operator</span> <span class="o">=</span> <span class="n">AQLOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;aql_operator&#39;</span><span class="p">,</span>
@@ -620,7 +620,7 @@ further process the result using <strong>result_processor</strong> Callable as y
 <p>You can also provide file template (.sql) to load query, remember path is relative to <strong>dags/</strong> folder, if you want to provide any other path
 please provide <strong>template_searchpath</strong> while creating <strong>DAG</strong> object,</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/arangodb/example_dags/example_arangodb.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/arangodb/example_dags/example_arangodb.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/arangodb/example_dags/example_arangodb.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-arangodb/1.0.0/airflow/providers/arangodb/example_dags/example_arangodb.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">operator</span> <span class="o">=</span> <span class="n">AQLOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;aql_operator_template_file&#39;</span><span class="p">,</span>
@@ -639,7 +639,7 @@ please provide <strong>template_searchpath</strong> while creating <strong>DAG</
 AQL query in <a class="reference external" href="https://www.arangodb.com/">ArangoDB</a>.</p>
 <p>An example for waiting a document in <strong>students</strong> collection with student name <strong>judy</strong> can be implemented as following:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/arangodb/example_dags/example_arangodb.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/arangodb/example_dags/example_arangodb.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/arangodb/example_dags/example_arangodb.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-arangodb/1.0.0/airflow/providers/arangodb/example_dags/example_arangodb.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">sensor</span> <span class="o">=</span> <span class="n">AQLSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;aql_sensor&quot;</span><span class="p">,</span>
@@ -654,7 +654,7 @@ AQL query in <a class="reference external" href="https://www.arangodb.com/">Aran
 </div>
 <p>Similar to <strong>AQLOperator</strong>, You can also provide file template to load query -</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/arangodb/example_dags/example_arangodb.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/arangodb/example_dags/example_arangodb.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/arangodb/example_dags/example_arangodb.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-arangodb/1.0.0/airflow/providers/arangodb/example_dags/example_arangodb.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">sensor</span> <span class="o">=</span> <span class="n">AQLSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;aql_sensor_template_file&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.0/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.0/operators.html
index 976c3bd58c..946ac38c86 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.0/operators.html
@@ -623,7 +623,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -719,7 +719,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -745,7 +745,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.1/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.1/operators.html
index d07161eb1a..e47eeb17e8 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.1/operators.html
@@ -633,7 +633,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -729,7 +729,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -755,7 +755,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.2/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.2/operators.html
index d07161eb1a..12ee915cc0 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/1.0.2/operators.html
@@ -633,7 +633,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -729,7 +729,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -755,7 +755,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/1.1.0/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/1.1.0/operators.html
index f2dffe0911..eb40377448 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/1.1.0/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/1.1.0/operators.html
@@ -633,7 +633,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.1.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -729,7 +729,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.1.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -755,7 +755,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.1.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/1.2.0/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/1.2.0/operators.html
index f2dffe0911..a08713e0ad 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/1.2.0/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/1.2.0/operators.html
@@ -633,7 +633,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.2.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -729,7 +729,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.2.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -755,7 +755,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.2.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
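The change repeated in these hunks is mechanical: the old relative `_modules/....html` viewcode href is rewritten to a GitHub tree URL pinned to the provider's release tag. A minimal sketch of that mapping (hypothetical helper name; the actual scripts live in the PR referenced in the commit message, apache/airflow#24389):

```python
# Sketch of the "View Source" link rewrite performed by this commit.
# `github_source_url` is a hypothetical helper, not the real script.

def github_source_url(provider: str, version: str, module_href: str) -> str:
    """Map an old '_modules/....html' href to a versioned GitHub URL.

    provider:    provider id as it appears in release tags, e.g. 'cncf-kubernetes'
    version:     provider release, e.g. '1.2.0'
    module_href: old relative href, e.g.
        '_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html'
    """
    # Drop the '_modules/' prefix and swap the '.html' suffix for '.py'
    # to recover the real source path inside the apache/airflow repo.
    path = module_href.removeprefix("_modules/").removesuffix(".html") + ".py"
    # Pin the link to the provider's release tag, e.g.
    # 'providers-cncf-kubernetes/1.2.0', so archived docs point at the
    # exact source that shipped with that provider version.
    return (
        f"https://github.com/apache/airflow/tree/"
        f"providers-{provider}/{version}/{path}"
    )
```

For the 1.2.0 hunk above, this reproduces the new href `https://github.com/apache/airflow/tree/providers-cncf-kubernetes/1.2.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py`.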
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.0/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.0/operators.html
index 3d5f4bd09e..1564e2be0c 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.0/operators.html
@@ -633,7 +633,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -729,7 +729,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -755,7 +755,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.1/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.1/operators.html
index 728ac7b5ff..108ed6dd54 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.1/operators.html
@@ -632,7 +632,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -728,7 +728,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -754,7 +754,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.2/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.2/operators.html
index e7243dcfbf..0fbc933a55 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.2/operators.html
@@ -632,7 +632,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -728,7 +728,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -754,7 +754,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
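The archived docs quoted in these hunks describe the KubernetesPodOperator XCom contract: the pod writes its return value as JSON to `/airflow/xcom/return.json`, and an Airflow sidecar container reads it back. A small stand-alone simulation of that contract (a temp directory stands in for the real `/airflow/xcom` mount; function names here are illustrative, not Airflow APIs):

```python
import json
import tempfile
from pathlib import Path

# Simulation of the XCom sidecar contract described in the docs above:
# the pod's main container writes JSON to <xcom_dir>/return.json, and
# the sidecar reads it back before the pod is torn down.

def write_xcom_value(xcom_dir: Path, value) -> None:
    # What the pod's main container does at the end of its run.
    (xcom_dir / "return.json").write_text(json.dumps(value))

def read_xcom_value(xcom_dir: Path):
    # What the Airflow sidecar does to collect the XCom result.
    return json.loads((xcom_dir / "return.json").read_text())

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        xcom_dir = Path(d)  # stands in for /airflow/xcom
        write_xcom_value(xcom_dir, [1, 2, 3])
        print(read_xcom_value(xcom_dir))  # → [1, 2, 3]
```

In the real operator this exchange only happens when `do_xcom_push=True`, which is what makes Airflow attach the sidecar alongside the pod.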
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.3/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.3/operators.html
index e45eca7a8a..d9b20b97a8 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.3/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.0.3/operators.html
@@ -636,7 +636,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.3/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -732,7 +732,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.3/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -758,7 +758,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.0.3/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.1.0/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.1.0/operators.html
index 39fe7f0b13..266cd895b0 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.1.0/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.1.0/operators.html
@@ -626,7 +626,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.1.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -722,7 +722,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.1.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -748,7 +748,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.1.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.2.0/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.2.0/operators.html
index 5a209f813e..8e60fae815 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/2.2.0/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/2.2.0/operators.html
@@ -626,7 +626,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.2.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -722,7 +722,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.2.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -748,7 +748,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/2.2.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.0/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.0/operators.html
index 392a39aefc..bc8f31c7e0 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.0/operators.html
@@ -649,7 +649,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -745,7 +745,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -771,7 +771,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.1/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.1/operators.html
index 4af5ec5ae9..f452f0e5b8 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.1/operators.html
@@ -649,7 +649,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -745,7 +745,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -771,7 +771,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.2/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.2/operators.html
index 3651b17b44..1336ef4362 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.0.2/operators.html
@@ -649,7 +649,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -745,7 +745,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -771,7 +771,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.0/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.0/operators.html
index 5cdc2490bc..f4e040ec6b 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.0/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.0/operators.html
@@ -649,7 +649,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.1.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -745,7 +745,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.1.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -771,7 +771,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.1.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.1/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.1/operators.html
index 20340555d4..5fea26e1d1 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.1/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.1/operators.html
@@ -649,7 +649,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.1.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -745,7 +745,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.1.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -771,7 +771,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.1.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.2/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.2/operators.html
index 379ca43dcc..bf52f08bd9 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.2/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/3.1.2/operators.html
@@ -649,7 +649,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.1.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -745,7 +745,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.1.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -771,7 +771,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/3.1.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.0/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.0/operators.html
index b4ac7b9179..5b0de45e9e 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.0/operators.html
@@ -649,7 +649,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/4.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -745,7 +745,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/4.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -771,7 +771,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/4.0.0/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.1/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.1/operators.html
index 9d80b2b3e1..131b80b2df 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.1/operators.html
@@ -651,7 +651,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/4.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -747,7 +747,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/4.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -773,7 +773,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/4.0.1/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.2/operators.html b/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.2/operators.html
index e44cccd7a3..82821d9e30 100644
--- a/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-cncf-kubernetes/4.0.2/operators.html
@@ -651,7 +651,7 @@ Using this method will ensure correctness
 and type safety. While we have removed almost all Kubernetes convenience classes, we have kept the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">Secret</span></code> class to simplify the process of generating secret volumes/env variables.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/4.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">secret_file</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;volume&#39;</span><span class="p">,</span> <span class="s1">&#39;/etc/sql_conn&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_env</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="s1">&#39;SQL_CONN&#39;</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets&#39;</span><span class="p">,</span> <span class="s1">&#39;sql_alchemy_conn&#39;</span><span class="p">)</span>
 <span class="n">secret_all_keys</span> <span class="o">=</span> <span class="n">Secret</span><span class="p">(</span><span class="s1">&#39;env&#39;</span><span class="p">,</span> <span class="kc">None</span><span class="p">,</span> <span class="s1">&#39;airflow-secrets-2&#39;</span><span class="p">)</span>
@@ -747,7 +747,7 @@ specified in the <code class="docutils literal notranslate"><span class="pre">im
 </div>
 <p>Then use it in your pod like so:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/4.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">quay_k8s</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;quay.io/apache/bash&#39;</span><span class="p">,</span>
@@ -773,7 +773,7 @@ from your Pod you must specify the <code class="docutils literal notranslate"><s
 alongside the Pod. The Pod must write the XCom value into this location at the <code class="docutils literal notranslate"><span class="pre">/airflow/xcom/return.json</span></code> path.</p>
 <p>See the following example on how this occurs:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-cncf-kubernetes/4.0.2/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="n">write_xcom</span> <span class="o">=</span> <span class="n">KubernetesPodOperator</span><span class="p">(</span>
         <span class="n">namespace</span><span class="o">=</span><span class="s1">&#39;default&#39;</span><span class="p">,</span>
         <span class="n">image</span><span class="o">=</span><span class="s1">&#39;alpine&#39;</span><span class="p">,</span>
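Every hunk above applies the same substitution: a relative `_modules/<module>.html` viewcode link becomes an absolute GitHub link pinned to the provider release tag taken from the docs-archive path (e.g. `providers-cncf-kubernetes/4.0.2`), with `target="_blank"` added. A minimal sketch of that rewrite — the function name and signature are illustrative, not the actual script from apache/airflow#24389:

```python
import re

GITHUB_TREE = "https://github.com/apache/airflow/tree"

def fix_source_link(html: str, provider: str, version: str) -> str:
    """Rewrite relative `_modules/...html` example links into versioned
    GitHub links, mirroring the substitution visible in this diff.

    `provider` and `version` are assumed to come from the docs-archive
    directory name, e.g. ("cncf-kubernetes", "4.0.2").
    """
    def repl(match: re.Match) -> str:
        # e.g. airflow/providers/cncf/kubernetes/example_dags/example_kubernetes
        module_path = match.group(1)
        return (f'href="{GITHUB_TREE}/providers-{provider}/{version}/'
                f'{module_path}.py" target="_blank"')

    return re.sub(r'href="_modules/([^"]+)\.html"', repl, html)
```

Run over each archived `operators.html`, this turns the stale in-docs viewcode reference into a link to the example DAG source as it existed at that provider release.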
diff --git a/docs-archive/apache-airflow-providers-databricks/1.0.0/operators.html b/docs-archive/apache-airflow-providers-databricks/1.0.0/operators.html
index a2bdec4bb9..b059c66e31 100644
--- a/docs-archive/apache-airflow-providers-databricks/1.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-databricks/1.0.0/operators.html
@@ -638,7 +638,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </table>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/1.0.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">notebook_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;notebook_task&#39;</span><span class="p">,</span> <span class="n">json</span><span class="o">=</span><span class="n">notebook_task_params</span><span class="p">)</span>
 </pre></div>
@@ -646,7 +646,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </div>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/1.0.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/1.0.1/operators.html b/docs-archive/apache-airflow-providers-databricks/1.0.1/operators.html
index 5bf9760c84..09bdde3c00 100644
--- a/docs-archive/apache-airflow-providers-databricks/1.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-databricks/1.0.1/operators.html
@@ -650,7 +650,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </table>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/1.0.1/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">notebook_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;notebook_task&#39;</span><span class="p">,</span> <span class="n">json</span><span class="o">=</span><span class="n">notebook_task_params</span><span class="p">)</span>
 </pre></div>
@@ -658,7 +658,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </div>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/1.0.1/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.0.0/operators.html b/docs-archive/apache-airflow-providers-databricks/2.0.0/operators.html
index 6675a6a2c4..9beaa3bf29 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.0.0/operators.html
@@ -649,7 +649,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </table>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.0.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">notebook_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;notebook_task&#39;</span><span class="p">,</span> <span class="n">json</span><span class="o">=</span><span class="n">notebook_task_params</span><span class="p">)</span>
 </pre></div>
@@ -657,7 +657,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </div>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.0.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.0.1/operators.html b/docs-archive/apache-airflow-providers-databricks/2.0.1/operators.html
index c6bb61939a..b8003fe368 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.0.1/operators.html
@@ -649,7 +649,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </table>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.0.1/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">notebook_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;notebook_task&#39;</span><span class="p">,</span> <span class="n">json</span><span class="o">=</span><span class="n">notebook_task_params</span><span class="p">)</span>
 </pre></div>
@@ -657,7 +657,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </div>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.0.1/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.0.2/operators.html b/docs-archive/apache-airflow-providers-databricks/2.0.2/operators.html
index cc0cb8d1e9..3d8098e07e 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.0.2/operators.html
@@ -651,7 +651,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </table>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.0.2/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">notebook_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;notebook_task&#39;</span><span class="p">,</span> <span class="n">json</span><span class="o">=</span><span class="n">notebook_task_params</span><span class="p">)</span>
 </pre></div>
@@ -659,7 +659,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </div>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.0.2/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.1.0/operators.html b/docs-archive/apache-airflow-providers-databricks/2.1.0/operators.html
index 47dfa51ebf..5ff95c8bfa 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.1.0/operators.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.1.0/operators.html
@@ -659,7 +659,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </table>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.1.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">notebook_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;notebook_task&#39;</span><span class="p">,</span> <span class="n">json</span><span class="o">=</span><span class="n">notebook_task_params</span><span class="p">)</span>
 </pre></div>
@@ -667,7 +667,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </div>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.1.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.2.0/operators.html b/docs-archive/apache-airflow-providers-databricks/2.2.0/operators.html
index dfc9947e98..24b5d54830 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.2.0/operators.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.2.0/operators.html
@@ -659,7 +659,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </table>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.2.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">notebook_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;notebook_task&#39;</span><span class="p">,</span> <span class="n">json</span><span class="o">=</span><span class="n">notebook_task_params</span><span class="p">)</span>
 </pre></div>
@@ -667,7 +667,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 </div>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.2.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/copy_into.html b/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/copy_into.html
index e77e900a05..aaa21ff627 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/copy_into.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/copy_into.html
@@ -652,7 +652,7 @@ command.</p>
 <h3>Importing CSV data<a class="headerlink" href="#importing-csv-data" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksCopyIntoOperator to import CSV data into a table is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.3.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of importing data using COPY_INTO SQL command</span>
     <span class="n">import_csv</span> <span class="o">=</span> <span class="n">DatabricksCopyIntoOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;import_csv&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/sql.html b/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/sql.html
index 559b12b22b..d13b47e630 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/sql.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/sql.html
@@ -648,7 +648,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Selecting data<a class="headerlink" href="#selecting-data" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to select data from a table is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.3.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data.</span>
     <span class="n">select</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -664,7 +664,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Selecting data into a file<a class="headerlink" href="#selecting-data-into-a-file" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to select data from a table and store in a file is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.3.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data into a file with JSONL format.</span>
     <span class="n">select_into_file</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -682,7 +682,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Executing multiple statements<a class="headerlink" href="#executing-multiple-statements" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to perform multiple SQL statements is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.3.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to perform multiple operations.</span>
     <span class="n">create</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -702,7 +702,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Executing multiple statements from a file<a class="headerlink" href="#executing-multiple-statements-from-a-file" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to perform statements from a file is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.3.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data.</span>
     <span class="c1"># SQL statements should be in the file with name test.sql</span>
     <span class="n">create_file</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/submit_run.html b/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/submit_run.html
index 34acd68479..134a7b0d0c 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/submit_run.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.3.0/operators/submit_run.html
@@ -663,7 +663,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 <h3>Specifying parameters as JSON<a class="headerlink" href="#specifying-parameters-as-json" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.3.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">new_cluster</span> <span class="o">=</span> <span class="p">{</span>
         <span class="s1">&#39;spark_version&#39;</span><span class="p">:</span> <span class="s1">&#39;9.1.x-scala2.12&#39;</span><span class="p">,</span>
@@ -688,7 +688,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 <h3>Using named parameters<a class="headerlink" href="#using-named-parameters" title="Permalink to this headline">¶</a></h3>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.3.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/copy_into.html b/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/copy_into.html
index 6d5312cf05..2a83e2b0fc 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/copy_into.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/copy_into.html
@@ -661,7 +661,7 @@ command.</p>
 <h3>Importing CSV data<a class="headerlink" href="#importing-csv-data" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksCopyIntoOperator to import CSV data into a table is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.4.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of importing data using COPY_INTO SQL command</span>
     <span class="n">import_csv</span> <span class="o">=</span> <span class="n">DatabricksCopyIntoOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;import_csv&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/sql.html b/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/sql.html
index 582fbb017d..e01ff1efaa 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/sql.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/sql.html
@@ -648,7 +648,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Selecting data<a class="headerlink" href="#selecting-data" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to select data from a table is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.4.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data.</span>
     <span class="n">select</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -664,7 +664,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Selecting data into a file<a class="headerlink" href="#selecting-data-into-a-file" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to select data from a table and store in a file is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.4.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data into a file with JSONL format.</span>
     <span class="n">select_into_file</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -682,7 +682,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Executing multiple statements<a class="headerlink" href="#executing-multiple-statements" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to perform multiple SQL statements is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.4.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to perform multiple operations.</span>
     <span class="n">create</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -702,7 +702,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Executing multiple statements from a file<a class="headerlink" href="#executing-multiple-statements-from-a-file" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to perform statements from a file is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.4.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data.</span>
     <span class="c1"># SQL statements should be in the file with name test.sql</span>
     <span class="n">create_file</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/submit_run.html b/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/submit_run.html
index 26fb27588c..542c32b3d4 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/submit_run.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.4.0/operators/submit_run.html
@@ -666,7 +666,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 <h3>Specifying parameters as JSON<a class="headerlink" href="#specifying-parameters-as-json" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.4.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">new_cluster</span> <span class="o">=</span> <span class="p">{</span>
         <span class="s1">&#39;spark_version&#39;</span><span class="p">:</span> <span class="s1">&#39;9.1.x-scala2.12&#39;</span><span class="p">,</span>
@@ -691,7 +691,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 <h3>Using named parameters<a class="headerlink" href="#using-named-parameters" title="Permalink to this headline">¶</a></h3>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.4.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/copy_into.html b/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/copy_into.html
index 1dd3b1a609..a905bcc4ee 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/copy_into.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/copy_into.html
@@ -661,7 +661,7 @@ command.</p>
 <h3>Importing CSV data<a class="headerlink" href="#importing-csv-data" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksCopyIntoOperator to import CSV data into a table is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.5.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of importing data using COPY_INTO SQL command</span>
     <span class="n">import_csv</span> <span class="o">=</span> <span class="n">DatabricksCopyIntoOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;import_csv&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/repos_update.html b/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/repos_update.html
index 51c5198e51..380f068754 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/repos_update.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/repos_update.html
@@ -639,7 +639,7 @@ To use this operator you need to provide either <code class="docutils literal no
 <h2>Updating Databricks Repo by specifying path<a class="headerlink" href="#updating-databricks-repo-by-specifying-path" title="Permalink to this headline">¶</a></h2>
 <p>An example usage of the DatabricksReposUpdateOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_repos.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.5.0/airflow/providers/databricks/example_dags/example_databricks_repos.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of updating a Databricks Repo to the latest code</span>
     <span class="n">repo_path</span> <span class="o">=</span> <span class="s2">&quot;/Repos/user@domain.com/demo-repo&quot;</span>
     <span class="n">update_repo</span> <span class="o">=</span> <span class="n">DatabricksReposUpdateOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;update_repo&#39;</span><span class="p">,</span> <span class="n">repo_path</span><span class="o">=</span><span class="n">repo_path</span><span class="p">,</span> <span class="n">branch</span><span class="o">=</span><span class="s2">&quot;releases&quot;</span><span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/sql.html b/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/sql.html
index cc864d271b..bdc919dfdf 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/sql.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/sql.html
@@ -648,7 +648,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Selecting data<a class="headerlink" href="#selecting-data" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to select data from a table is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.5.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data.</span>
     <span class="n">select</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -664,7 +664,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Selecting data into a file<a class="headerlink" href="#selecting-data-into-a-file" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to select data from a table and store in a file is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.5.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data into a file with JSONL format.</span>
     <span class="n">select_into_file</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -682,7 +682,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Executing multiple statements<a class="headerlink" href="#executing-multiple-statements" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to perform multiple SQL statements is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.5.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to perform multiple operations.</span>
     <span class="n">create</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -702,7 +702,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Executing multiple statements from a file<a class="headerlink" href="#executing-multiple-statements-from-a-file" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to perform statements from a file is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.5.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data.</span>
     <span class="c1"># SQL statements should be in the file with name test.sql</span>
     <span class="n">create_file</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/submit_run.html b/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/submit_run.html
index 8ab2ccc43e..bcbbd587d9 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/submit_run.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.5.0/operators/submit_run.html
@@ -666,7 +666,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 <h3>Specifying parameters as JSON<a class="headerlink" href="#specifying-parameters-as-json" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.5.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">new_cluster</span> <span class="o">=</span> <span class="p">{</span>
         <span class="s1">&#39;spark_version&#39;</span><span class="p">:</span> <span class="s1">&#39;9.1.x-scala2.12&#39;</span><span class="p">,</span>
@@ -691,7 +691,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 <h3>Using named parameters<a class="headerlink" href="#using-named-parameters" title="Permalink to this headline">¶</a></h3>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.5.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/copy_into.html b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/copy_into.html
index 4ec890ebe2..198d54b88f 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/copy_into.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/copy_into.html
@@ -653,7 +653,7 @@ command.</p>
 <h3>Importing CSV data<a class="headerlink" href="#importing-csv-data" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksCopyIntoOperator to import CSV data into a table is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.6.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of importing data using COPY_INTO SQL command</span>
     <span class="n">import_csv</span> <span class="o">=</span> <span class="n">DatabricksCopyIntoOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;import_csv&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_create.html b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_create.html
index 7668669368..271835c0e1 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_create.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_create.html
@@ -636,7 +636,7 @@ via <a class="reference external" href="https://docs.databricks.com/dev-tools/ap
 <h2>Create a Databricks Repo<a class="headerlink" href="#create-a-databricks-repo" title="Permalink to this headline">¶</a></h2>
 <p>An example usage of the DatabricksReposCreateOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_repos.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.6.0/airflow/providers/databricks/example_dags/example_databricks_repos.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of creating a Databricks Repo</span>
     <span class="n">repo_path</span> <span class="o">=</span> <span class="s2">&quot;/Repos/user@domain.com/demo-repo&quot;</span>
     <span class="n">git_url</span> <span class="o">=</span> <span class="s2">&quot;https://github.com/test/test&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_delete.html b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_delete.html
index cbd5065173..8ce6f75e7e 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_delete.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_delete.html
@@ -624,7 +624,7 @@ via <a class="reference external" href="https://docs.databricks.com/dev-tools/ap
 <h2>Deleting Databricks Repo by specifying path<a class="headerlink" href="#deleting-databricks-repo-by-specifying-path" title="Permalink to this headline">¶</a></h2>
 <p>An example usage of the DatabricksReposDeleteOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_repos.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.6.0/airflow/providers/databricks/example_dags/example_databricks_repos.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of deleting a Databricks Repo</span>
     <span class="n">repo_path</span> <span class="o">=</span> <span class="s2">&quot;/Repos/user@domain.com/demo-repo&quot;</span>
     <span class="n">delete_repo</span> <span class="o">=</span> <span class="n">DatabricksReposDeleteOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_repo&#39;</span><span class="p">,</span> <span class="n">repo_path</span><span class="o">=</span><span class="n">repo_path</span><span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_update.html b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_update.html
index 69f06133db..a695a52560 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_update.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/repos_update.html
@@ -631,7 +631,7 @@ To use this operator you need to provide either <code class="docutils literal no
 <h2>Updating Databricks Repo by specifying path<a class="headerlink" href="#updating-databricks-repo-by-specifying-path" title="Permalink to this headline">¶</a></h2>
 <p>An example usage of the DatabricksReposUpdateOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_repos.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.6.0/airflow/providers/databricks/example_dags/example_databricks_repos.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of updating a Databricks Repo to the latest code</span>
     <span class="n">repo_path</span> <span class="o">=</span> <span class="s2">&quot;/Repos/user@domain.com/demo-repo&quot;</span>
     <span class="n">update_repo</span> <span class="o">=</span> <span class="n">DatabricksReposUpdateOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;update_repo&#39;</span><span class="p">,</span> <span class="n">repo_path</span><span class="o">=</span><span class="n">repo_path</span><span class="p">,</span> <span class="n">branch</span><span class="o">=</span><span class="s2">&quot;releases&quot;</span><span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/sql.html b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/sql.html
index f098f7387d..38df03fa78 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/sql.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/sql.html
@@ -640,7 +640,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Selecting data<a class="headerlink" href="#selecting-data" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to select data from a table is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.6.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data.</span>
     <span class="n">select</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -656,7 +656,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Selecting data into a file<a class="headerlink" href="#selecting-data-into-a-file" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to select data from a table and store in a file is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.6.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data into a file with JSONL format.</span>
     <span class="n">select_into_file</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -674,7 +674,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Executing multiple statements<a class="headerlink" href="#executing-multiple-statements" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to perform multiple SQL statements is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.6.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to perform multiple operations.</span>
     <span class="n">create</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -694,7 +694,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Executing multiple statements from a file<a class="headerlink" href="#executing-multiple-statements-from-a-file" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to perform statements from a file is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.6.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data.</span>
     <span class="c1"># SQL statements should be in the file with name test.sql</span>
     <span class="n">create_file</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/submit_run.html b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/submit_run.html
index 7afb6d0004..e57f8960af 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/submit_run.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.6.0/operators/submit_run.html
@@ -658,7 +658,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 <h3>Specifying parameters as JSON<a class="headerlink" href="#specifying-parameters-as-json" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.6.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">new_cluster</span> <span class="o">=</span> <span class="p">{</span>
         <span class="s1">&#39;spark_version&#39;</span><span class="p">:</span> <span class="s1">&#39;9.1.x-scala2.12&#39;</span><span class="p">,</span>
@@ -683,7 +683,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 <h3>Using named parameters<a class="headerlink" href="#using-named-parameters" title="Permalink to this headline">¶</a></h3>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.6.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/copy_into.html b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/copy_into.html
index ff57551aea..c5cdcc64ce 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/copy_into.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/copy_into.html
@@ -612,7 +612,7 @@ command.</p>
 <h3>Importing CSV data<a class="headerlink" href="#importing-csv-data" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksCopyIntoOperator to import CSV data into a table is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.7.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of importing data using COPY_INTO SQL command</span>
     <span class="n">import_csv</span> <span class="o">=</span> <span class="n">DatabricksCopyIntoOperator</span><span class="p">(</span>
         <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;import_csv&#39;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_create.html b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_create.html
index c22cce5f09..95a68aab06 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_create.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_create.html
@@ -646,7 +646,7 @@ via <a class="reference external" href="https://docs.databricks.com/dev-tools/ap
 <h2>Create a Databricks Repo<a class="headerlink" href="#create-a-databricks-repo" title="Permalink to this headline">¶</a></h2>
 <p>An example usage of the DatabricksReposCreateOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_repos.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.7.0/airflow/providers/databricks/example_dags/example_databricks_repos.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of creating a Databricks Repo</span>
     <span class="n">repo_path</span> <span class="o">=</span> <span class="s2">&quot;/Repos/user@domain.com/demo-repo&quot;</span>
     <span class="n">git_url</span> <span class="o">=</span> <span class="s2">&quot;https://github.com/test/test&quot;</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_delete.html b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_delete.html
index 5376bc4036..7d79678187 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_delete.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_delete.html
@@ -634,7 +634,7 @@ via <a class="reference external" href="https://docs.databricks.com/dev-tools/ap
 <h2>Deleting Databricks Repo by specifying path<a class="headerlink" href="#deleting-databricks-repo-by-specifying-path" title="Permalink to this headline">¶</a></h2>
 <p>An example usage of the DatabricksReposDeleteOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_repos.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.7.0/airflow/providers/databricks/example_dags/example_databricks_repos.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of deleting a Databricks Repo</span>
     <span class="n">repo_path</span> <span class="o">=</span> <span class="s2">&quot;/Repos/user@domain.com/demo-repo&quot;</span>
     <span class="n">delete_repo</span> <span class="o">=</span> <span class="n">DatabricksReposDeleteOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;delete_repo&#39;</span><span class="p">,</span> <span class="n">repo_path</span><span class="o">=</span><span class="n">repo_path</span><span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_update.html b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_update.html
index a4136bac0d..ab82ccc9a8 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_update.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/repos_update.html
@@ -641,7 +641,7 @@ To use this operator you need to provide either <code class="docutils literal no
 <h2>Updating Databricks Repo by specifying path<a class="headerlink" href="#updating-databricks-repo-by-specifying-path" title="Permalink to this headline">¶</a></h2>
 <p>An example usage of the DatabricksReposUpdateOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_repos.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_repos.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.7.0/airflow/providers/databricks/example_dags/example_databricks_repos.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of updating a Databricks Repo to the latest code</span>
     <span class="n">repo_path</span> <span class="o">=</span> <span class="s2">&quot;/Repos/user@domain.com/demo-repo&quot;</span>
     <span class="n">update_repo</span> <span class="o">=</span> <span class="n">DatabricksReposUpdateOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;update_repo&#39;</span><span class="p">,</span> <span class="n">repo_path</span><span class="o">=</span><span class="n">repo_path</span><span class="p">,</span> <span class="n">branch</span><span class="o">=</span><span class="s2">&quot;releases&quot;</span><span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/sql.html b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/sql.html
index 7dee1e5f52..d1abecda6e 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/sql.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/sql.html
@@ -616,7 +616,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Selecting data<a class="headerlink" href="#selecting-data" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to select data from a table is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.7.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data.</span>
     <span class="n">select</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -632,7 +632,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Selecting data into a file<a class="headerlink" href="#selecting-data-into-a-file" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to select data from a table and store in a file is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.7.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data into a file with JSONL format.</span>
     <span class="n">select_into_file</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -650,7 +650,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Executing multiple statements<a class="headerlink" href="#executing-multiple-statements" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to perform multiple SQL statements is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.7.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to perform multiple operations.</span>
     <span class="n">create</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
         <span class="n">databricks_conn_id</span><span class="o">=</span><span class="n">connection_id</span><span class="p">,</span>
@@ -670,7 +670,7 @@ on a <a class="reference external" href="https://docs.databricks.com/sql/admin/s
 <h3>Executing multiple statements from a file<a class="headerlink" href="#executing-multiple-statements-from-a-file" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSqlOperator to perform statements from a file is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks_sql.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks_sql.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.7.0/airflow/providers/databricks/example_dags/example_databricks_sql.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the Databricks SQL Operator to select data.</span>
     <span class="c1"># SQL statements should be in the file with name test.sql</span>
     <span class="n">create_file</span> <span class="o">=</span> <span class="n">DatabricksSqlOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/submit_run.html b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/submit_run.html
index 838166c693..9440515a47 100644
--- a/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/submit_run.html
+++ b/docs-archive/apache-airflow-providers-databricks/2.7.0/operators/submit_run.html
@@ -624,7 +624,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 <h3>Specifying parameters as JSON<a class="headerlink" href="#specifying-parameters-as-json" title="Permalink to this headline">¶</a></h3>
 <p>An example usage of the DatabricksSubmitRunOperator is as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.7.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the JSON parameter to initialize the operator.</span>
     <span class="n">new_cluster</span> <span class="o">=</span> <span class="p">{</span>
         <span class="s1">&#39;spark_version&#39;</span><span class="p">:</span> <span class="s1">&#39;9.1.x-scala2.12&#39;</span><span class="p">,</span>
@@ -649,7 +649,7 @@ one named parameter for each top level parameter in the <code class="docutils li
 <h3>Using named parameters<a class="headerlink" href="#using-named-parameters" title="Permalink to this headline">¶</a></h3>
 <p>You can also use named parameters to initialize the operator and run the job.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/databricks/example_dags/example_databricks.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/databricks/example_dags/example_databricks.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-databricks/2.7.0/airflow/providers/databricks/example_dags/example_databricks.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>    <span class="c1"># Example of using the named parameters of DatabricksSubmitRunOperator</span>
     <span class="c1"># to initialize the operator.</span>
     <span class="n">spark_jar_task</span> <span class="o">=</span> <span class="n">DatabricksSubmitRunOperator</span><span class="p">(</span>
diff --git a/docs-archive/apache-airflow-providers-dbt-cloud/1.0.1/operators.html b/docs-archive/apache-airflow-providers-dbt-cloud/1.0.1/operators.html
index adf4d7b13a..ef98aa85e0 100644
--- a/docs-archive/apache-airflow-providers-dbt-cloud/1.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-dbt-cloud/1.0.1/operators.html
@@ -617,7 +617,7 @@ configurations or overrides for the job run such as <code class="docutils litera
 asynchronous waiting for run termination, respectively. To note, the <code class="docutils literal notranslate"><span class="pre">account_id</span></code> for the operators is
 referenced within the <code class="docutils literal notranslate"><span class="pre">default_args</span></code> of the example DAG.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dbt-cloud/1.0.1/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">trigger_job_run1</span> <span class="o">=</span> <span class="n">DbtCloudRunJobOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;trigger_job_run1&quot;</span><span class="p">,</span>
     <span class="n">job_id</span><span class="o">=</span><span class="mi">48617</span><span class="p">,</span>
@@ -630,7 +630,7 @@ referenced within the <code class="docutils literal notranslate"><span class="pr
 <p>This next example also shows how to pass in custom runtime configuration (in this case for <code class="docutils literal notranslate"><span class="pre">threads_override</span></code>)
 via the <code class="docutils literal notranslate"><span class="pre">additional_run_config</span></code> dictionary.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dbt-cloud/1.0.1/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">trigger_job_run2</span> <span class="o">=</span> <span class="n">DbtCloudRunJobOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;trigger_job_run2&quot;</span><span class="p">,</span>
     <span class="n">job_id</span><span class="o">=</span><span class="mi">48617</span><span class="p">,</span>
@@ -650,7 +650,7 @@ functionality available with the <a class="reference external" href="/docs/apach
 DbtCloudRunJobOperator task by utilizing the <code class="docutils literal notranslate"><span class="pre">.output</span></code> property exposed for all operators. Also, to note,
 the <code class="docutils literal notranslate"><span class="pre">account_id</span></code> for the task is referenced within the <code class="docutils literal notranslate"><span class="pre">default_args</span></code> of the example DAG.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dbt-cloud/1.0.1/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_run_sensor</span> <span class="o">=</span> <span class="n">DbtCloudJobRunSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;job_run_sensor&quot;</span><span class="p">,</span> <span class="n">run_id</span><span class="o">=</span><span class="n">trigger_job_run2</span><span class="o">.</span><span class="n">output</span><span class="p">,</span> <span class="n">timeout</span><span class="o">=</span><span class="mi">20</span>
 <span class="p">)</span>
@@ -668,7 +668,7 @@ downloaded.</p>
 <p>For more information on dbt Cloud artifacts, reference
 <a class="reference external" href="https://docs.getdbt.com/docs/dbt-cloud/using-dbt-cloud/artifacts">this documentation</a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dbt-cloud/1.0.1/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_run_results_artifact</span> <span class="o">=</span> <span class="n">DbtCloudGetJobRunArtifactOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;get_run_results_artifact&quot;</span><span class="p">,</span> <span class="n">run_id</span><span class="o">=</span><span class="n">trigger_job_run1</span><span class="o">.</span><span class="n">output</span><span class="p">,</span> <span class="n">path</span><span class="o">=</span><span class="s2">&quot;run_results.json&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-dbt-cloud/1.0.2/operators.html b/docs-archive/apache-airflow-providers-dbt-cloud/1.0.2/operators.html
index 46987aa8f1..b30d3a8196 100644
--- a/docs-archive/apache-airflow-providers-dbt-cloud/1.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-dbt-cloud/1.0.2/operators.html
@@ -619,7 +619,7 @@ configurations or overrides for the job run such as <code class="docutils litera
 asynchronous waiting for run termination, respectively. To note, the <code class="docutils literal notranslate"><span class="pre">account_id</span></code> for the operators is
 referenced within the <code class="docutils literal notranslate"><span class="pre">default_args</span></code> of the example DAG.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dbt-cloud/1.0.2/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">trigger_job_run1</span> <span class="o">=</span> <span class="n">DbtCloudRunJobOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;trigger_job_run1&quot;</span><span class="p">,</span>
     <span class="n">job_id</span><span class="o">=</span><span class="mi">48617</span><span class="p">,</span>
@@ -632,7 +632,7 @@ referenced within the <code class="docutils literal notranslate"><span class="pr
 <p>This next example also shows how to pass in custom runtime configuration (in this case for <code class="docutils literal notranslate"><span class="pre">threads_override</span></code>)
 via the <code class="docutils literal notranslate"><span class="pre">additional_run_config</span></code> dictionary.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dbt-cloud/1.0.2/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">trigger_job_run2</span> <span class="o">=</span> <span class="n">DbtCloudRunJobOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;trigger_job_run2&quot;</span><span class="p">,</span>
     <span class="n">job_id</span><span class="o">=</span><span class="mi">48617</span><span class="p">,</span>
@@ -652,7 +652,7 @@ functionality available with the <a class="reference external" href="/docs/apach
 DbtCloudRunJobOperator task by utilizing the <code class="docutils literal notranslate"><span class="pre">.output</span></code> property exposed for all operators. Also, to note,
 the <code class="docutils literal notranslate"><span class="pre">account_id</span></code> for the task is referenced within the <code class="docutils literal notranslate"><span class="pre">default_args</span></code> of the example DAG.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dbt-cloud/1.0.2/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">job_run_sensor</span><span class="p">:</span> <span class="n">BaseOperator</span> <span class="o">=</span> <span class="n">DbtCloudJobRunSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;job_run_sensor&quot;</span><span class="p">,</span> <span class="n">run_id</span><span class="o">=</span><span class="n">trigger_job_run2</span><span class="o">.</span><span class="n">output</span><span class="p">,</span> <span class="n">timeout</span><span class="o">=</span><span class="mi">20</span>
 <span class="p">)</span>
@@ -670,7 +670,7 @@ downloaded.</p>
 <p>For more information on dbt Cloud artifacts, reference
 <a class="reference external" href="https://docs.getdbt.com/docs/dbt-cloud/using-dbt-cloud/artifacts">this documentation</a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dbt-cloud/1.0.2/airflow/providers/dbt/cloud/example_dags/example_dbt_cloud.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_run_results_artifact</span><span class="p">:</span> <span class="n">BaseOperator</span> <span class="o">=</span> <span class="n">DbtCloudGetJobRunArtifactOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;get_run_results_artifact&quot;</span><span class="p">,</span> <span class="n">run_id</span><span class="o">=</span><span class="n">trigger_job_run1</span><span class="o">.</span><span class="n">output</span><span class="p">,</span> <span class="n">path</span><span class="o">=</span><span class="s2">&quot;run_results.json&quot;</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-dingding/1.0.0/operators.html b/docs-archive/apache-airflow-providers-dingding/1.0.0/operators.html
index 3788f20424..341323fb2c 100644
--- a/docs-archive/apache-airflow-providers-dingding/1.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-dingding/1.0.0/operators.html
@@ -589,7 +589,7 @@ Connection. Notice that you just need token rather than the whole webhook string
 <p>Use the <a class="reference internal" href="_api/airflow/providers/dingding/operators/dingding/index.html#airflow.providers.dingding.operators.dingding.DingdingOperator" title="airflow.providers.dingding.operators.dingding.DingdingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DingdingOperator</span></code></a>
 to send Dingding message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.0/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_none</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_none&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -607,7 +607,7 @@ to send Dingding message:</p>
 <p>Use parameters <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> and <code class="docutils literal notranslate"><span class="pre">at_all</span></code> to remind specific users when you send message,
 <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> will be ignored When <code class="docutils literal notranslate"><span class="pre">at_all</span></code> is set to <code class="docutils literal notranslate"><span class="pre">True</span></code>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.0/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_all</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_all&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -627,7 +627,7 @@ to send Dingding message:</p>
 <p>The Dingding operator can send rich text messages including link, markdown, actionCard and feedCard.
 A rich text message can not remind specific users except by using markdown type message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.0/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">markdown_msg</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;markdown_msg&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -652,7 +652,7 @@ A rich text message can not remind specific users except by using markdown type
 and then pass the function to <code class="docutils literal notranslate"><span class="pre">sla_miss_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_success_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code>,
 or <code class="docutils literal notranslate"><span class="pre">on_retry_callback</span></code>. Here we use <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code> as an example:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.0/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">failure_callback</span><span class="p">(</span><span class="n">context</span><span class="p">):</span>
     <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd">    The function that will be executed on failure.</span>
diff --git a/docs-archive/apache-airflow-providers-dingding/1.0.1/operators.html b/docs-archive/apache-airflow-providers-dingding/1.0.1/operators.html
index 1bfd81faf5..8bd1884614 100644
--- a/docs-archive/apache-airflow-providers-dingding/1.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-dingding/1.0.1/operators.html
@@ -599,7 +599,7 @@ Connection. Notice that you just need token rather than the whole webhook string
 <p>Use the <a class="reference internal" href="_api/airflow/providers/dingding/operators/dingding/index.html#airflow.providers.dingding.operators.dingding.DingdingOperator" title="airflow.providers.dingding.operators.dingding.DingdingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DingdingOperator</span></code></a>
 to send Dingding message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.1/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_none</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_none&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -617,7 +617,7 @@ to send Dingding message:</p>
 <p>Use parameters <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> and <code class="docutils literal notranslate"><span class="pre">at_all</span></code> to remind specific users when you send message,
 <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> will be ignored When <code class="docutils literal notranslate"><span class="pre">at_all</span></code> is set to <code class="docutils literal notranslate"><span class="pre">True</span></code>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.1/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_all</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_all&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -637,7 +637,7 @@ to send Dingding message:</p>
 <p>The Dingding operator can send rich text messages including link, markdown, actionCard and feedCard.
 A rich text message can not remind specific users except by using markdown type message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.1/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">markdown_msg</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;markdown_msg&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -662,7 +662,7 @@ A rich text message can not remind specific users except by using markdown type
 and then pass the function to <code class="docutils literal notranslate"><span class="pre">sla_miss_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_success_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code>,
 or <code class="docutils literal notranslate"><span class="pre">on_retry_callback</span></code>. Here we use <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code> as an example:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.1/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">failure_callback</span><span class="p">(</span><span class="n">context</span><span class="p">):</span>
     <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd">    The function that will be executed on failure.</span>
diff --git a/docs-archive/apache-airflow-providers-dingding/1.0.2/operators.html b/docs-archive/apache-airflow-providers-dingding/1.0.2/operators.html
index 4664d94c72..d8d23373de 100644
--- a/docs-archive/apache-airflow-providers-dingding/1.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-dingding/1.0.2/operators.html
@@ -599,7 +599,7 @@ Connection. Notice that you just need token rather than the whole webhook string
 <p>Use the <a class="reference internal" href="_api/airflow/providers/dingding/operators/dingding/index.html#airflow.providers.dingding.operators.dingding.DingdingOperator" title="airflow.providers.dingding.operators.dingding.DingdingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DingdingOperator</span></code></a>
 to send Dingding message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.2/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_none</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_none&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -617,7 +617,7 @@ to send Dingding message:</p>
 <p>Use parameters <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> and <code class="docutils literal notranslate"><span class="pre">at_all</span></code> to remind specific users when you send a message;
 <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> will be ignored when <code class="docutils literal notranslate"><span class="pre">at_all</span></code> is set to <code class="docutils literal notranslate"><span class="pre">True</span></code>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.2/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_all</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_all&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -637,7 +637,7 @@ to send Dingding message:</p>
 <p>The Dingding operator can send rich text messages including link, markdown, actionCard, and feedCard.
 A rich text message cannot remind specific users except by using a markdown-type message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.2/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">markdown_msg</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;markdown_msg&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -662,7 +662,7 @@ A rich text message can not remind specific users except by using markdown type
 and then pass the function to <code class="docutils literal notranslate"><span class="pre">sla_miss_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_success_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code>,
 or <code class="docutils literal notranslate"><span class="pre">on_retry_callback</span></code>. Here we use <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code> as an example:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/1.0.2/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">failure_callback</span><span class="p">(</span><span class="n">context</span><span class="p">):</span>
     <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd">    The function that will be executed on failure.</span>
diff --git a/docs-archive/apache-airflow-providers-dingding/2.0.0/operators.html b/docs-archive/apache-airflow-providers-dingding/2.0.0/operators.html
index 73aba470a0..0a7287ffa2 100644
--- a/docs-archive/apache-airflow-providers-dingding/2.0.0/operators.html
+++ b/docs-archive/apache-airflow-providers-dingding/2.0.0/operators.html
@@ -598,7 +598,7 @@ Connection. Notice that you just need token rather than the whole webhook string
 <p>Use the <a class="reference internal" href="_api/airflow/providers/dingding/operators/dingding/index.html#airflow.providers.dingding.operators.dingding.DingdingOperator" title="airflow.providers.dingding.operators.dingding.DingdingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DingdingOperator</span></code></a>
 to send a Dingding message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.0/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_none</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_none&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -616,7 +616,7 @@ to send Dingding message:</p>
 <p>Use parameters <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> and <code class="docutils literal notranslate"><span class="pre">at_all</span></code> to remind specific users when you send a message;
 <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> will be ignored when <code class="docutils literal notranslate"><span class="pre">at_all</span></code> is set to <code class="docutils literal notranslate"><span class="pre">True</span></code>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.0/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_all</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_all&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -636,7 +636,7 @@ to send Dingding message:</p>
 <p>The Dingding operator can send rich text messages including link, markdown, actionCard, and feedCard.
 A rich text message cannot remind specific users except by using a markdown-type message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.0/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">markdown_msg</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;markdown_msg&#39;</span><span class="p">,</span>
     <span class="n">dingding_conn_id</span><span class="o">=</span><span class="s1">&#39;dingding_default&#39;</span><span class="p">,</span>
@@ -661,7 +661,7 @@ A rich text message can not remind specific users except by using markdown type
 and then pass the function to <code class="docutils literal notranslate"><span class="pre">sla_miss_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_success_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code>,
 or <code class="docutils literal notranslate"><span class="pre">on_retry_callback</span></code>. Here we use <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code> as an example:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span>View Source</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.0/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span>View Source</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">failure_callback</span><span class="p">(</span><span class="n">context</span><span class="p">):</span>
     <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd">    The function that will be executed on failure.</span>
diff --git a/docs-archive/apache-airflow-providers-dingding/2.0.1/operators.html b/docs-archive/apache-airflow-providers-dingding/2.0.1/operators.html
index fa3518ecb5..da789fefc3 100644
--- a/docs-archive/apache-airflow-providers-dingding/2.0.1/operators.html
+++ b/docs-archive/apache-airflow-providers-dingding/2.0.1/operators.html
@@ -603,7 +603,7 @@ Connection. Notice that you just need token rather than the whole webhook string
 <p>Use the <a class="reference internal" href="_api/airflow/providers/dingding/operators/dingding/index.html#airflow.providers.dingding.operators.dingding.DingdingOperator" title="airflow.providers.dingding.operators.dingding.DingdingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DingdingOperator</span></code></a>
 to send a Dingding message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.1/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_none</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_none&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;text&#39;</span><span class="p">,</span>
@@ -620,7 +620,7 @@ to send Dingding message:</p>
 <p>Use parameters <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> and <code class="docutils literal notranslate"><span class="pre">at_all</span></code> to remind specific users when you send a message;
 <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> will be ignored when <code class="docutils literal notranslate"><span class="pre">at_all</span></code> is set to <code class="docutils literal notranslate"><span class="pre">True</span></code>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.1/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_all</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_all&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;text&#39;</span><span class="p">,</span>
@@ -639,7 +639,7 @@ to send Dingding message:</p>
 <p>The Dingding operator can send rich text messages including link, markdown, actionCard, and feedCard.
 A rich text message cannot remind specific users except by using a markdown-type message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.1/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">markdown_msg</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;markdown_msg&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;markdown&#39;</span><span class="p">,</span>
@@ -663,7 +663,7 @@ A rich text message can not remind specific users except by using markdown type
 and then pass the function to <code class="docutils literal notranslate"><span class="pre">sla_miss_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_success_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code>,
 or <code class="docutils literal notranslate"><span class="pre">on_retry_callback</span></code>. Here we use <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code> as an example:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.1/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">failure_callback</span><span class="p">(</span><span class="n">context</span><span class="p">):</span>
     <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd">    The function that will be executed on failure.</span>
diff --git a/docs-archive/apache-airflow-providers-dingding/2.0.2/operators.html b/docs-archive/apache-airflow-providers-dingding/2.0.2/operators.html
index d765791e71..fd2ac38bd4 100644
--- a/docs-archive/apache-airflow-providers-dingding/2.0.2/operators.html
+++ b/docs-archive/apache-airflow-providers-dingding/2.0.2/operators.html
@@ -603,7 +603,7 @@ Connection. Notice that you just need token rather than the whole webhook string
 <p>Use the <a class="reference internal" href="_api/airflow/providers/dingding/operators/dingding/index.html#airflow.providers.dingding.operators.dingding.DingdingOperator" title="airflow.providers.dingding.operators.dingding.DingdingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DingdingOperator</span></code></a>
 to send a Dingding message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.2/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_none</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_none&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;text&#39;</span><span class="p">,</span>
@@ -620,7 +620,7 @@ to send Dingding message:</p>
 <p>Use parameters <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> and <code class="docutils literal notranslate"><span class="pre">at_all</span></code> to remind specific users when you send a message;
 <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> will be ignored when <code class="docutils literal notranslate"><span class="pre">at_all</span></code> is set to <code class="docutils literal notranslate"><span class="pre">True</span></code>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.2/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_all</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_all&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;text&#39;</span><span class="p">,</span>
@@ -639,7 +639,7 @@ to send Dingding message:</p>
 <p>The Dingding operator can send rich text messages including link, markdown, actionCard, and feedCard.
 A rich text message cannot remind specific users except by using a markdown-type message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.2/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">markdown_msg</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;markdown_msg&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;markdown&#39;</span><span class="p">,</span>
@@ -663,7 +663,7 @@ A rich text message can not remind specific users except by using markdown type
 and then pass the function to <code class="docutils literal notranslate"><span class="pre">sla_miss_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_success_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code>,
 or <code class="docutils literal notranslate"><span class="pre">on_retry_callback</span></code>. Here we use <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code> as an example:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.2/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">failure_callback</span><span class="p">(</span><span class="n">context</span><span class="p">):</span>
     <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd">    The function that will be executed on failure.</span>
diff --git a/docs-archive/apache-airflow-providers-dingding/2.0.3/operators.html b/docs-archive/apache-airflow-providers-dingding/2.0.3/operators.html
index 215e707bec..eaf0575220 100644
--- a/docs-archive/apache-airflow-providers-dingding/2.0.3/operators.html
+++ b/docs-archive/apache-airflow-providers-dingding/2.0.3/operators.html
@@ -603,7 +603,7 @@ Connection. Notice that you just need token rather than the whole webhook string
 <p>Use the <a class="reference internal" href="_api/airflow/providers/dingding/operators/dingding/index.html#airflow.providers.dingding.operators.dingding.DingdingOperator" title="airflow.providers.dingding.operators.dingding.DingdingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DingdingOperator</span></code></a>
 to send a Dingding message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.3/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_none</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_none&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;text&#39;</span><span class="p">,</span>
@@ -620,7 +620,7 @@ to send Dingding message:</p>
 <p>Use parameters <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> and <code class="docutils literal notranslate"><span class="pre">at_all</span></code> to remind specific users when you send a message;
 <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> will be ignored when <code class="docutils literal notranslate"><span class="pre">at_all</span></code> is set to <code class="docutils literal notranslate"><span class="pre">True</span></code>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.3/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_all</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_all&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;text&#39;</span><span class="p">,</span>
@@ -639,7 +639,7 @@ to send Dingding message:</p>
 <p>The Dingding operator can send rich text messages including link, markdown, actionCard, and feedCard.
 A rich text message cannot remind specific users except by using a markdown-type message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.3/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">markdown_msg</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;markdown_msg&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;markdown&#39;</span><span class="p">,</span>
@@ -663,7 +663,7 @@ A rich text message can not remind specific users except by using markdown type
 and then pass the function to <code class="docutils literal notranslate"><span class="pre">sla_miss_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_success_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code>,
 or <code class="docutils literal notranslate"><span class="pre">on_retry_callback</span></code>. Here we use <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code> as an example:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.3/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">failure_callback</span><span class="p">(</span><span class="n">context</span><span class="p">):</span>
     <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd">    The function that will be executed on failure.</span>
diff --git a/docs-archive/apache-airflow-providers-dingding/2.0.4/operators.html b/docs-archive/apache-airflow-providers-dingding/2.0.4/operators.html
index c309506f6a..0c98c3a1ac 100644
--- a/docs-archive/apache-airflow-providers-dingding/2.0.4/operators.html
+++ b/docs-archive/apache-airflow-providers-dingding/2.0.4/operators.html
@@ -605,7 +605,7 @@ Connection. Notice that you just need token rather than the whole webhook string
 <p>Use the <a class="reference internal" href="_api/airflow/providers/dingding/operators/dingding/index.html#airflow.providers.dingding.operators.dingding.DingdingOperator" title="airflow.providers.dingding.operators.dingding.DingdingOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">DingdingOperator</span></code></a>
 to send Dingding message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.4/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_none</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_none&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;text&#39;</span><span class="p">,</span>
@@ -622,7 +622,7 @@ to send Dingding message:</p>
 <p>Use parameters <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> and <code class="docutils literal notranslate"><span class="pre">at_all</span></code> to remind specific users when you send message,
 <code class="docutils literal notranslate"><span class="pre">at_mobiles</span></code> will be ignored When <code class="docutils literal notranslate"><span class="pre">at_all</span></code> is set to <code class="docutils literal notranslate"><span class="pre">True</span></code>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.4/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">text_msg_remind_all</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;text_msg_remind_all&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;text&#39;</span><span class="p">,</span>
@@ -641,7 +641,7 @@ to send Dingding message:</p>
 <p>The Dingding operator can send rich text messages including link, markdown, actionCard and feedCard.
 A rich text message can not remind specific users except by using markdown type message:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.4/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">markdown_msg</span> <span class="o">=</span> <span class="n">DingdingOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;markdown_msg&#39;</span><span class="p">,</span>
     <span class="n">message_type</span><span class="o">=</span><span class="s1">&#39;markdown&#39;</span><span class="p">,</span>
@@ -665,7 +665,7 @@ A rich text message can not remind specific users except by using markdown type
 and then pass the function to <code class="docutils literal notranslate"><span class="pre">sla_miss_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_success_callback</span></code>, <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code>,
 or <code class="docutils literal notranslate"><span class="pre">on_retry_callback</span></code>. Here we use <code class="docutils literal notranslate"><span class="pre">on_failure_callback</span></code> as an example:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="_modules/airflow/providers/dingding/example_dags/example_dingding.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/dingding/example_dags/example_dingding.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-dingding/2.0.4/airflow/providers/dingding/example_dags/example_dingding.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">failure_callback</span><span class="p">(</span><span class="n">context</span><span class="p">):</span>
     <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd">    The function that will be executed on failure.</span>
diff --git a/docs-archive/apache-airflow-providers-github/1.0.0/operators/index.html b/docs-archive/apache-airflow-providers-github/1.0.0/operators/index.html
index 00665ab17c..203dcedaff 100644
--- a/docs-archive/apache-airflow-providers-github/1.0.0/operators/index.html
+++ b/docs-archive/apache-airflow-providers-github/1.0.0/operators/index.html
@@ -619,7 +619,7 @@ and passing <strong>github_method</strong> and <strong>github_method_args</stron
 You can further process the result using <strong>result_processor</strong> Callable as you like.</p>
 <p>An example of Listing all Repositories owned by a user, <strong>client.get_user().get_repos()</strong> can be implemented as following:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.0/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">github_list_repos</span> <span class="o">=</span> <span class="n">GithubOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;github_list_repos&#39;</span><span class="p">,</span>
@@ -634,7 +634,7 @@ You can further process the result using <strong>result_processor</strong> Calla
 </div>
 <p>An example of Listing Tags in a Repository, <strong>client.get_repo(full_name_or_id='apache/airflow').get_tags()</strong> can be implemented as following:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.0/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">list_repo_tags</span> <span class="o">=</span> <span class="n">GithubOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;list_repo_tags&#39;</span><span class="p">,</span>
@@ -657,7 +657,7 @@ an example of this is <code class="xref py py-class docutils literal notranslate
 a Tag in <a class="reference external" href="https://www.github.com/">Github</a>.</p>
 <p>An example for tag <strong>v1.0</strong>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.0/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">tag_sensor</span> <span class="o">=</span> <span class="n">GithubTagSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;example_tag_sensor&#39;</span><span class="p">,</span>
@@ -674,7 +674,7 @@ a Tag in <a class="reference external" href="https://www.github.com/">Github</a>
 </div>
 <p>Similar Functionality can be achieved by directly using <code class="xref py py-class docutils literal notranslate"><span class="pre">GithubSensor</span></code> ,</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.0/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 
 <span class="k">def</span> <span class="nf">tag_checker</span><span class="p">(</span><span class="n">repo</span><span class="p">:</span> <span class="n">Any</span><span class="p">,</span> <span class="n">tag_name</span><span class="p">:</span> <span class="nb">str</span><span class="p">)</span> <span class="o">-&gt;</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">bool</span><span class="p">]:</span>
diff --git a/docs-archive/apache-airflow-providers-github/1.0.1/operators/index.html b/docs-archive/apache-airflow-providers-github/1.0.1/operators/index.html
index 1b69ed8ccc..d23a0cf15b 100644
--- a/docs-archive/apache-airflow-providers-github/1.0.1/operators/index.html
+++ b/docs-archive/apache-airflow-providers-github/1.0.1/operators/index.html
@@ -619,7 +619,7 @@ and passing <strong>github_method</strong> and <strong>github_method_args</stron
 You can further process the result using <strong>result_processor</strong> Callable as you like.</p>
 <p>An example of Listing all Repositories owned by a user, <strong>client.get_user().get_repos()</strong> can be implemented as following:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.1/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">github_list_repos</span> <span class="o">=</span> <span class="n">GithubOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;github_list_repos&#39;</span><span class="p">,</span>
@@ -634,7 +634,7 @@ You can further process the result using <strong>result_processor</strong> Calla
 </div>
 <p>An example of Listing Tags in a Repository, <strong>client.get_repo(full_name_or_id='apache/airflow').get_tags()</strong> can be implemented as following:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.1/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">list_repo_tags</span> <span class="o">=</span> <span class="n">GithubOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;list_repo_tags&#39;</span><span class="p">,</span>
@@ -657,7 +657,7 @@ an example of this is <code class="xref py py-class docutils literal notranslate
 a Tag in <a class="reference external" href="https://www.github.com/">Github</a>.</p>
 <p>An example for tag <strong>v1.0</strong>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.1/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">tag_sensor</span> <span class="o">=</span> <span class="n">GithubTagSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;example_tag_sensor&#39;</span><span class="p">,</span>
@@ -674,7 +674,7 @@ a Tag in <a class="reference external" href="https://www.github.com/">Github</a>
 </div>
 <p>Similar Functionality can be achieved by directly using <code class="xref py py-class docutils literal notranslate"><span class="pre">GithubSensor</span></code> ,</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.1/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 
 <span class="k">def</span> <span class="nf">tag_checker</span><span class="p">(</span><span class="n">repo</span><span class="p">:</span> <span class="n">Any</span><span class="p">,</span> <span class="n">tag_name</span><span class="p">:</span> <span class="nb">str</span><span class="p">)</span> <span class="o">-&gt;</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">bool</span><span class="p">]:</span>
diff --git a/docs-archive/apache-airflow-providers-github/1.0.2/operators/index.html b/docs-archive/apache-airflow-providers-github/1.0.2/operators/index.html
index 8a1c2de61c..4a0e2f8e70 100644
--- a/docs-archive/apache-airflow-providers-github/1.0.2/operators/index.html
+++ b/docs-archive/apache-airflow-providers-github/1.0.2/operators/index.html
@@ -619,7 +619,7 @@ and passing <strong>github_method</strong> and <strong>github_method_args</stron
 You can further process the result using <strong>result_processor</strong> Callable as you like.</p>
 <p>An example of Listing all Repositories owned by a user, <strong>client.get_user().get_repos()</strong> can be implemented as following:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.2/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">github_list_repos</span> <span class="o">=</span> <span class="n">GithubOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;github_list_repos&#39;</span><span class="p">,</span>
@@ -634,7 +634,7 @@ You can further process the result using <strong>result_processor</strong> Calla
 </div>
 <p>An example of Listing Tags in a Repository, <strong>client.get_repo(full_name_or_id='apache/airflow').get_tags()</strong> can be implemented as following:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.2/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">list_repo_tags</span> <span class="o">=</span> <span class="n">GithubOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;list_repo_tags&#39;</span><span class="p">,</span>
@@ -657,7 +657,7 @@ an example of this is <code class="xref py py-class docutils literal notranslate
 a Tag in <a class="reference external" href="https://www.github.com/">Github</a>.</p>
 <p>An example for tag <strong>v1.0</strong>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.2/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">tag_sensor</span> <span class="o">=</span> <span class="n">GithubTagSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;example_tag_sensor&#39;</span><span class="p">,</span>
@@ -674,7 +674,7 @@ a Tag in <a class="reference external" href="https://www.github.com/">Github</a>
 </div>
 <p>Similar Functionality can be achieved by directly using <code class="xref py py-class docutils literal notranslate"><span class="pre">GithubSensor</span></code> ,</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.2/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 
 <span class="k">def</span> <span class="nf">tag_checker</span><span class="p">(</span><span class="n">repo</span><span class="p">:</span> <span class="n">Any</span><span class="p">,</span> <span class="n">tag_name</span><span class="p">:</span> <span class="nb">str</span><span class="p">)</span> <span class="o">-&gt;</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">bool</span><span class="p">]:</span>
diff --git a/docs-archive/apache-airflow-providers-github/1.0.3/operators/index.html b/docs-archive/apache-airflow-providers-github/1.0.3/operators/index.html
index 4053d5a2f5..5b8a5942b7 100644
--- a/docs-archive/apache-airflow-providers-github/1.0.3/operators/index.html
+++ b/docs-archive/apache-airflow-providers-github/1.0.3/operators/index.html
@@ -621,7 +621,7 @@ and passing <strong>github_method</strong> and <strong>github_method_args</stron
 You can further process the result using <strong>result_processor</strong> Callable as you like.</p>
 <p>An example of Listing all Repositories owned by a user, <strong>client.get_user().get_repos()</strong> can be implemented as following:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.3/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">github_list_repos</span> <span class="o">=</span> <span class="n">GithubOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;github_list_repos&#39;</span><span class="p">,</span>
@@ -636,7 +636,7 @@ You can further process the result using <strong>result_processor</strong> Calla
 </div>
 <p>An example of Listing Tags in a Repository, <strong>client.get_repo(full_name_or_id='apache/airflow').get_tags()</strong> can be implemented as following:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.3/airflow/providers/github/example_dags/example_github.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">list_repo_tags</span> <span class="o">=</span> <span class="n">GithubOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;list_repo_tags&#39;</span><span class="p">,</span>
@@ -659,7 +659,7 @@ an example of this is <code class="xref py py-class docutils literal notranslate
 a Tag in <a class="reference external" href="https://www.github.com/">Github</a>.</p>
 <p>An example for tag <strong>v1.0</strong>:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.3/airflow/providers/github/example_dags/example_github.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 <span class="n">tag_sensor</span> <span class="o">=</span> <span class="n">GithubTagSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;example_tag_sensor&#39;</span><span class="p">,</span>
@@ -676,7 +676,7 @@ a Tag in <a class="reference external" href="https://www.github.com/">Github</a>
 </div>
 <p>Similar Functionality can be achieved by directly using <code class="xref py py-class docutils literal notranslate"><span class="pre">GithubSensor</span></code> ,</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/github/example_dags/example_github.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/github/example_dags/example_github.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-github/1.0.3/airflow/providers/github/example_dags/example_github.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span>
 
 <span class="k">def</span> <span class="nf">tag_checker</span><span class="p">(</span><span class="n">repo</span><span class="p">:</span> <span class="n">Any</span><span class="p">,</span> <span class="n">tag_name</span><span class="p">:</span> <span class="nb">str</span><span class="p">)</span> <span class="o">-&gt;</span> <span class="n">Optional</span><span class="p">[</span><span class="nb">bool</span><span class="p">]:</span>
diff --git a/docs-archive/apache-airflow-providers-google/6.3.0/operators/ads.html b/docs-archive/apache-airflow-providers-google/6.3.0/operators/ads.html
index f49562071c..72366365a1 100644
--- a/docs-archive/apache-airflow-providers-google/6.3.0/operators/ads.html
+++ b/docs-archive/apache-airflow-providers-google/6.3.0/operators/ads.html
@@ -608,7 +608,7 @@ businesses to advertise on Google Search, YouTube and other sites across the web
 <p>To query the Google Ads API and generate a CSV report of the results use
 <a class="reference internal" href="../_api/airflow/providers/google/ads/transfers/ads_to_gcs/index.html#airflow.providers.google.ads.transfers.ads_to_gcs.GoogleAdsToGcsOperator" title="airflow.providers.google.ads.transfers.ads_to_gcs.GoogleAdsToGcsOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GoogleAdsToGcsOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/ads/example_dags/example_ads.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/google/ads/example_dags/example_ads.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/ads/example_dags/example_ads.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/ads/example_dags/example_ads.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">run_operator</span> <span class="o">=</span> <span class="n">GoogleAdsToGcsOperator</span><span class="p">(</span>
     <span class="n">client_ids</span><span class="o">=</span><span class="n">CLIENT_IDS</span><span class="p">,</span>
     <span class="n">query</span><span class="o">=</span><span class="n">QUERY</span><span class="p">,</span>
@@ -630,7 +630,7 @@ The result is saved to <a class="reference external" href="/docs/apache-airflow/
 <p>To upload Google Ads accounts to Google Cloud Storage bucket use the
 <code class="xref py py-class docutils literal notranslate"><span class="pre">GoogleAdsListAccountsOperator</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/ads/example_dags/example_ads.py</span><a class="example-header-button viewcode-button reference internal" href="../_modules/airflow/providers/google/ads/example_dags/example_ads.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/ads/example_dags/example_ads.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/ads/example_dags/example_ads.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">list_accounts</span> <span class="o">=</span> <span class="n">GoogleAdsListAccountsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;list_accounts&quot;</span><span class="p">,</span> <span class="n">bucket</span><span class="o">=</span><span class="n">BUCKET</span><span class="p">,</span> <span class="n">object_name</span><span class="o">=</span><span class="n">GCS_ACCOUNTS_CSV</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/automl.html b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/automl.html
index b613892a64..35c4fd4ec4 100644
--- a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/automl.html
+++ b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/automl.html
@@ -614,7 +614,7 @@ and then integrate those models into your applications and web sites.</p>
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLCreateDatasetOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLCreateDatasetOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLCreateDatasetOperator</span></code></a>.
 The operator returns dataset id in <a class="reference external" href="/docs/apache-airflow/stable/concepts/xcoms.html#concepts-xcom" title="(in apache-airflow v2.3.0.dev0)"><span class="xref std std-ref">XCom</span></a> under <code class="docutils literal notranslate"><span class="pre">dataset_id</span></code> key.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_dataset_task</span> <span class="o">=</span> <span class="n">AutoMLCreateDatasetOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_dataset_task&quot;</span><span class="p">,</span>
     <span class="n">dataset</span><span class="o">=</span><span class="n">DATASET</span><span class="p">,</span>
@@ -629,7 +629,7 @@ The operator returns dataset id in <a class="reference external" href="/docs/apa
 <p>After creating a dataset you can use it to import some data using
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLImportDataOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLImportDataOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLImportDataOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">import_dataset_task</span> <span class="o">=</span> <span class="n">AutoMLImportDataOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;import_dataset_task&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">dataset_id</span><span class="p">,</span>
@@ -642,7 +642,7 @@ The operator returns dataset id in <a class="reference external" href="/docs/apa
 <p>To update dataset you can use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLTablesUpdateDatasetOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLTablesUpdateDatasetOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLTablesUpdateDatasetOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">update</span> <span class="o">=</span> <span class="n">deepcopy</span><span class="p">(</span><span class="n">DATASET</span><span class="p">)</span>
 <span class="n">update</span><span class="p">[</span><span class="s2">&quot;name&quot;</span><span class="p">]</span> <span class="o">=</span> <span class="s1">&#39;{{ task_instance.xcom_pull(&quot;create_dataset_task&quot;)[&quot;name&quot;] }}&#39;</span>
 <span class="n">update</span><span class="p">[</span><span class="s2">&quot;tables_dataset_metadata&quot;</span><span class="p">][</span>  <span class="c1"># type: ignore</span>
@@ -663,7 +663,7 @@ The operator returns dataset id in <a class="reference external" href="/docs/apa
 <p>To list table specs you can use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLTablesListTableSpecsOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLTablesListTableSpecsOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLTablesListTableSpecsOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">list_tables_spec_task</span> <span class="o">=</span> <span class="n">AutoMLTablesListTableSpecsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;list_tables_spec_task&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">dataset_id</span><span class="p">,</span>
@@ -676,7 +676,7 @@ The operator returns dataset id in <a class="reference external" href="/docs/apa
 <p>To list column specs you can use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLTablesListColumnSpecsOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLTablesListColumnSpecsOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLTablesListColumnSpecsOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">list_columns_spec_task</span> <span class="o">=</span> <span class="n">AutoMLTablesListColumnSpecsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;list_columns_spec_task&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">dataset_id</span><span class="p">,</span>
@@ -695,7 +695,7 @@ The operator returns dataset id in <a class="reference external" href="/docs/apa
 The operator will wait for the operation to complete. Additionally the operator
 returns the id of model in <a class="reference external" href="/docs/apache-airflow/stable/concepts/xcoms.html#concepts-xcom" title="(in apache-airflow v2.3.0.dev0)"><span class="xref std std-ref">XCom</span></a> under <code class="docutils literal notranslate"><span class="pre">model_id</span></code> key.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_model_task</span> <span class="o">=</span> <span class="n">AutoMLTrainModelOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_model_task&quot;</span><span class="p">,</span>
     <span class="n">model</span><span class="o">=</span><span class="n">MODEL</span><span class="p">,</span>
@@ -710,7 +710,7 @@ returns the id of model in <a class="reference external" href="/docs/apache-airf
 <p>To get an existing model one can use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLGetModelOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLGetModelOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLGetModelOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_model_task</span> <span class="o">=</span> <span class="n">AutoMLGetModelOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;get_model_task&quot;</span><span class="p">,</span>
     <span class="n">model_id</span><span class="o">=</span><span class="n">MODEL_ID</span><span class="p">,</span>
@@ -723,7 +723,7 @@ returns the id of model in <a class="reference external" href="/docs/apache-airf
 <p>Once a model is created it could be deployed using
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLDeployModelOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLDeployModelOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLDeployModelOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">deploy_model_task</span> <span class="o">=</span> <span class="n">AutoMLDeployModelOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;deploy_model_task&quot;</span><span class="p">,</span>
     <span class="n">model_id</span><span class="o">=</span><span class="n">MODEL_ID</span><span class="p">,</span>
@@ -736,7 +736,7 @@ returns the id of model in <a class="reference external" href="/docs/apache-airf
 <p>If you wish to delete a model you can use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLDeleteModelOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLDeleteModelOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLDeleteModelOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_model_task</span> <span class="o">=</span> <span class="n">AutoMLDeleteModelOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;delete_model_task&quot;</span><span class="p">,</span>
     <span class="n">model_id</span><span class="o">=</span><span class="n">model_id</span><span class="p">,</span>
@@ -754,7 +754,7 @@ returns the id of model in <a class="reference external" href="/docs/apache-airf
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLBatchPredictOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLBatchPredictOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLBatchPredictOperator</span></code></a>. In the first case
 the model must be deployed.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">predict_task</span> <span class="o">=</span> <span class="n">AutoMLPredictOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;predict_task&quot;</span><span class="p">,</span>
     <span class="n">model_id</span><span class="o">=</span><span class="n">MODEL_ID</span><span class="p">,</span>
@@ -766,7 +766,7 @@ the model must be deployed.</p>
 </div>
 </div>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">batch_predict_task</span> <span class="o">=</span> <span class="n">AutoMLBatchPredictOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;batch_predict_task&quot;</span><span class="p">,</span>
     <span class="n">model_id</span><span class="o">=</span><span class="n">MODEL_ID</span><span class="p">,</span>
@@ -785,7 +785,7 @@ the model must be deployed.</p>
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLListDatasetOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLListDatasetOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLListDatasetOperator</span></code></a>. The operator returns list
 of datasets ids in <a class="reference external" href="/docs/apache-airflow/stable/concepts/xcoms.html#concepts-xcom" title="(in apache-airflow v2.3.0.dev0)"><span class="xref std std-ref">XCom</span></a> under <code class="docutils literal notranslate"><span class="pre">dataset_id_list</span></code> key.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">list_datasets_task</span> <span class="o">=</span> <span class="n">AutoMLListDatasetOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;list_datasets_task&quot;</span><span class="p">,</span>
     <span class="n">location</span><span class="o">=</span><span class="n">GCP_AUTOML_LOCATION</span><span class="p">,</span>
@@ -797,7 +797,7 @@ of datasets ids in <a class="reference external" href="/docs/apache-airflow/stab
 <p>To delete a model you can use <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/automl/index.html#airflow.providers.google.cloud.operators.automl.AutoMLDeleteDatasetOperator" title="airflow.providers.google.cloud.operators.automl.AutoMLDeleteDatasetOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">AutoMLDeleteDatasetOperator</span></code></a>.
 The delete operator also allows passing a list or comma-separated string of dataset ids to be deleted.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_automl_tables.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_automl_tables.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_automl_tables.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_datasets_task</span> <span class="o">=</span> <span class="n">AutoMLDeleteDatasetOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;delete_datasets_task&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="s2">&quot;{{ task_instance.xcom_pull(&#39;list_datasets_task&#39;, key=&#39;dataset_id_list&#39;) | list }}&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/bigquery.html b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/bigquery.html
index d4e0cbd56b..53e938dce9 100644
--- a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/bigquery.html
+++ b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/bigquery.html
@@ -616,7 +616,7 @@ data.</p>
 <p>To create an empty dataset in a BigQuery database you can use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/bigquery/index.html#airflow.providers.google.cloud.operators.bigquery.BigQueryCreateEmptyDatasetOperator" title="airflow.providers.google.cloud.operators.bigquery.BigQueryCreateEmptyDatasetOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BigQueryCreateEmptyDatasetOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_dataset</span> <span class="o">=</span> <span class="n">BigQueryCreateEmptyDatasetOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create-dataset&quot;</span><span class="p">,</span> <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -628,7 +628,7 @@ data.</p>
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/bigquery/index.html#airflow.providers.google.cloud.operators.bigquery.BigQueryGetDatasetOperator" title="airflow.providers.google.cloud.operators.bigquery.BigQueryGetDatasetOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BigQueryGetDatasetOperator</span></code></a>.</p>
 <p>This operator returns a <a class="reference external" href="https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets#resource">Dataset Resource</a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_dataset</span> <span class="o">=</span> <span class="n">BigQueryGetDatasetOperator</span><span class="p">(</span><span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;get-dataset&quot;</span><span class="p">,</span> <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">)</span>
 </pre></div>
 </div>
@@ -639,7 +639,7 @@ data.</p>
 <p>To retrieve the list of tables in a given dataset use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/bigquery/index.html#airflow.providers.google.cloud.operators.bigquery.BigQueryGetDatasetTablesOperator" title="airflow.providers.google.cloud.operators.bigquery.BigQueryGetDatasetTablesOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BigQueryGetDatasetTablesOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_dataset_tables</span> <span class="o">=</span> <span class="n">BigQueryGetDatasetTablesOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;get_dataset_tables&quot;</span><span class="p">,</span> <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span>
 <span class="p">)</span>
@@ -654,7 +654,7 @@ data.</p>
 <p>The update method replaces the entire Table resource, whereas the patch
 method only replaces fields that are provided in the submitted Table resource.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">update_table</span> <span class="o">=</span> <span class="n">BigQueryUpdateTableOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;update_table&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">,</span>
@@ -676,7 +676,7 @@ method only replaces fields that are provided in the submitted Table resource.</
 <p>The update method replaces the entire dataset resource, whereas the patch
 method only replaces fields that are provided in the submitted dataset resource.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">update_dataset</span> <span class="o">=</span> <span class="n">BigQueryUpdateDatasetOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;update_dataset&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">,</span>
@@ -691,7 +691,7 @@ method only replaces fields that are provided in the submitted dataset resource.
 <p>To delete an existing dataset from a BigQuery database you can use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/bigquery/index.html#airflow.providers.google.cloud.operators.bigquery.BigQueryDeleteDatasetOperator" title="airflow.providers.google.cloud.operators.bigquery.BigQueryDeleteDatasetOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BigQueryDeleteDatasetOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_dataset</span> <span class="o">=</span> <span class="n">BigQueryDeleteDatasetOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;delete_dataset&quot;</span><span class="p">,</span> <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">,</span> <span class="n">delete_contents</span><span class="o">=</span><span class="kc">True</span>
 <span class="p">)</span>
@@ -712,7 +712,7 @@ ways. You may either directly pass the schema fields in, or you may point the
 operator to a Google Cloud Storage object name. The object in Google Cloud
 Storage must be a JSON file with the schema fields in it.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_table</span> <span class="o">=</span> <span class="n">BigQueryCreateEmptyTableOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_table&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">,</span>
@@ -727,7 +727,7 @@ Storage must be a JSON file with the schema fields in it.</p>
 </div>
 <p>You can use this operator to create a view on top of an existing table.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_view</span> <span class="o">=</span> <span class="n">BigQueryCreateEmptyTableOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_view&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">,</span>
@@ -743,7 +743,7 @@ Storage must be a JSON file with the schema fields in it.</p>
 <p>You can also use this operator to create a materialized view that periodically
 caches the results of a query for increased performance and efficiency.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_materialized_view</span> <span class="o">=</span> <span class="n">BigQueryCreateEmptyTableOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_materialized_view&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">,</span>
@@ -768,7 +768,7 @@ you can use
 you may either directly pass the schema fields in, or you may point the operator
 to a Google Cloud Storage object name.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_external_table</span> <span class="o">=</span> <span class="n">BigQueryCreateExternalTableOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_external_table&quot;</span><span class="p">,</span>
     <span class="n">table_resource</span><span class="o">=</span><span class="p">{</span>
@@ -806,7 +806,7 @@ returned list will be equal to the number of rows fetched. Each element in the
 list will again be a list whose elements represent the column values for
 that row.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_queries.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_queries.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_data</span> <span class="o">=</span> <span class="n">BigQueryGetDataOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;get_data&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET</span><span class="p">,</span>
@@ -826,7 +826,7 @@ that row.</p>
 <p>This operator either updates the existing table or creates a new, empty table
 in the given dataset.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">upsert_table</span> <span class="o">=</span> <span class="n">BigQueryUpsertTableOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;upsert_table&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">,</span>
@@ -846,7 +846,7 @@ in the given dataset.</p>
 <p>This operator updates the schema field values supplied, while leaving the rest unchanged. This is useful,
 for instance, to set new field descriptions on an existing table schema.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">update_table_schema</span> <span class="o">=</span> <span class="n">BigQueryUpdateTableSchemaOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;update_table_schema&quot;</span><span class="p">,</span>
     <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">,</span>
@@ -865,7 +865,7 @@ for instance to set new field descriptions on an existing table schema.</p>
 <p>To delete an existing table you can use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/bigquery/index.html#airflow.providers.google.cloud.operators.bigquery.BigQueryDeleteTableOperator" title="airflow.providers.google.cloud.operators.bigquery.BigQueryDeleteTableOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BigQueryDeleteTableOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_table</span> <span class="o">=</span> <span class="n">BigQueryDeleteTableOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;delete_table&quot;</span><span class="p">,</span>
     <span class="n">deletion_dataset_table</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;</span><span class="si">{</span><span class="n">PROJECT_ID</span><span class="si">}</span><span class="s2">.</span><span class="si">{</span><span class="n">DATASET_NAME</span><span class="si">}</span><span class="s2">.test_table&quot;</span><span class="p">,</span>
@@ -875,7 +875,7 @@ for instance to set new field descriptions on an existing table schema.</p>
 </div>
 <p>You can also use this operator to delete a view.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_view</span> <span class="o">=</span> <span class="n">BigQueryDeleteTableOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;delete_view&quot;</span><span class="p">,</span> <span class="n">deletion_dataset_table</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;</span><span class="si">{</span><span class="n">PROJECT_ID</span><span class="si">}</span><span class="s2">.</span><span class="si">{</span><span class="n">DATASET_NAME</span><span class="si">}</span><span class="s2">.test_view&quot;</span>
 <span class="p">)</span>
@@ -884,7 +884,7 @@ for instance to set new field descriptions on an existing table schema.</p>
 </div>
 <p>You can also use this operator to delete a materialized view.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_operations.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_operations.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_operations.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_materialized_view</span> <span class="o">=</span> <span class="n">BigQueryDeleteTableOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;delete_materialized_view&quot;</span><span class="p">,</span>
     <span class="n">deletion_dataset_table</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;</span><span class="si">{</span><span class="n">PROJECT_ID</span><span class="si">}</span><span class="s2">.</span><span class="si">{</span><span class="n">DATASET_NAME</span><span class="si">}</span><span class="s2">.test_materialized_view&quot;</span><span class="p">,</span>
@@ -898,7 +898,7 @@ for instance to set new field descriptions on an existing table schema.</p>
 <span id="howto-operator-bigqueryinsertjoboperator"></span><h2>Execute BigQuery jobs<a class="headerlink" href="#execute-bigquery-jobs" title="Permalink to this headline">¶</a></h2>
 <p>Let's say you would like to execute the following query.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_queries.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_queries.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">INSERT_ROWS_QUERY</span> <span class="o">=</span> <span class="p">(</span>
     <span class="sa">f</span><span class="s2">&quot;INSERT </span><span class="si">{</span><span class="n">DATASET</span><span class="si">}</span><span class="s2">.</span><span class="si">{</span><span class="n">TABLE_1</span><span class="si">}</span><span class="s2"> VALUES &quot;</span>
     <span class="sa">f</span><span class="s2">&quot;(42, &#39;monthy python&#39;, &#39;</span><span class="si">{</span><span class="n">INSERT_DATE</span><span class="si">}</span><span class="s2">&#39;), &quot;</span>
@@ -911,7 +911,7 @@ for instance to set new field descriptions on an existing table schema.</p>
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/bigquery/index.html#airflow.providers.google.cloud.operators.bigquery.BigQueryInsertJobOperator" title="airflow.providers.google.cloud.operators.bigquery.BigQueryInsertJobOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BigQueryInsertJobOperator</span></code></a> with
 proper query job configuration that can be Jinja templated.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_queries.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_queries.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">insert_query_job</span> <span class="o">=</span> <span class="n">BigQueryInsertJobOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;insert_query_job&quot;</span><span class="p">,</span>
     <span class="n">configuration</span><span class="o">=</span><span class="p">{</span>
@@ -930,7 +930,7 @@ proper query job configuration that can be Jinja templated.</p>
 <p>If you want to include some files in your configuration, you can use the <code class="docutils literal notranslate"><span class="pre">include</span></code> clause of the Jinja template
 language as follows:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_queries.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_queries.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">select_query_job</span> <span class="o">=</span> <span class="n">BigQueryInsertJobOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;select_query_job&quot;</span><span class="p">,</span>
     <span class="n">configuration</span><span class="o">=</span><span class="p">{</span>
@@ -961,7 +961,7 @@ then it will reattach to the existing job.</p>
 that first row is evaluated using Python <code class="docutils literal notranslate"><span class="pre">bool</span></code> casting. If any of the values
 return <code class="docutils literal notranslate"><span class="pre">False</span></code> the check is failed and errors out.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_queries.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_queries.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">check_count</span> <span class="o">=</span> <span class="n">BigQueryCheckOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;check_count&quot;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;SELECT COUNT(*) FROM </span><span class="si">{</span><span class="n">DATASET</span><span class="si">}</span><span class="s2">.</span><span class="si">{</span><span class="n">TABLE_1</span><span class="si">}</span><span class="s2">&quot;</span><span class="p">,</span>
@@ -980,7 +980,7 @@ return <code class="docutils literal notranslate"><span class="pre">False</span>
 that first row is evaluated against <code class="docutils literal notranslate"><span class="pre">pass_value</span></code> which can be either a string
 or numeric value. If numeric, you can also specify <code class="docutils literal notranslate"><span class="pre">tolerance</span></code>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_queries.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_queries.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">check_value</span> <span class="o">=</span> <span class="n">BigQueryValueCheckOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;check_value&quot;</span><span class="p">,</span>
     <span class="n">sql</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;SELECT COUNT(*) FROM </span><span class="si">{</span><span class="n">DATASET</span><span class="si">}</span><span class="s2">.</span><span class="si">{</span><span class="n">TABLE_1</span><span class="si">}</span><span class="s2">&quot;</span><span class="p">,</span>
@@ -998,7 +998,7 @@ or numeric value. If numeric, you can also specify <code class="docutils literal
 tolerance of the ones from <code class="docutils literal notranslate"><span class="pre">days_back</span></code> before you can use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/bigquery/index.html#airflow.providers.google.cloud.operators.bigquery.BigQueryIntervalCheckOperator" title="airflow.providers.google.cloud.operators.bigquery.BigQueryIntervalCheckOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">BigQueryIntervalCheckOperator</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_queries.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_queries.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_queries.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">check_interval</span> <span class="o">=</span> <span class="n">BigQueryIntervalCheckOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;check_interval&quot;</span><span class="p">,</span>
     <span class="n">table</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;</span><span class="si">{</span><span class="n">DATASET</span><span class="si">}</span><span class="s2">.</span><span class="si">{</span><span class="n">TABLE_1</span><span class="si">}</span><span class="s2">&quot;</span><span class="p">,</span>
@@ -1021,7 +1021,7 @@ of downstream operators until a table exist. If the table is sharded on dates yo
 use the <code class="docutils literal notranslate"><span class="pre">{{</span> <span class="pre">ds_nodash</span> <span class="pre">}}</span></code> macro as the table name suffix.</p>
 <p><a class="reference internal" href="../../_api/airflow/providers/google/cloud/sensors/bigquery/index.html#airflow.providers.google.cloud.sensors.bigquery.BigQueryTableExistenceSensor" title="airflow.providers.google.cloud.sensors.bigquery.BigQueryTableExistenceSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">BigQueryTableExistenceSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_sensors.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_sensors.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_sensors.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_sensors.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">check_table_exists</span> <span class="o">=</span> <span class="n">BigQueryTableExistenceSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;check_table_exists&quot;</span><span class="p">,</span> <span class="n">project_id</span><span class="o">=</span><span class="n">PROJECT_ID</span><span class="p">,</span> <span class="n">dataset_id</span><span class="o">=</span><span class="n">DATASET_NAME</span><span class="p">,</span> <span class="n">table_id</span><span class="o">=</span><span class="n">TABLE_NAME</span>
 <span class="p">)</span>
@@ -1034,7 +1034,7 @@ use the <code class="docutils literal notranslate"><span class="pre">{{</span> <
 <p>To check that a table exists and has a partition you can use
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/sensors/bigquery/index.html#airflow.providers.google.cloud.sensors.bigquery.BigQueryTablePartitionExistenceSensor" title="airflow.providers.google.cloud.sensors.bigquery.BigQueryTablePartitionExistenceSensor"><code class="xref py py-class docutils literal notranslate"><span class="pre">BigQueryTablePartitionExistenceSensor</span></code></a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_sensors.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigquery_sensors.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigquery_sensors.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigquery_sensors.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">check_table_partition_exists</span> <span class="o">=</span> <span class="n">BigQueryTablePartitionExistenceSensor</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;check_table_partition_exists&quot;</span><span class="p">,</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">PROJECT_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/bigtable.html b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/bigtable.html
index be67806132..6d57c601dd 100644
--- a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/bigtable.html
+++ b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/bigtable.html
@@ -614,7 +614,7 @@ and immediately succeeds. No changes are made to the existing instance.</p>
 <p>You can create the operator with or without project id. If project id is missing
 it will be retrieved from the Google Cloud connection used. Both variants are shown:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigtable.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigtable.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_instance_task</span> <span class="o">=</span> <span class="n">BigtableCreateInstanceOperator</span><span class="p">(</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span>
     <span class="n">instance_id</span><span class="o">=</span><span class="n">CBT_INSTANCE_ID</span><span class="p">,</span>
@@ -655,7 +655,7 @@ instance_display_name, instance_type and instance_labels.</p>
 <p>You can create the operator with or without project id. If project id is missing
 it will be retrieved from the Google Cloud connection used. Both variants are shown:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigtable.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigtable.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">update_instance_task</span> <span class="o">=</span> <span class="n">BigtableUpdateInstanceOperator</span><span class="p">(</span>
     <span class="n">instance_id</span><span class="o">=</span><span class="n">CBT_INSTANCE_ID</span><span class="p">,</span>
     <span class="n">instance_display_name</span><span class="o">=</span><span class="n">CBT_INSTANCE_DISPLAY_NAME_UPDATED</span><span class="p">,</span>
@@ -677,7 +677,7 @@ to delete a Google Cloud Bigtable instance.</p>
 <p>You can create the operator with or without project id. If project id is missing
 it will be retrieved from the Google Cloud connection used. Both variants are shown:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigtable.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigtable.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_instance_task</span> <span class="o">=</span> <span class="n">BigtableDeleteInstanceOperator</span><span class="p">(</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span>
     <span class="n">instance_id</span><span class="o">=</span><span class="n">CBT_INSTANCE_ID</span><span class="p">,</span>
@@ -701,7 +701,7 @@ to modify number of nodes in a Cloud Bigtable cluster.</p>
 <p>You can create the operator with or without project id. If project id is missing
 it will be retrieved from the Google Cloud connection used. Both variants are shown:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigtable.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigtable.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">cluster_update_task</span> <span class="o">=</span> <span class="n">BigtableUpdateClusterOperator</span><span class="p">(</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span>
     <span class="n">instance_id</span><span class="o">=</span><span class="n">CBT_INSTANCE_ID</span><span class="p">,</span>
@@ -732,7 +732,7 @@ error message.</p>
 <p>You can create the operator with or without project id. If project id is missing
 it will be retrieved from the Google Cloud connection used. Both variants are shown:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigtable.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigtable.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_table_task</span> <span class="o">=</span> <span class="n">BigtableCreateTableOperator</span><span class="p">(</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span>
     <span class="n">instance_id</span><span class="o">=</span><span class="n">CBT_INSTANCE_ID</span><span class="p">,</span>
@@ -766,7 +766,7 @@ to delete a table in Google Cloud Bigtable.</p>
 <p>You can create the operator with or without project id. If project id is missing
 it will be retrieved from the Google Cloud connection used. Both variants are shown:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigtable.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigtable.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_table_task</span> <span class="o">=</span> <span class="n">BigtableDeleteTableOperator</span><span class="p">(</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span>
     <span class="n">instance_id</span><span class="o">=</span><span class="n">CBT_INSTANCE_ID</span><span class="p">,</span>
@@ -795,7 +795,7 @@ timeout hits and does not raise any exception.</p>
 <div class="section" id="id6">
 <h3>Using the operator<a class="headerlink" href="#id6" title="Permalink to this headline">¶</a></h3>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_bigtable.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_bigtable.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_bigtable.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">wait_for_table_replication_task</span> <span class="o">=</span> <span class="n">BigtableTableReplicationCompletedSensor</span><span class="p">(</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span>
     <span class="n">instance_id</span><span class="o">=</span><span class="n">CBT_INSTANCE_ID</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_build.html b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_build.html
index 415e08abf8..0bac99de6e 100644
--- a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_build.html
+++ b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_build.html
@@ -618,7 +618,7 @@ artifacts such as Docker containers or Java archives.</p>
 <p>Cancel a build in progress with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildCancelBuildOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildCancelBuildOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildCancelBuildOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">cancel_build</span> <span class="o">=</span> <span class="n">CloudBuildCancelBuildOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;cancel_build&quot;</span><span class="p">,</span>
     <span class="n">id_</span><span class="o">=</span><span class="n">create_build_without_wait</span><span class="o">.</span><span class="n">output</span><span class="p">[</span><span class="s1">&#39;id&#39;</span><span class="p">],</span>
@@ -643,7 +643,7 @@ is not idempotent.</p>
 <h2>Build configuration<a class="headerlink" href="#build-configuration" title="Permalink to this headline">¶</a></h2>
 <p>In order to trigger a build, it is necessary to pass the build configuration.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_build_from_storage_body</span> <span class="o">=</span> <span class="p">{</span>
     <span class="s2">&quot;source&quot;</span><span class="p">:</span> <span class="p">{</span><span class="s2">&quot;storage_source&quot;</span><span class="p">:</span> <span class="n">GCP_SOURCE_ARCHIVE_URL</span><span class="p">},</span>
     <span class="s2">&quot;steps&quot;</span><span class="p">:</span> <span class="p">[</span>
@@ -659,7 +659,7 @@ is not idempotent.</p>
 </div>
 <p>In addition, a build can refer to source stored in <a class="reference external" href="https://cloud.google.com/source-repositories/docs/">Google Cloud Source Repositories</a>.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_build_from_repo_body</span><span class="p">:</span> <span class="n">Dict</span><span class="p">[</span><span class="nb">str</span><span class="p">,</span> <span class="n">Any</span><span class="p">]</span> <span class="o">=</span> <span class="p">{</span>
     <span class="s2">&quot;source&quot;</span><span class="p">:</span> <span class="p">{</span><span class="s2">&quot;repo_source&quot;</span><span class="p">:</span> <span class="p">{</span><span class="s2">&quot;repo_name&quot;</span><span class="p">:</span> <span class="n">GCP_SOURCE_REPOSITORY_NAME</span><span class="p">,</span> <span class="s2">&quot;branch_name&quot;</span><span class="p">:</span> <span class="s2">&quot;main&quot;</span><span class="p">}},</span>
     <span class="s2">&quot;steps&quot;</span><span class="p">:</span> <span class="p">[</span>
@@ -680,7 +680,7 @@ is not idempotent.</p>
 <p>Triggering a build is performed with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildCreateBuildOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildCreateBuildOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildCreateBuildOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_build_from_storage</span> <span class="o">=</span> <span class="n">CloudBuildCreateBuildOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_build_from_storage&quot;</span><span class="p">,</span> <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span> <span class="n">build</span><span class="o">=</span><span class="n">create_build_from_storage_body</span>
 <span class="p">)</span>
@@ -692,7 +692,7 @@ is not idempotent.</p>
 parameters which allows you to dynamically determine values. The result is saved to <a class="reference external" href="/docs/apache-airflow/stable/concepts/xcoms.html#concepts-xcom" title="(in apache-airflow v2.3.0.dev0)"><span class="xref std std-ref">XCom</span></a>, which allows it
 to be used by other operators.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_build_from_storage_result</span> <span class="o">=</span> <span class="n">BashOperator</span><span class="p">(</span>
     <span class="n">bash_command</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;echo </span><span class="si">{</span> <span class="n">create_build_from_storage</span><span class="o">.</span><span class="n">output</span><span class="p">[</span><span class="s1">&#39;results&#39;</span><span class="p">]</span> <span class="si">}</span><span class="s2">&quot;</span><span class="p">,</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_build_from_storage_result&quot;</span><span class="p">,</span>
@@ -703,7 +703,7 @@ to be used by other operators.</p>
 <p>By default, after the build is created, it will wait for the build operation to complete. If there is no need to wait for completion,
 you can pass wait=False as shown in the example below.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_build_without_wait</span> <span class="o">=</span> <span class="n">CloudBuildCreateBuildOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_build_without_wait&quot;</span><span class="p">,</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span>
@@ -726,7 +726,7 @@ is not idempotent.</p>
 <p>Creates a new Cloud Build trigger with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildCreateBuildTriggerOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildCreateBuildTriggerOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildCreateBuildTriggerOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_build_trigger</span> <span class="o">=</span> <span class="n">CloudBuildCreateBuildTriggerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_build_trigger&quot;</span><span class="p">,</span> <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span> <span class="n">trigger</span><span class="o">=</span><span class="n">create_build_trigger_body</span>
 <span class="p">)</span>
@@ -749,7 +749,7 @@ to be used by other operators.</p>
 <p>Deletes a Cloud Build trigger with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildDeleteBuildTriggerOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildDeleteBuildTriggerOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildDeleteBuildTriggerOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_build_trigger</span> <span class="o">=</span> <span class="n">CloudBuildDeleteBuildTriggerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;delete_build_trigger&quot;</span><span class="p">,</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span>
@@ -774,7 +774,7 @@ to be used by other operators.</p>
 <p>Returns information about a previously requested build with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildGetBuildOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildGetBuildOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildGetBuildOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_build</span> <span class="o">=</span> <span class="n">CloudBuildGetBuildOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;get_build&quot;</span><span class="p">,</span>
     <span class="n">id_</span><span class="o">=</span><span class="n">retry_build</span><span class="o">.</span><span class="n">output</span><span class="p">[</span><span class="s1">&#39;id&#39;</span><span class="p">],</span>
@@ -799,7 +799,7 @@ to be used by other operators.</p>
 <p>Returns information about a Cloud Build trigger with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildGetBuildTriggerOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildGetBuildTriggerOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildGetBuildTriggerOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_build_trigger</span> <span class="o">=</span> <span class="n">CloudBuildGetBuildTriggerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;get_build_trigger&quot;</span><span class="p">,</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span>
@@ -824,7 +824,7 @@ to be used by other operators.</p>
 <p>Lists all the existing Cloud Build triggers with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildListBuildTriggersOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildListBuildTriggersOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildListBuildTriggersOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">list_build_triggers</span> <span class="o">=</span> <span class="n">CloudBuildListBuildTriggersOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;list_build_triggers&quot;</span><span class="p">,</span> <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span> <span class="n">location</span><span class="o">=</span><span class="s2">&quot;global&quot;</span><span class="p">,</span> <span class="n">page_size</span><span class="o">=</span><span class="mi">5</span>
 <span class="p">)</span>
@@ -847,7 +847,7 @@ to be used by other operators.</p>
 <p>Lists previously requested builds with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildListBuildsOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildListBuildsOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildListBuildsOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">list_builds</span> <span class="o">=</span> <span class="n">CloudBuildListBuildsOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;list_builds&quot;</span><span class="p">,</span> <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span> <span class="n">location</span><span class="o">=</span><span class="s2">&quot;global&quot;</span>
 <span class="p">)</span>
@@ -871,7 +871,7 @@ using the original build request, which may or may not result in an identical bu
 <p>Creates a new build based on the specified build with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildRetryBuildOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildRetryBuildOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildRetryBuildOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">retry_build</span> <span class="o">=</span> <span class="n">CloudBuildRetryBuildOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;retry_build&quot;</span><span class="p">,</span>
     <span class="n">id_</span><span class="o">=</span><span class="n">cancel_build</span><span class="o">.</span><span class="n">output</span><span class="p">[</span><span class="s1">&#39;id&#39;</span><span class="p">],</span>
@@ -896,7 +896,7 @@ to be used by other operators.</p>
 <p>Runs a trigger at a particular source revision with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildRunBuildTriggerOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildRunBuildTriggerOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildRunBuildTriggerOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">run_build_trigger</span> <span class="o">=</span> <span class="n">CloudBuildRunBuildTriggerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;run_build_trigger&quot;</span><span class="p">,</span>
     <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span>
@@ -922,7 +922,7 @@ to be used by other operators.</p>
 <p>Updates a Cloud Build trigger with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_build/index.html#airflow.providers.google.cloud.operators.cloud_build.CloudBuildUpdateBuildTriggerOperator" title="airflow.providers.google.cloud.operators.cloud_build.CloudBuildUpdateBuildTriggerOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudBuildUpdateBuildTriggerOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_build.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_build.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_build.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_build_trigger</span> <span class="o">=</span> <span class="n">CloudBuildCreateBuildTriggerOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create_build_trigger&quot;</span><span class="p">,</span> <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span><span class="p">,</span> <span class="n">trigger</span><span class="o">=</span><span class="n">create_build_trigger_body</span>
 <span class="p">)</span>
diff --git a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_memorystore.html b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_memorystore.html
index 4b31e3525f..7371f1365f 100644
--- a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_memorystore.html
+++ b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_memorystore.html
@@ -613,7 +613,7 @@ of managing complex Redis deployments.</p>
 presented as a compatible dictionary also.</p>
 <p>Here is an example of an instance:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">FIRST_INSTANCE</span> <span class="o">=</span> <span class="p">{</span><span class="s2">&quot;tier&quot;</span><span class="p">:</span> <span class="n">Instance</span><span class="o">.</span><span class="n">Tier</span><span class="o">.</span><span class="n">BASIC</span><span class="p">,</span> <span class="s2">&quot;memory_size_gb&quot;</span><span class="p">:</span> <span class="mi">1</spa [...]
 </pre></div>
 </div>
@@ -628,7 +628,7 @@ make a use of the service account listed under <code class="docutils literal not
 <p>You can use <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/gcs/index.html#airflow.providers.google.cloud.operators.gcs.GCSBucketCreateAclEntryOperator" title="airflow.providers.google.cloud.operators.gcs.GCSBucketCreateAclEntryOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">GCSBucketCreateAclEntryOperator</span></code></a>
 operator to set permissions.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">set_acl_permission</span> <span class="o">=</span> <span class="n">GCSBucketCreateAclEntryOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;gcs-set-acl-permission&quot;</span><span class="p">,</span>
     <span class="n">bucket</span><span class="o">=</span><span class="n">BUCKET_NAME</span><span class="p">,</span>
@@ -646,7 +646,7 @@ operator to set permissions.</p>
 <p>Creating an instance is performed with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreCreateInstanceOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreCreateInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreCreateInstanceOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_instance</span> <span class="o">=</span> <span class="n">CloudMemorystoreCreateInstanceOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create-instance&quot;</span><span class="p">,</span>
     <span class="n">location</span><span class="o">=</span><span class="s2">&quot;europe-north1&quot;</span><span class="p">,</span>
@@ -662,7 +662,7 @@ operator to set permissions.</p>
 parameters which allows you to dynamically determine values. The result is saved to <a class="reference external" href="/docs/apache-airflow/stable/concepts/xcoms.html#concepts-xcom" title="(in apache-airflow v2.3.0.dev0)"><span class="xref std std-ref">XCom</span></a>, which allows it
 to be used by other operators.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_instance_result</span> <span class="o">=</span> <span class="n">BashOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create-instance-result&quot;</span><span class="p">,</span>
     <span class="n">bash_command</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;echo </span><span class="si">{</span><span class="n">create_instance</span><span class="o">.</span><span class="n">output</span><span class="si">}</span><span class="s2">&quot;</span><span class="p">,</span>
@@ -676,7 +676,7 @@ to be used by other operators.</p>
 <p>Deleting an instance is performed with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreDeleteInstanceOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreDeleteInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreDeleteInstanceOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">delete_instance</span> <span class="o">=</span> <span class="n">CloudMemorystoreDeleteInstanceOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;delete-instance&quot;</span><span class="p">,</span>
     <span class="n">location</span><span class="o">=</span><span class="s2">&quot;europe-north1&quot;</span><span class="p">,</span>
@@ -695,7 +695,7 @@ parameters which allows you to dynamically determine values.</p>
 <p>Exporting an instance is performed with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreExportInstanceOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreExportInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreExportInstanceOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">export_instance</span> <span class="o">=</span> <span class="n">CloudMemorystoreExportInstanceOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;export-instance&quot;</span><span class="p">,</span>
     <span class="n">location</span><span class="o">=</span><span class="s2">&quot;europe-north1&quot;</span><span class="p">,</span>
@@ -715,7 +715,7 @@ parameters which allows you to dynamically determine values.</p>
 <p>Failing over an instance is performed with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreFailoverInstanceOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreFailoverInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreFailoverInstanceOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">failover_instance</span> <span class="o">=</span> <span class="n">CloudMemorystoreFailoverInstanceOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;failover-instance&quot;</span><span class="p">,</span>
     <span class="n">location</span><span class="o">=</span><span class="s2">&quot;europe-north1&quot;</span><span class="p">,</span>
@@ -737,7 +737,7 @@ parameters which allows you to dynamically determine values.</p>
 <p>Getting an instance is performed with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreGetInstanceOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreGetInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreGetInstanceOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">get_instance</span> <span class="o">=</span> <span class="n">CloudMemorystoreGetInstanceOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;get-instance&quot;</span><span class="p">,</span>
     <span class="n">location</span><span class="o">=</span><span class="s2">&quot;europe-north1&quot;</span><span class="p">,</span>
@@ -757,7 +757,7 @@ parameters which allows you to dynamically determine values.</p>
 <p>Importing an instance is performed with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreImportOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreImportOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreImportOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">import_instance</span> <span class="o">=</span> <span class="n">CloudMemorystoreImportOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;import-instance&quot;</span><span class="p">,</span>
     <span class="n">location</span><span class="o">=</span><span class="s2">&quot;europe-north1&quot;</span><span class="p">,</span>
@@ -777,7 +777,7 @@ parameters which allows you to dynamically determine values.</p>
 <p>Listing instances is performed with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreListInstancesOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreListInstancesOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreListInstancesOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">list_instances</span> <span class="o">=</span> <span class="n">CloudMemorystoreListInstancesOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;list-instances&quot;</span><span class="p">,</span> <span class="n">location</span><span class="o">=</span><span class="s2">&quot;-&quot;</span><span class="p">,</span> <span class="n">page_size</span><span class="o">=</span><span class="mi">100</span><span class="p">,</span> <span class="n">project_id</span><span class="o">=</span><span class="n">GCP_PROJECT_ID</span>
 <span class="p">)</span>
@@ -789,7 +789,7 @@ parameters which allows you to dynamically determine values.</p>
 parameters which allow you to dynamically determine values. The result is saved to <a class="reference external" href="/docs/apache-airflow/stable/concepts/xcoms.html#concepts-xcom" title="(in apache-airflow v2.3.0.dev0)"><span class="xref std std-ref">XCom</span></a>, which allows it
 to be used by other operators.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">list_instances_result</span> <span class="o">=</span> <span class="n">BashOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;list-instances-result&quot;</span><span class="p">,</span> <span class="n">bash_command</span><span class="o">=</span><span class="sa">f</span><span class="s2">&quot;echo </span><span class="si">{</span><span class="n">get_instance</span><span class="o">.</span><span class="n">output</span><span class="si">}</span><span class="s2">&quot;</span>
 <span class="p">)</span>
@@ -802,7 +802,7 @@ to be used by other operators.</p>
 <p>Updating an instance is performed with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreUpdateInstanceOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreUpdateInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreUpdateInstanceOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">update_instance</span> <span class="o">=</span> <span class="n">CloudMemorystoreUpdateInstanceOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;update-instance&quot;</span><span class="p">,</span>
     <span class="n">location</span><span class="o">=</span><span class="s2">&quot;europe-north1&quot;</span><span class="p">,</span>
@@ -823,7 +823,7 @@ parameters which allows you to dynamically determine values.</p>
 <p>Scaling an instance is performed with the
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreScaleInstanceOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreScaleInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreScaleInstanceOperator</span></code></a> operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">scale_instance</span> <span class="o">=</span> <span class="n">CloudMemorystoreScaleInstanceOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;scale-instance&quot;</span><span class="p">,</span>
     <span class="n">location</span><span class="o">=</span><span class="s2">&quot;europe-north1&quot;</span><span class="p">,</span>
diff --git a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_memorystore_memcached.html b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_memorystore_memcached.html
index 5ea9f6387c..d8e7ff9e6e 100644
--- a/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_memorystore_memcached.html
+++ b/docs-archive/apache-airflow-providers-google/6.3.0/operators/cloud/cloud_memorystore_memcached.html
@@ -630,7 +630,7 @@ Memcached deployments.</p>
 The object can also be presented as a compatible dictionary.</p>
 <p>Here is an example of an instance:</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">MEMCACHED_INSTANCE</span> <span class="o">=</span> <span class="p">{</span><span class="s2">&quot;name&quot;</span><span class="p">:</span> <span class="s2">&quot;&quot;</span><span class="p">,</span> <span class="s2">&quot;node_count&quot;</span><span class="p">:</span> <span class="mi">1</span><span class="p">,</span> <span class="s2">&quot;node_config&quot;</span><span class="p">:</span> [...]
 </pre></div>
 </div>
@@ -642,7 +642,7 @@ The object can be presented as a compatible dictionary also.</p>
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreMemcachedCreateInstanceOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreMemcachedCreateInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreMemcachedCreateInstanceOperator</span></code></a>
 operator.</p>
 <div class="example-block-wrapper docutils container">
-<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="../../_modules/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.html"><span class="viewcode-link">[source]</span></a></p>
+<p class="example-header example-header--with-button"><span class="example-title">airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py</span><a class="example-header-button viewcode-button reference internal" href="https://github.com/apache/airflow/tree/providers-google/6.3.0/airflow/providers/google/cloud/example_dags/example_cloud_memorystore.py" target="_blank"><span class="viewcode-link">[source]</span></a></p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">create_memcached_instance</span> <span class="o">=</span> <span class="n">CloudMemorystoreMemcachedCreateInstanceOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s2">&quot;create-instance&quot;</span><span class="p">,</span>
     <span class="n">location</span><span class="o">=</span><span class="s2">&quot;europe-north1&quot;</span><span class="p">,</span>
@@ -660,7 +660,7 @@ operator.</p>
 <a class="reference internal" href="../../_api/airflow/providers/google/cloud/operators/cloud_memorystore/index.html#airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreMemcachedDeleteInstanceOperator" title="airflow.providers.google.cloud.operators.cloud_memorystore.CloudMemorystoreMemcachedDeleteInstanceOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">CloudMemorystoreMemcachedDeleteInstanceOperator</span></code></a>
 operator.</p>
... 42757 lines suppressed ...