Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/08/24 10:25:16 UTC

[GitHub] [airflow] vara-bonthu opened a new pull request, #25931: Added Doc - Airflow remote logging with S3 and IRSA

vara-bonthu opened a new pull request, #25931:
URL: https://github.com/apache/airflow/pull/25931

   <!--
   Thank you for contributing! Please make sure that your code changes
   are covered with tests. And in case of new features or big changes
   remember to adjust the documentation.
   
   Feel free to ping committers for the review!
   
   In case of an existing issue, reference it using one of the following:
   
   closes: #25322
   related: #ISSUE
   
   How to write a good git commit message:
   http://chris.beams.io/posts/git-commit/
   -->
   
   ## Description
   This doc provides detailed instructions for configuring Airflow remote logging to Amazon S3 using IAM Roles for Service Accounts (IRSA). It also provides steps to create the IRSA role with `eksctl` and references a fully production-ready, self-managed Apache Airflow Terraform deployment with Helm.
   
   This is my first PR, and I am looking forward to contributing more :)
   
   Thanks for your help
   
   ## Motivation
   I am using the latest Helm Chart version (1.6.0) to deploy Airflow on **Amazon EKS** and trying to configure **S3 for logging**. I found a few docs that explain how to add logging variables through `values.yaml`, but couldn't find enough information on setting up remote logging with S3 using IAM Roles for Service Accounts (IRSA). Hence I decided to contribute the instructions I followed to set this up successfully.
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   




[GitHub] [airflow] Taragolis commented on a diff in pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
Taragolis commented on code in PR #25931:
URL: https://github.com/apache/airflow/pull/25931#discussion_r954748912


##########
docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst:
##########
@@ -47,3 +47,86 @@ You can also use `LocalStack <https://localstack.cloud/>`_ to emulate Amazon S3
 To configure it, you must additionally set the endpoint url to point to your local stack.
 You can do this via the Connection Extra ``host`` field.
 For example, ``{"host": "http://localstack:4572"}``
+
+Enabling remote logging for Amazon S3 with AWS IRSA
+'''''''''''''''''''''''''''''''''''''''''''''''''''
+`IRSA <https://docs.aws.amazon.com/eks/latest/userguide/iam-roles-for-service-accounts.html>`_ is a feature that allows you to assign an IAM role to a Kubernetes service account.
+It works by leveraging a `Kubernetes <https://kubernetes.io/>`_ feature known as `Service Account <https://kubernetes.io/docs/tasks/configure-pod-container/configure-service-account/>`_ Token Volume Projection.
+When Pods are configured with a Service Account that references an `IAM Role <https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html>`_, the Kubernetes API server will call the public OIDC discovery endpoint for the cluster on startup. When an AWS API is invoked, the AWS SDK calls ``sts:AssumeRoleWithWebIdentity``. IAM exchanges the Kubernetes-issued token for temporary AWS role credentials after validating the token's signature.
+
+It's a recommended best practice to use IAM Roles for Service Accounts to access AWS services (e.g., S3) from Amazon EKS.
+The steps below guide you through creating a new IAM role with a Service Account and using it with the Airflow Webserver and Worker (Kubernetes Executor) Pods.
+
+Step 1: Create IAM role for service account (IRSA)
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+This step creates an IAM role and service account using `eksctl <https://eksctl.io/>`_.
+Note that this example attaches a managed policy with full S3 permissions to the IAM role; this is intended for testing purposes only.
+We highly recommend creating a restricted S3 IAM policy and using it with ``--attach-policy-arn``.
+
+Alternatively, you can use other IaC tools like Terraform. For deploying Airflow with Terraform, including IRSA, check out this `example <https://github.com/aws-ia/terraform-aws-eks-blueprints/tree/main/examples/analytics/airflow-on-eks>`_.
+
+Execute the following command by providing all the necessary inputs.
+
+.. code-block:: bash
+
+    eksctl create iamserviceaccount --cluster="<EKS_CLUSTER_ID>" --name="<SERVICE_ACCOUNT_NAME>" --namespace="<NAMESPACE>" --attach-policy-arn="<IAM_POLICY_ARN>" --approve
+
+Example with sample inputs
+
+.. code-block:: bash
+
+    eksctl create iamserviceaccount --cluster=airflow-eks-cluster --name=airflow-sa --namespace=airflow --attach-policy-arn=arn:aws:iam::aws:policy/AmazonS3FullAccess --approve
+
+
+Step 2: Update Helm Chart ``values.yaml`` with Service Account
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+This step uses the `Airflow Helm Chart <https://github.com/apache/airflow/tree/main/chart>`_ deployment.
+If you are deploying Airflow with the Helm Chart, you can modify ``values.yaml`` as shown below.
+Add the Service Account (e.g., ``airflow-sa``) created in Step 1 to the Helm Chart ``values.yaml`` under the following sections.
+We are reusing the existing ``serviceAccount``, hence ``create: false`` and the existing name ``name: airflow-sa``.
+
+
+.. code-block:: yaml
+
+    workers:
+      serviceAccount:
+        create: false
+        name: airflow-sa
+        # The annotation is added to the serviceAccount automatically in Step 1, so you don't need to specify it here. It is shown for informational purposes only.
+        annotations:
+          eks.amazonaws.com/role-arn: <ENTER_IAM_ROLE_ARN_CREATED_BY_EKSCTL_COMMAND>
+
+    webserver:
+      serviceAccount:
+        create: false
+        name: airflow-sa
+        # The annotation is added to the serviceAccount automatically in Step 1, so you don't need to specify it here. It is shown for informational purposes only.
+        annotations:
+          eks.amazonaws.com/role-arn: <ENTER_IAM_ROLE_ARN_CREATED_BY_EKSCTL_COMMAND>
+
+    config:
+      logging:
+        remote_logging: 'True'
+        logging_level: 'INFO'
+        remote_base_log_folder: 's3://<ENTER_YOUR_BUCKET_NAME>/<FOLDER_PATH>' # Specify the S3 bucket used for logging
+        remote_log_conn_id: 'aws_s3_conn' # Note that this name is used in Step 3 when creating the connection through the Airflow UI
+        delete_worker_pods: 'False'
+        encrypt_s3_logs: 'True'
+
+Step 3: Create Amazon S3 connection in Airflow Web UI
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+With the above configuration, the Webserver and Worker Pods can access the Amazon S3 bucket and write logs without using any access key, secret key, or instance profile credentials.
+
+The final step is to create a connection in the Airflow UI before executing the DAGs.
+
+* Log in to the Airflow Web UI with ``admin`` credentials and navigate to ``Admin -> Connections``
+* Create a connection for ``S3`` and select the options (Connection ID and Connection Type) as shown in the image.

Review Comment:
   S3Hook, as well as all hooks based on AwsBaseHook, is intended to use the [Amazon Web Services Connection](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/connections/aws.html).
   
   It could use the Amazon S3 Connection (not documented, kept for historical reasons?), but in that case users get a UserWarning that the connection has an incorrect connection type.
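   
   For reference, with IRSA in place no credentials need to be stored in the connection itself. A rough sketch of creating such a connection from the Airflow CLI (the connection id `aws_s3_conn` matches the one used in the doc above; the region value is only an illustrative assumption):
   
   ```shell
   # Hypothetical sketch: create an Amazon Web Services connection that stores no
   # access keys, so the S3 task handler relies on the IRSA web-identity credentials.
   airflow connections add aws_s3_conn \
       --conn-type aws \
       --conn-extra '{"region_name": "eu-west-1"}'
   ```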



##########
docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst:
##########
@@ -47,3 +47,86 @@ You can also use `LocalStack <https://localstack.cloud/>`_ to emulate Amazon S3
 To configure it, you must additionally set the endpoint url to point to your local stack.
 You can do this via the Connection Extra ``host`` field.
 For example, ``{"host": "http://localstack:4572"}``
+
+Enabling remote logging for Amazon S3 with AWS IRSA
+'''''''''''''''''''''''''''''''''''''''''''''''''''

Review Comment:
   [Using IAM Roles for Service Accounts (IRSA) on EKS](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/connections/aws.html#using-iam-roles-for-service-accounts-irsa-on-eks) already contains information about IRSA on EKS on the Connection page.
   
   Might it be better to add the general configuration to the connection docs rather than only to the S3 logging docs?
   This configuration might be useful not only for S3 logging but also for secrets backends and CloudWatch logging.
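   
   As an illustration of that point (not part of this PR), the same IRSA-annotated service account could also back an AWS secrets backend configured through the chart's `config` passthrough. A minimal sketch, assuming the Amazon provider's Secrets Manager backend class and an example `connections_prefix`:
   
   ```yaml
   # Hypothetical sketch: with IRSA in place, the AWS Secrets Manager backend
   # also needs no static credentials.
   config:
     secrets:
       backend: airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
       backend_kwargs: '{"connections_prefix": "airflow/connections"}'
   ```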





[GitHub] [airflow] vara-bonthu commented on pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
vara-bonthu commented on PR #25931:
URL: https://github.com/apache/airflow/pull/25931#issuecomment-1269751338

   Apologies for the delay! Branch is rebased now.
   I have issues setting up breeze to run the docs build locally. I will update the PR by tomorrow once these are resolved.




[GitHub] [airflow] potiuk merged pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
potiuk merged PR #25931:
URL: https://github.com/apache/airflow/pull/25931




[GitHub] [airflow] boring-cyborg[bot] commented on pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
boring-cyborg[bot] commented on PR #25931:
URL: https://github.com/apache/airflow/pull/25931#issuecomment-1272972968

   Awesome work, congrats on your first merged pull request!
   




[GitHub] [airflow] potiuk commented on pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #25931:
URL: https://github.com/apache/airflow/pull/25931#issuecomment-1272972734

   Cool. Thanks @vara-bonthu !




[GitHub] [airflow] potiuk commented on pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #25931:
URL: https://github.com/apache/airflow/pull/25931#issuecomment-1229065078

   Approved pending addressing the points raised by @Taragolis 




[GitHub] [airflow] Taragolis commented on pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
Taragolis commented on PR #25931:
URL: https://github.com/apache/airflow/pull/25931#issuecomment-1234370434

   @vara-bonthu 
   `docs/apache-airflow-providers-amazon/img/aws-base-conn-airflow.png` - this image contains `{"region": "eu-west-1"}` rather than `{"region_name": "eu-west-1"}`
   
   Also there are some format/spelling errors (IRSA, IaC and eksctl are not in `docs/spelling_wordlist.txt`); you could try [building the documentation locally](https://github.com/apache/airflow/blob/main/BREEZE.rst#building-the-documentation):
   
   ```shell
   breeze build-docs --package-filter apache-airflow-providers-amazon
   ```




[GitHub] [airflow] potiuk commented on pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
potiuk commented on PR #25931:
URL: https://github.com/apache/airflow/pull/25931#issuecomment-1250386390

   Static checks/docs are still failing, @vara-bonthu. This needs a rebase and fixing.




[GitHub] [airflow] vara-bonthu commented on a diff in pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
vara-bonthu commented on code in PR #25931:
URL: https://github.com/apache/airflow/pull/25931#discussion_r957164369


##########
docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst:
##########
@@ -47,3 +47,86 @@ You can also use `LocalStack <https://localstack.cloud/>`_ to emulate Amazon S3
 To configure it, you must additionally set the endpoint url to point to your local stack.
 You can do this via the Connection Extra ``host`` field.
 For example, ``{"host": "http://localstack:4572"}``
+
+Enabling remote logging for Amazon S3 with AWS IRSA
+'''''''''''''''''''''''''''''''''''''''''''''''''''

Review Comment:
   I have updated the aws.rst file as well with general config





[GitHub] [airflow] eladkal commented on pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
eladkal commented on PR #25931:
URL: https://github.com/apache/airflow/pull/25931#issuecomment-1269435132

   @vara-bonthu can you please rebase and fix the tests?




[GitHub] [airflow] vara-bonthu commented on pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
vara-bonthu commented on PR #25931:
URL: https://github.com/apache/airflow/pull/25931#issuecomment-1272496497

   @potiuk @Taragolis @eladkal Thanks for your review. I have updated the PR with all the changes and the checks are passing now.




[GitHub] [airflow] vara-bonthu commented on a diff in pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
vara-bonthu commented on code in PR #25931:
URL: https://github.com/apache/airflow/pull/25931#discussion_r956701340


##########
docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst:
##########
@@ -47,3 +47,86 @@ You can also use `LocalStack <https://localstack.cloud/>`_ to emulate Amazon S3
 To configure it, you must additionally set the endpoint url to point to your local stack.
 You can do this via the Connection Extra ``host`` field.
 For example, ``{"host": "http://localstack:4572"}``
+
+Enabling remote logging for Amazon S3 with AWS IRSA
+'''''''''''''''''''''''''''''''''''''''''''''''''''

Review Comment:
   100% agree on adding IRSA configuration to [this page](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/connections/aws.html#using-iam-roles-for-service-accounts-irsa-on-eks).
   
   I will update the page
   





[GitHub] [airflow] vara-bonthu commented on a diff in pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
vara-bonthu commented on code in PR #25931:
URL: https://github.com/apache/airflow/pull/25931#discussion_r956719047


##########
docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst:
##########
@@ -47,3 +47,86 @@ You can also use `LocalStack <https://localstack.cloud/>`_ to emulate Amazon S3
 To configure it, you must additionally set the endpoint url to point to your local stack.
 You can do this via the Connection Extra ``host`` field.
 For example, ``{"host": "http://localstack:4572"}``
+
+Enabling remote logging for Amazon S3 with AWS IRSA
+'''''''''''''''''''''''''''''''''''''''''''''''''''
+`IRSA <https://docs.aws.amazon.com/eks/latest/userguide/iam-roles-for-service-accounts.html>`_ is a feature that allows you to assign an IAM role to a Kubernetes service account.
+It works by leveraging a `Kubernetes <https://kubernetes.io/>`_ feature known as `Service Account <https://kubernetes.io/docs/tasks/configure-pod-container/configure-service-account/>`_ Token Volume Projection.
+When Pods are configured with a Service Account that references an `IAM Role <https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html>`_, the Kubernetes API server will call the public OIDC discovery endpoint for the cluster on startup. When an AWS API is invoked, the AWS SDK calls ``sts:AssumeRoleWithWebIdentity``. IAM exchanges the Kubernetes-issued token for temporary AWS role credentials after validating the token's signature.
+
+It's a recommended best practice to use IAM Roles for Service Accounts to access AWS services (e.g., S3) from Amazon EKS.
+The steps below guide you through creating a new IAM role with a Service Account and using it with the Airflow Webserver and Worker (Kubernetes Executor) Pods.
+
+Step 1: Create IAM role for service account (IRSA)
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+This step creates an IAM role and service account using `eksctl <https://eksctl.io/>`_.
+Note that this example attaches a managed policy with full S3 permissions to the IAM role; this is intended for testing purposes only.
+We highly recommend creating a restricted S3 IAM policy and using it with ``--attach-policy-arn``.
+
+Alternatively, you can use other IaC tools like Terraform. For deploying Airflow with Terraform, including IRSA, check out this `example <https://github.com/aws-ia/terraform-aws-eks-blueprints/tree/main/examples/analytics/airflow-on-eks>`_.
+
+Execute the following command by providing all the necessary inputs.
+
+.. code-block:: bash
+
+    eksctl create iamserviceaccount --cluster="<EKS_CLUSTER_ID>" --name="<SERVICE_ACCOUNT_NAME>" --namespace="<NAMESPACE>" --attach-policy-arn="<IAM_POLICY_ARN>" --approve
+
+Example with sample inputs
+
+.. code-block:: bash
+
+    eksctl create iamserviceaccount --cluster=airflow-eks-cluster --name=airflow-sa --namespace=airflow --attach-policy-arn=arn:aws:iam::aws:policy/AmazonS3FullAccess --approve
+
+
+Step 2: Update Helm Chart ``values.yaml`` with Service Account
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+This step uses the `Airflow Helm Chart <https://github.com/apache/airflow/tree/main/chart>`_ deployment.
+If you are deploying Airflow with the Helm Chart, you can modify ``values.yaml`` as shown below.
+Add the Service Account (e.g., ``airflow-sa``) created in Step 1 to the Helm Chart ``values.yaml`` under the following sections.
+We are reusing the existing ``serviceAccount``, hence ``create: false`` and the existing name ``name: airflow-sa``.
+
+
+.. code-block:: yaml
+
+    workers:
+      serviceAccount:
+        create: false
+        name: airflow-sa
+        # The annotation is added to the serviceAccount automatically in Step 1, so you don't need to specify it here. It is shown for informational purposes only.
+        annotations:
+          eks.amazonaws.com/role-arn: <ENTER_IAM_ROLE_ARN_CREATED_BY_EKSCTL_COMMAND>
+
+    webserver:
+      serviceAccount:
+        create: false
+        name: airflow-sa
+        # The annotation is added to the serviceAccount automatically in Step 1, so you don't need to specify it here. It is shown for informational purposes only.
+        annotations:
+          eks.amazonaws.com/role-arn: <ENTER_IAM_ROLE_ARN_CREATED_BY_EKSCTL_COMMAND>
+
+    config:
+      logging:
+        remote_logging: 'True'
+        logging_level: 'INFO'
+        remote_base_log_folder: 's3://<ENTER_YOUR_BUCKET_NAME>/<FOLDER_PATH>' # Specify the S3 bucket used for logging
+        remote_log_conn_id: 'aws_s3_conn' # Note that this name is used in Step 3 when creating the connection through the Airflow UI
+        delete_worker_pods: 'False'
+        encrypt_s3_logs: 'True'
+
+Step 3: Create Amazon S3 connection in Airflow Web UI
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+With the above configuration, the Webserver and Worker Pods can access the Amazon S3 bucket and write logs without using any access key, secret key, or instance profile credentials.
+
+The final step is to create a connection in the Airflow UI before executing the DAGs.
+
+* Log in to the Airflow Web UI with ``admin`` credentials and navigate to ``Admin -> Connections``
+* Create a connection for ``S3`` and select the options (Connection ID and Connection Type) as shown in the image.

Review Comment:
   Thanks! It does make sense to use the AWS base connection. I have updated the instructions and the image to reflect the Amazon Web Services base connection.





[GitHub] [airflow] boring-cyborg[bot] commented on pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
boring-cyborg[bot] commented on PR #25931:
URL: https://github.com/apache/airflow/pull/25931#issuecomment-1225527322

   Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, mypy and type annotations). Our [pre-commits]( https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in `docs/` directory). Adding a new operator? Check this short [guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst). Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for testing locally, it's a heavy docker but it ships with a working Airflow and a lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
   - Please follow [ASF Code of Conduct](https://www.apache.org/foundation/policies/conduct) for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
   - Be sure to read the [Airflow Coding style]( https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it better 🚀.
   In case of doubts contact the developers at:
   Mailing List: dev@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   




[GitHub] [airflow] vara-bonthu commented on pull request #25931: Added Doc - Airflow remote logging with S3 and IRSA

Posted by GitBox <gi...@apache.org>.
vara-bonthu commented on PR #25931:
URL: https://github.com/apache/airflow/pull/25931#issuecomment-1234311137

   @Taragolis Could you please review the changes to see if this is good to go? Thanks


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org