Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/01/13 05:47:52 UTC

[GitHub] [airflow] hamsapriyak edited a comment on issue #11983: airflow.contrib.operators.pubsub_operator.PubSubPublishOperator can't handle multiple messages in a template

hamsapriyak edited a comment on issue #11983:
URL: https://github.com/apache/airflow/issues/11983#issuecomment-758799172


   It is important that render_templates(context) returns **Python native types**. Without this, even when suitable Airflow operators are available, we end up writing additional custom operators just to handle the conversion.
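   
   To illustrate (the values here are hypothetical), the rendered template arrives as a plain string rather than the list the operator needs:
   ```python
   # Hypothetical illustration of what a templated field receives after Jinja rendering
   rendered = "['file_1', 'file_2', 'file_3']"   # str produced by templating
   native = ['file_1', 'file_2', 'file_3']       # the Python list the operator actually needs
   print(type(rendered), type(native))           # <class 'str'> <class 'list'>
   ```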
   
   
   **E.g.:** We need to delete files **['file_1', 'file_2', 'file_3']** from an S3 bucket. The file list is passed from a previous task (get_files) via XCom. Although S3DeleteObjectsOperator accepts a list of keys, templating turns the value into the string **"['file_1', 'file_2', 'file_3']"**. To handle this we have to create a custom operator that receives the keys via the context and calls the S3 hook directly (a sketch of such an operator follows the log output below).
   
   
   **Sample Code using templating:**
   ```python
   # Imports for Airflow 1.10; in Airflow 2 these operators live in the Amazon provider package.
   from airflow.contrib.operators.s3_list_operator import S3ListOperator
   from airflow.contrib.operators.s3_delete_objects_operator import S3DeleteObjectsOperator

   # `dag` is assumed to be an existing DAG object defined elsewhere in the file.
   get_files = S3ListOperator(
       task_id='get_files',
       bucket='source_bucket',
       dag=dag
   )

   delete_files = S3DeleteObjectsOperator(
       task_id='delete_files',
       bucket='source_bucket',
       keys="{{ task_instance.xcom_pull(task_ids='get_files') }}",
       dag=dag
   )

   get_files >> delete_files
   ```
   
   
   **Log output:**
   
   {s3_delete_objects_operator.py:83} INFO - Deleted: ["['file_1', 'file_2', 'file_3']"]
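   
   For reference, here is a minimal sketch of the kind of custom operator described above. It assumes Airflow 1.10-style imports and the S3Hook.delete_objects(bucket, keys) method that S3DeleteObjectsOperator uses internally; the class name, the source_task_id argument, and the other names are hypothetical, not an existing operator:
   ```python
   # Minimal sketch (hypothetical names), assuming Airflow 1.10-style imports
   # and S3Hook.delete_objects(bucket, keys).
   from airflow.hooks.S3_hook import S3Hook
   from airflow.models import BaseOperator
   from airflow.utils.decorators import apply_defaults


   class S3DeleteXComKeysOperator(BaseOperator):
       """Delete the keys that an upstream task pushed to XCom as a Python list."""

       @apply_defaults
       def __init__(self, bucket, source_task_id, aws_conn_id='aws_default', *args, **kwargs):
           super(S3DeleteXComKeysOperator, self).__init__(*args, **kwargs)
           self.bucket = bucket
           self.source_task_id = source_task_id
           self.aws_conn_id = aws_conn_id

       def execute(self, context):
           # Pull the list directly from XCom instead of rendering it through a template,
           # so it stays a native Python list rather than being cast to a string.
           keys = context['task_instance'].xcom_pull(task_ids=self.source_task_id)
           hook = S3Hook(aws_conn_id=self.aws_conn_id)
           hook.delete_objects(bucket=self.bucket, keys=keys)
   ```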
   

