Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/01/31 21:00:13 UTC

[GitHub] [airflow] mik-laj commented on a change in pull request #7307: [AIRFLOW-6676] added GCSDeleteBucketOperator

mik-laj commented on a change in pull request #7307: [AIRFLOW-6676] added GCSDeleteBucketOperator
URL: https://github.com/apache/airflow/pull/7307#discussion_r373682965
 
 

 ##########
 File path: airflow/providers/google/cloud/example_dags/example_gcs.py
 ##########
 @@ -127,7 +128,32 @@
         task_id="delete_files", bucket_name=BUCKET_1, objects=[BUCKET_FILE_LOCATION]
     )
 
+    # [START howto_operator_gcs_delete_bucket]
 +    delete_bucket_1 = GCSDeleteBucketOperator(task_id="delete_bucket_1", bucket_name=BUCKET_1)
 +    delete_bucket_2 = GCSDeleteBucketOperator(task_id="delete_bucket_2", bucket_name=BUCKET_2)
+    # [END howto_operator_gcs_delete_bucket]
+
     [create_bucket1, create_bucket2] >> list_buckets >> list_buckets_result
     [create_bucket1, create_bucket2] >> upload_file
     upload_file >> [download_file, copy_file]
     upload_file >> gcs_bucket_create_acl_entry_task >> gcs_object_create_acl_entry_task >> delete_files
+
+    create_bucket1 >> delete_bucket_1
+    create_bucket2 >> delete_bucket_2
+    create_bucket2 >> copy_file
+    create_bucket1 >> copy_file
+    list_buckets >> delete_bucket_1
+    upload_file >> delete_bucket_1
+    create_bucket1 >> upload_file >> delete_bucket_1
+    transform_file >> delete_bucket_1
+    gcs_bucket_create_acl_entry_task >> delete_bucket_1
+    gcs_object_create_acl_entry_task >> delete_bucket_1
+    download_file >> delete_bucket_1
+    copy_file >> delete_bucket_1
+    copy_file >> delete_bucket_2
+    delete_files >> delete_bucket_1
 
 Review comment:
  The previous version was incorrect in a few cases. This could be simplified, because some of the dependencies are already implied by others, but the current version makes the relationships between the tasks easier to follow. Note that in Airflow we only declare the dependencies; Airflow can detect the redundancy and optimize it automatically. If there is something the computer can do, the computer should do it.
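
  As a minimal sketch, reusing the task names from this file: dependencies are transitive, so the chain below already implies the two single edges, and they would not need to be declared separately.

  ```python
  # Sketch only, referring to tasks already defined in example_gcs.py.
  # This one chain:
  create_bucket1 >> upload_file >> delete_bucket_1
  # already implies both of the following edges, so declaring them again is redundant:
  #   create_bucket1 >> delete_bucket_1
  #   upload_file >> delete_bucket_1
  ```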
   
  I'm worried about the task `transform_file`, which has no upstream dependencies, so it can be executed before the bucket is created.
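
  One possible fix, as a sketch (assuming `transform_file` reads the object created by `upload_file`), would be to give it an upstream dependency:

  ```python
  # Sketch only: make transform_file wait for the upload, and keep the bucket
  # alive until the transformation has finished.
  upload_file >> transform_file >> delete_bucket_1
  ```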
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services