Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/06/06 17:10:44 UTC

[GitHub] [airflow] o-nikolas commented on a diff in pull request #24239: AIP-47 - Migrate redshift DAGs to new design #22438

o-nikolas commented on code in PR #24239:
URL: https://github.com/apache/airflow/pull/24239#discussion_r890355257


##########
tests/system/providers/amazon/aws/redshift/example_redshift_data_execute_sql.py:
##########
@@ -22,7 +22,11 @@
 from airflow.decorators import task
 from airflow.providers.amazon.aws.hooks.redshift_data import RedshiftDataHook
 from airflow.providers.amazon.aws.operators.redshift_data import RedshiftDataOperator
+from airflow.utils.trigger_rule import TriggerRule
+from tests.system.providers.amazon.aws.utils import set_env_id
 
+ENV_ID = set_env_id()
+DAG_ID = 'example_redshift_cluster'
 REDSHIFT_CLUSTER_IDENTIFIER = getenv("REDSHIFT_CLUSTER_IDENTIFIER", "redshift_cluster_identifier")
 REDSHIFT_DATABASE = getenv("REDSHIFT_DATABASE", "redshift_database")
 REDSHIFT_DATABASE_USER = getenv("REDSHIFT_DATABASE_USER", "awsuser")

Review Comment:
   Please create these resources during a test setup phase (using the env id to keep resource identifiers unique). If any resources are unreasonable to create in the test, then please use [fetch_variable](https://github.com/ferruzzi/airflow/blob/18896156aac8c62c3fe098b546ee601ff386da98/tests/system/providers/amazon/aws/utils/__init__.py#L105) instead of getenv (this allows test configuration to live in SSM Parameter Store for easier test automation).
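
   For illustration, a rough sketch of that pattern in the example module (assuming `fetch_variable` accepts a key and an optional default, as the linked helper suggests; the resource names and defaults here are placeholders only, not the final implementation):

   ```python
   from tests.system.providers.amazon.aws.utils import fetch_variable, set_env_id

   ENV_ID = set_env_id()

   # Configuration that the test cannot reasonably create itself is looked up via
   # fetch_variable (backed by SSM Parameter Store) rather than getenv.
   REDSHIFT_DATABASE = fetch_variable("REDSHIFT_DATABASE", "redshift_database")
   REDSHIFT_DATABASE_USER = fetch_variable("REDSHIFT_DATABASE_USER", "awsuser")

   # Resources the test can create should instead be provisioned in a setup task,
   # using ENV_ID to keep identifiers unique across concurrent test runs, e.g.:
   REDSHIFT_CLUSTER_IDENTIFIER = f"{ENV_ID}-redshift-cluster"
   ```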



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org