Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/08/20 18:54:59 UTC

[GitHub] [airflow] denimalpaca opened a new pull request #17741: Add Snowflake DQ Operators

denimalpaca opened a new pull request #17741:
URL: https://github.com/apache/airflow/pull/17741


   Add three new Snowflake operators based on SQL Checks
   
   The SnowflakeCheckOperator, SnowflakeValueCheckOperator, and
   SnowflakeIntervalCheckOperator are added as subclasses of their respective
   SQL check operators. These additions follow the conventions set by the
   BigQuery operators, which subclass the same SQL check operators.
   
   closes: #17694
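   
   As a rough illustration (not part of this diff), the new operators could be
   used in a DAG like the sketch below; the table and column names are made up,
   and a configured `snowflake_default` connection is assumed:
   
       from datetime import datetime
       
       from airflow import DAG
       from airflow.providers.snowflake.operators.snowflake import (
           SnowflakeCheckOperator,
           SnowflakeValueCheckOperator,
       )
       
       with DAG(
           dag_id="snowflake_dq_example",
           start_date=datetime(2021, 8, 1),
           schedule_interval=None,
       ) as dag:
           # Fails if any value in the first result row bool-casts to False
           # (e.g. a zero row count).
           count_check = SnowflakeCheckOperator(
               task_id="count_check",
               sql="SELECT COUNT(*) FROM my_table",
               snowflake_conn_id="snowflake_default",
           )
       
           # Fails if the single returned value is outside
           # pass_value +/- tolerance (here 100 +/- 10%).
           value_check = SnowflakeValueCheckOperator(
               task_id="value_check",
               sql="SELECT AVG(order_total) FROM my_table",
               pass_value=100,
               tolerance=0.1,
           )
       
           count_check >> value_check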


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] potiuk commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r698722419



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:
+    def get_db_hook(self) -> SnowflakeHook:
+        """
+        Create and return SnowflakeHook.
+
+        :return: a SnowflakeHook instance.
+        :rtype: SnowflakeHook
+        """
+        return SnowflakeHook(
+            snowflake_conn_id=self.snowflake_conn_id,
+            warehouse=self.warehouse,
+            database=self.database,
+            role=self.role,
+            schema=self.schema,
+            authenticator=self.authenticator,
+            session_parameters=self.session_parameters,
+        )
+
+
+class SnowflakeCheckOperator(_SnowflakeDbHookMixin, SQLCheckOperator):
+    """
+    Performs a check against Snowflake. The ``SnowflakeCheckOperator`` expects
+    a sql query that will return a single row. Each value on that
+    first row is evaluated using Python ``bool`` casting. If any of the
+    values return ``False``, the check fails and errors out.
+
+    Note that Python bool casting evaluates the following as ``False``:
+
+    * ``False``
+    * ``0``
+    * Empty string (``""``)
+    * Empty list (``[]``)
+    * Empty dictionary or set (``{}``)
+
+    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
+    the count ``== 0``. You can craft a much more complex query that could,
+    for instance, check that the table has the same number of rows as
+    the source table upstream, or that the count of today's partition is
+    greater than yesterday's partition, or that a set of metrics are less
+    than 3 standard deviations from the 7-day average.
+
+    This operator can be used as a data quality check in your pipeline, and
+    depending on where you put it in your DAG, you have the choice to
+    stop the critical path, preventing it from
+    publishing dubious data, or to place it on the side and receive email alerts
+    without stopping the progress of the DAG.
+
+    :param sql: the sql code to be executed. (templated)
+    :type sql: Can receive a str representing a sql statement,
+        a list of str (sql statements), or a reference to a template file.
+        Template references are recognized by a str ending in '.sql'
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    template_fields = ('sql',)
+    template_ext = ('.sql',)
+    ui_color = '#ededed'
+
+    def __init__(
+        self,
+        *,
+        sql: Any,
+        snowflake_conn_id: str = 'snowflake_default',
+        parameters: Optional[dict] = None,
+        autocommit: bool = True,
+        do_xcom_push: bool = True,
+        warehouse: Optional[str] = None,
+        database: Optional[str] = None,
+        role: Optional[str] = None,
+        schema: Optional[str] = None,
+        authenticator: Optional[str] = None,
+        session_parameters: Optional[dict] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(sql=sql, **kwargs)
+        self.snowflake_conn_id = snowflake_conn_id
+        self.sql = sql
+        self.autocommit = autocommit
+        self.do_xcom_push = do_xcom_push
+        self.parameters = parameters
+        self.warehouse = warehouse
+        self.database = database
+        self.role = role
+        self.schema = schema
+        self.authenticator = authenticator
+        self.session_parameters = session_parameters
+        self.query_ids = []
+
+
+class SnowflakeValueCheckOperator(_SnowflakeDbHookMixin, SQLValueCheckOperator):
+    """
+    Performs a simple value check using sql code.
+
+    :param sql: the sql to be executed
+    :type sql: str
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    def __init__(
+        self,
+        *,
+        sql: Any,

Review comment:
       Good point too.







[GitHub] [airflow] kaxil commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
kaxil commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r705792140



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -100,28 +119,293 @@ def __init__(
         self.session_parameters = session_parameters
         self.query_ids = []
 
-    def get_hook(self) -> SnowflakeHook:
-        """
-        Create and return SnowflakeHook.
-        :return: a SnowflakeHook instance.
-        :rtype: SnowflakeHook
-        """
-        return SnowflakeHook(
-            snowflake_conn_id=self.snowflake_conn_id,
-            warehouse=self.warehouse,
-            database=self.database,
-            role=self.role,
-            schema=self.schema,
-            authenticator=self.authenticator,
-            session_parameters=self.session_parameters,
-        )
+    def get_db_hook(self) -> SnowflakeHook:
+        return get_db_hook(self)

Review comment:
       ???
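   
   As written, `return get_db_hook(self)` refers to a module-level
   `get_db_hook` that is not defined in this file. A sketch of two plausible
   fixes, assuming the `_SnowflakeDbHookMixin` quoted earlier in this thread:
   
       # Option 1: drop the override entirely; the mixin already supplies
       # get_db_hook() to the check operators through the MRO.
       
       # Option 2: keep the override but delegate explicitly to the mixin
       # implementation (method body inside the operator class).
       def get_db_hook(self) -> SnowflakeHook:
           return _SnowflakeDbHookMixin.get_db_hook(self)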







[GitHub] [airflow] denimalpaca commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
denimalpaca commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r698754985



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+class SnowflakeIntervalCheckOperator(_SnowflakeDbHookMixin, SQLIntervalCheckOperator):
+    """
+    Checks that the values of metrics given as SQL expressions are within
+    a certain tolerance of the ones from days_back before.
+
+    This method constructs a query like so ::
+
+        SELECT {metrics_threshold_dict_key} FROM {table}
+        WHERE {date_filter_column}=<date>
+
+    :param table: the table name
+    :type table: str
+    :param days_back: number of days between ds and the ds we want to check

Review comment:
       Agreed. Those docstrings aren't even in the SQL Operators (and should be).
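   
   For illustration, an interval check with the base `SQLIntervalCheckOperator`
   semantics might look like this (the table and threshold are made up; the
   Snowflake subclass adds only connection-related parameters):
   
       SnowflakeIntervalCheckOperator(
           task_id="interval_check",
           table="my_table",
           days_back=-7,
           date_filter_column="ds",
           # Each metric's current/past ratio must stay below its threshold.
           metrics_thresholds={"COUNT(*)": 1.5},
       )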







[GitHub] [airflow] github-actions[bot] commented on pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
github-actions[bot] commented on pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#issuecomment-902891491


   The PR is likely OK to be merged with just a subset of tests for default Python and Database versions without running the full matrix of tests, because it does not modify the core of Airflow. If the committers decide that the full test matrix is needed, they will add the label 'full tests needed'. Then you should rebase to the latest main or amend the last commit of the PR, and push it with --force-with-lease.





[GitHub] [airflow] JavierLopezT commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
JavierLopezT commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r698382156



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+class SnowflakeValueCheckOperator(_SnowflakeDbHookMixin, SQLValueCheckOperator):
+    """
+    Performs a simple value check using sql code.

Review comment:
       I know that this is copied from https://github.com/apache/airflow/blob/main/airflow/operators/sql.py#L149, but I think it's unclear what the operator really is. I think adding something like 'it checks against a value that you define' would be really useful
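   
   For instance, with the `SQLValueCheckOperator` semantics this operator
   inherits, the result of the query is compared against a value that you
   define (the table name is illustrative):
   
       SnowflakeValueCheckOperator(
           task_id="check_row_count",
           sql="SELECT COUNT(*) FROM my_table",
           pass_value=1000,  # the value you define
           tolerance=0.05,   # pass if within +/- 5% of pass_value
       )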

##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.

Review comment:
       Given that the sqls are meant to be SELECT statements, does it make sense to have this param?

##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:

Review comment:
       Why a class and not just a method inside the SnowflakeCheckOperator?

##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+class SnowflakeValueCheckOperator(_SnowflakeDbHookMixin, SQLValueCheckOperator):
+    """
+    Performs a simple value check using sql code.
+    """
+
+    def __init__(
+        self,
+        *,
+        sql: Any,

Review comment:
       Is it really Any? Or union str and list?

##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+class SnowflakeIntervalCheckOperator(_SnowflakeDbHookMixin, SQLIntervalCheckOperator):
+    """
+    Checks that the values of metrics given as SQL expressions are within
+    a certain tolerance of the ones from days_back before.
+
+    This method constructs a query like so ::
+
+        SELECT {metrics_threshold_dict_key} FROM {table}
+        WHERE {date_filter_column}=<date>
+
+    :param table: the table name
+    :type table: str
+    :param days_back: number of days between ds and the ds we want to check

Review comment:
       If this is here, I guess `tolerance` and `pass_value` should be in the docstring of `SnowflakeValueCheckOperator`. It seems that for Sphinx it is not necessary (https://stackoverflow.com/a/2025599/9621172), but it certainly helps to have them here if you are just reading the code.
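   
   A sketch of how those entries might read (the wording is illustrative, not
   from the PR):
   
       :param pass_value: the expected value of the first cell of the first
           row returned by ``sql``
       :type pass_value: Any
       :param tolerance: (optional) the allowed deviation from ``pass_value``,
           expressed as a fraction (e.g. ``0.1`` for 10%)
       :type tolerance: Any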







[GitHub] [airflow] denimalpaca commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
denimalpaca commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r698726394



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:

Review comment:
       Trying to be more DRY. Originally I tried making the `Check` classes inherit from just the `SnowflakeOperator`, but it resulted in an error in the constructors. 
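   
   A compressed sketch of the trade-off being described (not code from the PR;
   class bodies abbreviated):
   
       # Rejected: SnowflakeOperator's constructor is written around running
       # arbitrary SQL, so combining it with SQLCheckOperator produced
       # conflicting __init__ behaviour.
       class SnowflakeCheckOperator(SnowflakeOperator, SQLCheckOperator): ...
       
       # Chosen: a mixin that contributes only get_db_hook(), placed first in
       # the MRO so SQLCheckOperator.execute() gets a configured SnowflakeHook.
       class SnowflakeCheckOperator(_SnowflakeDbHookMixin, SQLCheckOperator): ...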







[GitHub] [airflow] potiuk commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r698722020



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.

Review comment:
       They can also be UPDATE statements




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] potiuk commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r701772795



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:

Review comment:
       Would you mind changing that @denimalpaca ?







[GitHub] [airflow] uranusjr commented on pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
uranusjr commented on pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#issuecomment-902563552


   I don't understand Snowflake enough to comment on whether this actually works, but the implementation seems straightforward (using things already present in Airflow) and sound.





[GitHub] [airflow] denimalpaca commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
denimalpaca commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r698756492



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+class SnowflakeValueCheckOperator(_SnowflakeDbHookMixin, SQLValueCheckOperator):
+    """
+    Performs a simple value check using sql code.
+
+    :param sql: the sql to be executed
+    :type sql: str
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    def __init__(
+        self,
+        *,
+        sql: Any,

Review comment:
       Actually just `str` according to the [SQL Operator](https://airflow.apache.org/docs/apache-airflow/stable/_modules/airflow/operators/sql.html#SQLValueCheckOperator) it inherits from.
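   
   For illustration, a minimal sketch of the tightened signature (a hedged excerpt, not the full operator; everything except the ``sql: str`` annotation is carried over from the diff above):
   
   ```python
   from typing import Any
   
   from airflow.operators.sql import SQLValueCheckOperator
   
   
   class SnowflakeValueCheckOperator(SQLValueCheckOperator):
       def __init__(
           self,
           *,
           sql: str,  # a single SQL string, as SQLValueCheckOperator expects
           pass_value: Any,
           tolerance: Any = None,
           snowflake_conn_id: str = 'snowflake_default',
           **kwargs,
       ) -> None:
           super().__init__(sql=sql, pass_value=pass_value, tolerance=tolerance, **kwargs)
           self.snowflake_conn_id = snowflake_conn_id
   ```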




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] mik-laj commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r695248154



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,299 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:
+    def get_db_hook(self) -> SnowflakeHook:
+        """
+        Create and return SnowflakeHook.
+        :return: a SnowflakeHook instance.
+        :rtype: SnowflakeHook
+        """
+        return SnowflakeHook(
+            snowflake_conn_id=self.snowflake_conn_id,
+            warehouse=self.warehouse,
+            database=self.database,
+            role=self.role,
+            schema=self.schema,
+            authenticator=self.authenticator,
+            session_parameters=self.session_parameters,
+        )
+
+
+class SnowflakeCheckOperator(_SnowflakeDbHookMixin, SQLCheckOperator):
+    """
+    Performs a check against Snowflake. The ``SnowflakeCheckOperator`` expects
+    a sql query that will return a single row. Each value on that
+    first row is evaluated using Python ``bool`` casting. If any of the
+    values evaluates to ``False``, the check fails and errors out.
+
+    Note that Python bool casting evals the following as ``False``:
+
+    * ``False``
+    * ``0``
+    * Empty string (``""``)
+    * Empty list (``[]``)
+    * Empty dictionary or set (``{}``)
+
+    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
+    the count ``== 0``. You can craft a much more complex query that could,
+    for instance, check that the table has the same number of rows as
+    the source table upstream, or that the count of today's partition is
+    greater than yesterday's partition, or that a set of metrics is less
+    than 3 standard deviations from the 7-day average.
+
+    This operator can be used as a data quality check in your pipeline, and
+    depending on where you put it in your DAG, you have the choice to
+    stop the critical path, preventing the publication of
+    dubious data, or to run it on the side and receive email alerts
+    without stopping the progress of the DAG.
+
+    :param sql: the sql code to be executed. (templated)
+    :type sql: Can receive a str representing a sql statement,
+        a list of str (sql statements), or a reference to a template file.
+        Template references are recognized by str ending in '.sql'
+    :param use_legacy_sql: Whether to use legacy SQL (true)

Review comment:
       Is it used?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] uranusjr commented on pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
uranusjr commented on pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#issuecomment-909035478


   This needs to be rebased to latest main.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] mik-laj commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r695897882



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,284 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:
+    def get_db_hook(self) -> SnowflakeHook:
+        """
+        Create and return SnowflakeHook.
+        :return: a SnowflakeHook instance.

Review comment:
       ```suggestion
   
           :return: a SnowflakeHook instance.
   ```
   This is needed to render the documentation properly.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] mik-laj commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r695248660



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,299 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:
+    def get_db_hook(self) -> SnowflakeHook:
+        """
+        Create and return SnowflakeHook.
+        :return: a SnowflakeHook instance.
+        :rtype: SnowflakeHook
+        """
+        return SnowflakeHook(
+            snowflake_conn_id=self.snowflake_conn_id,
+            warehouse=self.warehouse,
+            database=self.database,
+            role=self.role,
+            schema=self.schema,
+            authenticator=self.authenticator,
+            session_parameters=self.session_parameters,
+        )
+
+
+class SnowflakeCheckOperator(_SnowflakeDbHookMixin, SQLCheckOperator):
+    """
+    Performs a check against Snowflake. The ``SnowflakeCheckOperator`` expects
+    a sql query that will return a single row. Each value on that
+    first row is evaluated using Python ``bool`` casting. If any of the
+    values evaluates to ``False``, the check fails and errors out.
+
+    Note that Python bool casting evals the following as ``False``:
+
+    * ``False``
+    * ``0``
+    * Empty string (``""``)
+    * Empty list (``[]``)
+    * Empty dictionary or set (``{}``)
+
+    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
+    the count ``== 0``. You can craft a much more complex query that could,
+    for instance, check that the table has the same number of rows as
+    the source table upstream, or that the count of today's partition is
+    greater than yesterday's partition, or that a set of metrics is less
+    than 3 standard deviations from the 7-day average.
+
+    This operator can be used as a data quality check in your pipeline, and
+    depending on where you put it in your DAG, you have the choice to
+    stop the critical path, preventing the publication of
+    dubious data, or to run it on the side and receive email alerts
+    without stopping the progress of the DAG.
+
+    :param sql: the sql code to be executed. (templated)
+    :type sql: Can receive a str representing a sql statement,
+        a list of str (sql statements), or a reference to a template file.
+        Template references are recognized by str ending in '.sql'
+    :param use_legacy_sql: Whether to use legacy SQL (true)
+        or standard SQL (false).
+    :type use_legacy_sql: bool
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    template_fields = ('sql',)
+    template_ext = ('.sql',)
+    ui_color = '#ededed'
+
+    def __init__(
+        self,
+        *,
+        sql: Any,
+        use_legacy_sql: bool = True,
+        snowflake_conn_id: str = 'snowflake_default',
+        parameters: Optional[dict] = None,
+        autocommit: bool = True,
+        do_xcom_push: bool = True,
+        warehouse: Optional[str] = None,
+        database: Optional[str] = None,
+        role: Optional[str] = None,
+        schema: Optional[str] = None,
+        authenticator: Optional[str] = None,
+        session_parameters: Optional[dict] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(sql=sql, **kwargs)
+        self.snowflake_conn_id = snowflake_conn_id
+        self.sql = sql
+        self.use_legacy_sql = use_legacy_sql
+        self.autocommit = autocommit
+        self.do_xcom_push = do_xcom_push
+        self.parameters = parameters
+        self.warehouse = warehouse
+        self.database = database
+        self.role = role
+        self.schema = schema
+        self.authenticator = authenticator
+        self.session_parameters = session_parameters
+        self.query_ids = []
+
+
+class SnowflakeValueCheckOperator(_SnowflakeDbHookMixin, SQLValueCheckOperator):
+    """
+    Performs a simple value check using sql code.
+
+    :param sql: the sql to be executed
+    :type sql: str
+    :param use_legacy_sql: Whether to use legacy SQL (true)

Review comment:
       Snowflake has only one SQL variant. BigQuery supports legacy SQL and ZetaSQL.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] potiuk commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r701772506



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:

Review comment:
       I watched the presentation :). Entertaining, and I mostly agree with it (not everything :) 
   
   But yeah - in this case, using a mixin is definitely over-the-top.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] denimalpaca commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
denimalpaca commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r698726394



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:

Review comment:
       Trying to be more DRY. Originally I tried making the `Check` classes inherit from just the `SnowflakeOperator`, but it resulted in an error in the constructors. 
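   
   For anyone curious, a minimal sketch (toy classes, not the actual Airflow operators) of the kind of constructor clash that multiple inheritance can produce here:
   
   ```python
   class Base:
       def __init__(self, task_id):
           self.task_id = task_id
   
   
   class SnowflakeLike(Base):
       def __init__(self, *, sql, warehouse=None, **kwargs):
           super().__init__(**kwargs)  # forwards leftover kwargs down the MRO
           self.sql = sql
           self.warehouse = warehouse
   
   
   class SQLCheckLike(Base):
       def __init__(self, *, sql, conn_id=None, **kwargs):
           super().__init__(**kwargs)
           self.sql = sql
           self.conn_id = conn_id
   
   
   class BrokenCheck(SnowflakeLike, SQLCheckLike):
       pass
   
   
   # MRO: BrokenCheck -> SnowflakeLike -> SQLCheckLike -> Base, so
   # SnowflakeLike.__init__ consumes ``sql`` and then calls
   # SQLCheckLike.__init__, which requires ``sql`` again:
   # TypeError: __init__() missing 1 required keyword-only argument: 'sql'
   BrokenCheck(sql="SELECT 1", task_id="t1")
   ```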

##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:
+    def get_db_hook(self) -> SnowflakeHook:
+        """
+        Create and return SnowflakeHook.
+
+        :return: a SnowflakeHook instance.
+        :rtype: SnowflakeHook
+        """
+        return SnowflakeHook(
+            snowflake_conn_id=self.snowflake_conn_id,
+            warehouse=self.warehouse,
+            database=self.database,
+            role=self.role,
+            schema=self.schema,
+            authenticator=self.authenticator,
+            session_parameters=self.session_parameters,
+        )
+
+
+class SnowflakeCheckOperator(_SnowflakeDbHookMixin, SQLCheckOperator):
+    """
+    Performs a check against Snowflake. The ``SnowflakeCheckOperator`` expects
+    a sql query that will return a single row. Each value on that
+    first row is evaluated using Python ``bool`` casting. If any of the
+    values evaluates to ``False``, the check fails and errors out.
+
+    Note that Python bool casting evals the following as ``False``:
+
+    * ``False``
+    * ``0``
+    * Empty string (``""``)
+    * Empty list (``[]``)
+    * Empty dictionary or set (``{}``)
+
+    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
+    the count ``== 0``. You can craft a much more complex query that could,
+    for instance, check that the table has the same number of rows as
+    the source table upstream, or that the count of today's partition is
+    greater than yesterday's partition, or that a set of metrics is less
+    than 3 standard deviations from the 7-day average.
+
+    This operator can be used as a data quality check in your pipeline, and
+    depending on where you put it in your DAG, you have the choice to
+    stop the critical path, preventing the publication of
+    dubious data, or to run it on the side and receive email alerts
+    without stopping the progress of the DAG.
+
+    :param sql: the sql code to be executed. (templated)
+    :type sql: Can receive a str representing a sql statement,
+        a list of str (sql statements), or a reference to a template file.
+        Template references are recognized by str ending in '.sql'
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    template_fields = ('sql',)
+    template_ext = ('.sql',)
+    ui_color = '#ededed'
+
+    def __init__(
+        self,
+        *,
+        sql: Any,
+        snowflake_conn_id: str = 'snowflake_default',
+        parameters: Optional[dict] = None,
+        autocommit: bool = True,
+        do_xcom_push: bool = True,
+        warehouse: Optional[str] = None,
+        database: Optional[str] = None,
+        role: Optional[str] = None,
+        schema: Optional[str] = None,
+        authenticator: Optional[str] = None,
+        session_parameters: Optional[dict] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(sql=sql, **kwargs)
+        self.snowflake_conn_id = snowflake_conn_id
+        self.sql = sql
+        self.autocommit = autocommit
+        self.do_xcom_push = do_xcom_push
+        self.parameters = parameters
+        self.warehouse = warehouse
+        self.database = database
+        self.role = role
+        self.schema = schema
+        self.authenticator = authenticator
+        self.session_parameters = session_parameters
+        self.query_ids = []
+
+
+class SnowflakeValueCheckOperator(_SnowflakeDbHookMixin, SQLValueCheckOperator):
+    """
+    Performs a simple value check using sql code.
+
+    :param sql: the sql to be executed
+    :type sql: str
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    def __init__(
+        self,
+        *,
+        sql: Any,
+        pass_value: Any,
+        tolerance: Any = None,
+        snowflake_conn_id: str = 'snowflake_default',
+        parameters: Optional[dict] = None,
+        autocommit: bool = True,
+        do_xcom_push: bool = True,
+        warehouse: Optional[str] = None,
+        database: Optional[str] = None,
+        role: Optional[str] = None,
+        schema: Optional[str] = None,
+        authenticator: Optional[str] = None,
+        session_parameters: Optional[dict] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(sql=sql, pass_value=pass_value, tolerance=tolerance, **kwargs)
+        self.snowflake_conn_id = snowflake_conn_id
+        self.sql = sql
+        self.autocommit = autocommit
+        self.do_xcom_push = do_xcom_push
+        self.parameters = parameters
+        self.warehouse = warehouse
+        self.database = database
+        self.role = role
+        self.schema = schema
+        self.authenticator = authenticator
+        self.session_parameters = session_parameters
+        self.query_ids = []
+
+
+class SnowflakeIntervalCheckOperator(_SnowflakeDbHookMixin, SQLIntervalCheckOperator):
+    """
+    Checks that the values of metrics given as SQL expressions are within
+    a certain tolerance of the ones from days_back before.
+
+    This method constructs a query like so ::
+
+        SELECT {metrics_threshold_dict_key} FROM {table}
+        WHERE {date_filter_column}=<date>
+
+    :param table: the table name
+    :type table: str
+    :param days_back: number of days between ds and the ds we want to check

Review comment:
       Agreed. Those docstrings aren't even in the SQL Operators (and should be).

##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:
+    def get_db_hook(self) -> SnowflakeHook:
+        """
+        Create and return SnowflakeHook.
+
+        :return: a SnowflakeHook instance.
+        :rtype: SnowflakeHook
+        """
+        return SnowflakeHook(
+            snowflake_conn_id=self.snowflake_conn_id,
+            warehouse=self.warehouse,
+            database=self.database,
+            role=self.role,
+            schema=self.schema,
+            authenticator=self.authenticator,
+            session_parameters=self.session_parameters,
+        )
+
+
+class SnowflakeCheckOperator(_SnowflakeDbHookMixin, SQLCheckOperator):
+    """
+    Performs a check against Snowflake. The ``SnowflakeCheckOperator`` expects
+    a sql query that will return a single row. Each value on that
+    first row is evaluated using Python ``bool`` casting. If any of the
+    values evaluates to ``False``, the check fails and errors out.
+
+    Note that Python bool casting evals the following as ``False``:
+
+    * ``False``
+    * ``0``
+    * Empty string (``""``)
+    * Empty list (``[]``)
+    * Empty dictionary or set (``{}``)
+
+    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
+    the count ``== 0``. You can craft a much more complex query that could,
+    for instance, check that the table has the same number of rows as
+    the source table upstream, or that the count of today's partition is
+    greater than yesterday's partition, or that a set of metrics is less
+    than 3 standard deviations from the 7-day average.
+
+    This operator can be used as a data quality check in your pipeline, and
+    depending on where you put it in your DAG, you have the choice to
+    stop the critical path, preventing the publication of
+    dubious data, or to run it on the side and receive email alerts
+    without stopping the progress of the DAG.
+
+    :param sql: the sql code to be executed. (templated)
+    :type sql: Can receive a str representing a sql statement,
+        a list of str (sql statements), or a reference to a template file.
+        Template references are recognized by str ending in '.sql'
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    template_fields = ('sql',)
+    template_ext = ('.sql',)
+    ui_color = '#ededed'
+
+    def __init__(
+        self,
+        *,
+        sql: Any,
+        snowflake_conn_id: str = 'snowflake_default',
+        parameters: Optional[dict] = None,
+        autocommit: bool = True,
+        do_xcom_push: bool = True,
+        warehouse: Optional[str] = None,
+        database: Optional[str] = None,
+        role: Optional[str] = None,
+        schema: Optional[str] = None,
+        authenticator: Optional[str] = None,
+        session_parameters: Optional[dict] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(sql=sql, **kwargs)
+        self.snowflake_conn_id = snowflake_conn_id
+        self.sql = sql
+        self.autocommit = autocommit
+        self.do_xcom_push = do_xcom_push
+        self.parameters = parameters
+        self.warehouse = warehouse
+        self.database = database
+        self.role = role
+        self.schema = schema
+        self.authenticator = authenticator
+        self.session_parameters = session_parameters
+        self.query_ids = []
+
+
+class SnowflakeValueCheckOperator(_SnowflakeDbHookMixin, SQLValueCheckOperator):
+    """
+    Performs a simple value check using sql code.
+
+    :param sql: the sql to be executed
+    :type sql: str
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    def __init__(
+        self,
+        *,
+        sql: Any,

Review comment:
       Actually just `str` according to the [SQL Operator](https://airflow.apache.org/docs/apache-airflow/stable/_modules/airflow/operators/sql.html#SQLValueCheckOperator) it inherits from.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] denimalpaca commented on pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
denimalpaca commented on pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#issuecomment-902728442


   > I don’t understand Snowflake a bit to comment on whether this actually works, but the implementation seems straightforward (using things already present in Airflow) and sound.
   
   If it makes you feel better, I did add [this DAG](https://github.com/astronomer/airflow-data-quality-demo/pull/2), which I'm working on concurrently, to the breeze environment and saw it working against a real Snowflake environment. So two of the three operators did run successfully.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] potiuk closed pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
potiuk closed pull request #17741:
URL: https://github.com/apache/airflow/pull/17741


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
potiuk commented on pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#issuecomment-902891679


   Reopened to rebuild!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
potiuk commented on pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#issuecomment-903326845


   Can you please rebase to the latest `main`? I think there were some fixes to address general CI job failures.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] denimalpaca commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
denimalpaca commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r695794477



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,299 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:
+    def get_db_hook(self) -> SnowflakeHook:
+        """
+        Create and return SnowflakeHook.
+        :return: a SnowflakeHook instance.
+        :rtype: SnowflakeHook
+        """
+        return SnowflakeHook(
+            snowflake_conn_id=self.snowflake_conn_id,
+            warehouse=self.warehouse,
+            database=self.database,
+            role=self.role,
+            schema=self.schema,
+            authenticator=self.authenticator,
+            session_parameters=self.session_parameters,
+        )
+
+
+class SnowflakeCheckOperator(_SnowflakeDbHookMixin, SQLCheckOperator):
+    """
+    Performs a check against Snowflake. The ``SnowflakeCheckOperator`` expects
+    a sql query that will return a single row. Each value on that
+    first row is evaluated using Python ``bool`` casting. If any of the
+    values evaluates to ``False``, the check fails and errors out.
+
+    Note that Python bool casting evals the following as ``False``:
+
+    * ``False``
+    * ``0``
+    * Empty string (``""``)
+    * Empty list (``[]``)
+    * Empty dictionary or set (``{}``)
+
+    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
+    the count ``== 0``. You can craft a much more complex query that could,
+    for instance, check that the table has the same number of rows as
+    the source table upstream, or that the count of today's partition is
+    greater than yesterday's partition, or that a set of metrics is less
+    than 3 standard deviations from the 7-day average.
+
+    This operator can be used as a data quality check in your pipeline, and
+    depending on where you put it in your DAG, you have the choice to
+    stop the critical path, preventing the publication of
+    dubious data, or to run it on the side and receive email alerts
+    without stopping the progress of the DAG.
+
+    :param sql: the sql code to be executed. (templated)
+    :type sql: Can receive a str representing a sql statement,
+        a list of str (sql statements), or a reference to a template file.
+        Template references are recognized by str ending in '.sql'
+    :param use_legacy_sql: Whether to use legacy SQL (true)
+        or standard SQL (false).
+    :type use_legacy_sql: bool
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    template_fields = ('sql',)
+    template_ext = ('.sql',)
+    ui_color = '#ededed'
+
+    def __init__(
+        self,
+        *,
+        sql: Any,
+        use_legacy_sql: bool = True,
+        snowflake_conn_id: str = 'snowflake_default',
+        parameters: Optional[dict] = None,
+        autocommit: bool = True,
+        do_xcom_push: bool = True,
+        warehouse: Optional[str] = None,
+        database: Optional[str] = None,
+        role: Optional[str] = None,
+        schema: Optional[str] = None,
+        authenticator: Optional[str] = None,
+        session_parameters: Optional[dict] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(sql=sql, **kwargs)
+        self.snowflake_conn_id = snowflake_conn_id
+        self.sql = sql
+        self.use_legacy_sql = use_legacy_sql
+        self.autocommit = autocommit
+        self.do_xcom_push = do_xcom_push
+        self.parameters = parameters
+        self.warehouse = warehouse
+        self.database = database
+        self.role = role
+        self.schema = schema
+        self.authenticator = authenticator
+        self.session_parameters = session_parameters
+        self.query_ids = []
+
+
+class SnowflakeValueCheckOperator(_SnowflakeDbHookMixin, SQLValueCheckOperator):
+    """
+    Performs a simple value check using sql code.
+
+    :param sql: the sql to be executed
+    :type sql: str
+    :param use_legacy_sql: Whether to use legacy SQL (true)

Review comment:
       Ah, no. Mistakenly added as I used the BigQuery operator as a model.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] JavierLopezT commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
JavierLopezT commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r701008583



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:

Review comment:
       The DRY philosophy is great. However, my point was that you don't need a class at all and can get rid of the class's extra layer of abstraction. My comment is inspired by https://www.youtube.com/watch?v=o9pEzgHorH0
   
   Using the same class in the SnowflakeOperator too has been quite an improvement, but I think you can still have just a function in the file and call it from the operators. Anyway, it's not a big deal and the PR is already approved, so up to you.
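   
   A hedged sketch of that function-based alternative (names are illustrative; the hook arguments mirror the diff above):
   
   ```python
   from airflow.operators.sql import SQLCheckOperator
   from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook
   
   
   def get_snowflake_hook(operator) -> SnowflakeHook:
       """Build a SnowflakeHook from an operator's connection attributes."""
       return SnowflakeHook(
           snowflake_conn_id=operator.snowflake_conn_id,
           warehouse=operator.warehouse,
           database=operator.database,
           role=operator.role,
           schema=operator.schema,
           authenticator=operator.authenticator,
           session_parameters=operator.session_parameters,
       )
   
   
   class SnowflakeCheckOperator(SQLCheckOperator):
       # ... __init__ as in the diff above ...
       def get_db_hook(self) -> SnowflakeHook:
           return get_snowflake_hook(self)
   ```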




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] potiuk commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r698721663



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:

Review comment:
       Good point

##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:
+    def get_db_hook(self) -> SnowflakeHook:
+        """
+        Create and return SnowflakeHook.
+
+        :return: a SnowflakeHook instance.
+        :rtype: SnowflakeHook
+        """
+        return SnowflakeHook(
+            snowflake_conn_id=self.snowflake_conn_id,
+            warehouse=self.warehouse,
+            database=self.database,
+            role=self.role,
+            schema=self.schema,
+            authenticator=self.authenticator,
+            session_parameters=self.session_parameters,
+        )
+
+
+class SnowflakeCheckOperator(_SnowflakeDbHookMixin, SQLCheckOperator):
+    """
+    Performs a check against Snowflake. The ``SnowflakeCheckOperator`` expects
+    a sql query that will return a single row. Each value on that
+    first row is evaluated using Python ``bool`` casting. If any of the
+    values evaluates to ``False``, the check fails and errors out.
+
+    Note that Python bool casting evals the following as ``False``:
+
+    * ``False``
+    * ``0``
+    * Empty string (``""``)
+    * Empty list (``[]``)
+    * Empty dictionary or set (``{}``)
+
+    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
+    the count ``== 0``. You can craft a much more complex query that could,
+    for instance, check that the table has the same number of rows as
+    the source table upstream, or that the count of today's partition is
+    greater than yesterday's partition, or that a set of metrics is less
+    than 3 standard deviations from the 7-day average.
+
+    This operator can be used as a data quality check in your pipeline, and
+    depending on where you put it in your DAG, you have the choice to
+    stop the critical path, preventing the publication of
+    dubious data, or to run it on the side and receive email alerts
+    without stopping the progress of the DAG.
+
+    :param sql: the sql code to be executed. (templated)
+    :type sql: Can receive a str representing a sql statement,
+        a list of str (sql statements), or a reference to a template file.
+        Template references are recognized by str ending in '.sql'
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.

Review comment:
       They can also be UPDATE statements.

##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:
+    def get_db_hook(self) -> SnowflakeHook:
+        """
+        Create and return SnowflakeHook.
+
+        :return: a SnowflakeHook instance.
+        :rtype: SnowflakeHook
+        """
+        return SnowflakeHook(
+            snowflake_conn_id=self.snowflake_conn_id,
+            warehouse=self.warehouse,
+            database=self.database,
+            role=self.role,
+            schema=self.schema,
+            authenticator=self.authenticator,
+            session_parameters=self.session_parameters,
+        )
+
+
+class SnowflakeCheckOperator(_SnowflakeDbHookMixin, SQLCheckOperator):
+    """
+    Performs a check against Snowflake. The ``SnowflakeCheckOperator`` expects
+    a sql query that will return a single row. Each value on that
+    first row is evaluated using Python ``bool`` casting. If any of the
+    values evaluates to ``False``, the check fails and errors out.
+
+    Note that Python bool casting evals the following as ``False``:
+
+    * ``False``
+    * ``0``
+    * Empty string (``""``)
+    * Empty list (``[]``)
+    * Empty dictionary or set (``{}``)
+
+    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
+    the count ``== 0``. You can craft a much more complex query that could,
+    for instance, check that the table has the same number of rows as
+    the source table upstream, or that the count of today's partition is
+    greater than yesterday's partition, or that a set of metrics is less
+    than 3 standard deviations from the 7-day average.
+
+    This operator can be used as a data quality check in your pipeline, and
+    depending on where you put it in your DAG, you have the choice to
+    stop the critical path, preventing the publication of
+    dubious data, or to run it on the side and receive email alerts
+    without stopping the progress of the DAG.
+
+    :param sql: the sql code to be executed. (templated)
+    :type sql: Can receive a str representing a sql statement,
+        a list of str (sql statements), or a reference to a template file.
+        Template references are recognized by str ending in '.sql'
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    template_fields = ('sql',)
+    template_ext = ('.sql',)
+    ui_color = '#ededed'
+
+    def __init__(
+        self,
+        *,
+        sql: Any,
+        snowflake_conn_id: str = 'snowflake_default',
+        parameters: Optional[dict] = None,
+        autocommit: bool = True,
+        do_xcom_push: bool = True,
+        warehouse: Optional[str] = None,
+        database: Optional[str] = None,
+        role: Optional[str] = None,
+        schema: Optional[str] = None,
+        authenticator: Optional[str] = None,
+        session_parameters: Optional[dict] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(sql=sql, **kwargs)
+        self.snowflake_conn_id = snowflake_conn_id
+        self.sql = sql
+        self.autocommit = autocommit
+        self.do_xcom_push = do_xcom_push
+        self.parameters = parameters
+        self.warehouse = warehouse
+        self.database = database
+        self.role = role
+        self.schema = schema
+        self.authenticator = authenticator
+        self.session_parameters = session_parameters
+        self.query_ids = []
+
+
+class SnowflakeValueCheckOperator(_SnowflakeDbHookMixin, SQLValueCheckOperator):
+    """
+    Performs a simple value check using sql code.
+
+    :param sql: the sql to be executed
+    :type sql: str
+    :param snowflake_conn_id: Reference to
+        :ref:`Snowflake connection id<howto/connection:snowflake>`
+    :type snowflake_conn_id: str
+    :param autocommit: if True, each command is automatically committed.
+        (default value: True)
+    :type autocommit: bool
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param warehouse: name of warehouse (will overwrite any warehouse
+        defined in the connection's extra JSON)
+    :type warehouse: str
+    :param database: name of database (will overwrite database defined
+        in connection)
+    :type database: str
+    :param schema: name of schema (will overwrite schema defined in
+        connection)
+    :type schema: str
+    :param role: name of role (will overwrite any role defined in
+        connection's extra JSON)
+    :type role: str
+    :param authenticator: authenticator for Snowflake.
+        'snowflake' (default) to use the internal Snowflake authenticator
+        'externalbrowser' to authenticate using your web browser and
+        Okta, ADFS or any other SAML 2.0-compliant identity provider
+        (IdP) that has been defined for your account
+        'https://<your_okta_account_name>.okta.com' to authenticate
+        through native Okta.
+    :type authenticator: str
+    :param session_parameters: You can set session-level parameters at
+        the time you connect to Snowflake
+    :type session_parameters: dict
+    """
+
+    def __init__(
+        self,
+        *,
+        sql: Any,

Review comment:
       Good point too.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] potiuk commented on pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
potiuk commented on pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#issuecomment-902972000


   I think you need to rebase to latest main @denimalpaca 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] kaxil merged pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
kaxil merged pull request #17741:
URL: https://github.com/apache/airflow/pull/17741


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] denimalpaca commented on a change in pull request #17741: Add Snowflake DQ Operators

Posted by GitBox <gi...@apache.org>.
denimalpaca commented on a change in pull request #17741:
URL: https://github.com/apache/airflow/pull/17741#discussion_r702159406



##########
File path: airflow/providers/snowflake/operators/snowflake.py
##########
@@ -125,3 +126,285 @@ def execute(self, context: Any) -> None:
 
         if self.do_xcom_push:
             return execution_info
+
+
+class _SnowflakeDbHookMixin:

Review comment:
       Will have this fixed Tuesday; it looks like, with the way BaseSQLOperator works, even a standalone function might not be necessary.
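   
   A hedged sketch of what that could look like (assuming ``SQLCheckOperator``'s ``conn_id`` resolves the hook from the connection type; not verified against the Airflow version in this PR):
   
   ```python
   from airflow.operators.sql import SQLCheckOperator
   
   
   class SnowflakeCheckOperator(SQLCheckOperator):
       # Hypothetical: let the base class build the right hook from the
       # connection instead of overriding get_db_hook() here at all.
       def __init__(self, *, sql: str, snowflake_conn_id: str = 'snowflake_default', **kwargs) -> None:
           super().__init__(sql=sql, conn_id=snowflake_conn_id, **kwargs)
   ```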




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


