Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/12/02 07:39:52 UTC

[GitHub] [airflow] michalslowikowski00 commented on a change in pull request #12710: Enable SparkSqlHook to use supplied connections

michalslowikowski00 commented on a change in pull request #12710:
URL: https://github.com/apache/airflow/pull/12710#discussion_r533953467



##########
File path: tests/providers/apache/spark/hooks/test_spark_sql.py
##########
@@ -213,3 +209,29 @@ def test_spark_process_runcmd_and_fail(self, mock_popen):
                 sql, master, params, status
             ),
         )
+
+    def test_resolve_connection_yarn_default_connection(self):
+        hook = SparkSqlHook(conn_id='spark_default', sql='SELECT 1')

Review comment:
       This can be put into the `setUp()` test method; after that you will have access to the hook through `self` and you will avoid code duplication.
    ```
    def setUp(self):
        self.hook = SparkSqlHook(conn_id='spark_yarn_cluster', sql='SELECT 1')
    ```
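
    For illustration only, a minimal sketch of how the refactored test class might look, assuming the class name `TestSparkSqlHook` matches the existing test module and that the `spark_default` connection is already registered in the test database; the assertion body is a placeholder, not the PR's actual connection-resolution tests.

    ```
    import unittest

    from airflow.providers.apache.spark.hooks.spark_sql import SparkSqlHook


    class TestSparkSqlHook(unittest.TestCase):
        def setUp(self):
            # Build the hook once; each test then reaches it via self.hook,
            # removing the duplicated constructor calls from individual tests.
            self.hook = SparkSqlHook(conn_id='spark_default', sql='SELECT 1')

        def test_hook_is_constructed_with_supplied_conn_id(self):
            # Placeholder check for illustration; the real tests in this PR
            # exercise connection resolution against different conn_ids.
            self.assertIsInstance(self.hook, SparkSqlHook)
    ```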




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org