Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/11/26 08:25:58 UTC

[GitHub] [airflow] eskarimov opened a new pull request #19835: Refactor DatabricksHook

eskarimov opened a new pull request #19835:
URL: https://github.com/apache/airflow/pull/19835


   related: PR #19736 and feature request #18999 
   
   The PR intends to refactor `DatabricksHook` and related classes as a preparation step before introducing Deferrable Operator for Databricks.
   
   Main points:
   - Use `int` type for `run_id` and `job_id`. The Databricks docs specify them as integers, and actual responses from the Databricks API return integers, not strings.
   - Fix filling AAD headers.
   - Move AAD token validation into a separate function, covered with tests.
   - Change the initialisation of `RunState` - make it explicit that `result_state` might be missing. Currently there's only a tiny comment deep in the code noting that `result_state` may be `None` while a job is still running.
   - Align quotation style


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] eskarimov commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r757859224



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -64,10 +66,12 @@
 class RunState:
     """Utility class for the run state concept of Databricks runs."""
 
-    def __init__(self, life_cycle_state: str, result_state: str, state_message: str) -> None:
+    def __init__(
+        self, life_cycle_state: str, state_message: str, result_state: str = None, *args, **kwargs

Review comment:
       It needs to be reviewed together with the changes for initialising the class instance from the API response:
   
   Current version:
   ```python
           state = response['state']
           life_cycle_state = state['life_cycle_state']
           # result_state may not be in the state if not terminal
           result_state = state.get('result_state', None)
           state_message = state['state_message']
           return RunState(life_cycle_state, result_state, state_message)
   ```
   Proposed version:
   ```python
           state = response['state']
           return RunState(**state)
   ```
   
    The current version is basically an intermediate layer between the API response and the class: it extracts values from the response and initialises the class instance. But the response should already represent a state, so why do we need this layer at all?
    I see the following drawbacks with it:
    - The class signature doesn't convey that `result_state` might be missing if the state is not terminal. Currently that's only described by a comment deep in the code.
    - It tends to duplicate code. Say we want to introduce an async class for `DatabricksHook`: this parsing logic would need to be written twice. And if we later change the class, e.g. add a new property `user_cancelled_or_timedout` (which is already part of the API response), we'd need to change the class arguments, the response-parsing logic, and the class instance initialisation everywhere it's used.
      With the proposed version, we only need to change the class arguments.
   
   With all the above, answering the questions:
   - > why do we need *args, **kwargs ?
         
         It shows that `RunState` might receive other init arguments (since we don't have control over the API response); see the above example with `user_cancelled_or_timedout` in the response.
   
   - > why to change order of the parameters? They are logical now: lifecycle -> result state -> state message.
   
         Purely because of Python syntax: arguments with default values must come after required arguments.
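
    To make the `**kwargs` point concrete, here's a minimal standalone sketch (not the actual Airflow class) that keeps working when the API response omits `result_state` or grows a new field:

```python
# Minimal standalone sketch of the proposed RunState; illustrative only,
# not the actual Airflow class.
class RunState:
    def __init__(
        self, life_cycle_state: str, state_message: str = '',
        result_state: str = '', *args, **kwargs
    ) -> None:
        self.life_cycle_state = life_cycle_state
        self.result_state = result_state
        self.state_message = state_message

# A non-terminal API state: 'result_state' is absent, and a newer field
# like 'user_cancelled_or_timedout' is simply absorbed by **kwargs.
state = {
    'life_cycle_state': 'RUNNING',
    'state_message': 'In run',
    'user_cancelled_or_timedout': False,
}
run_state = RunState(**state)
```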







[GitHub] [airflow] alexott commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
alexott commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r760832179



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -64,10 +66,12 @@
 class RunState:
     """Utility class for the run state concept of Databricks runs."""
 
-    def __init__(self, life_cycle_state: str, result_state: str, state_message: str) -> None:
+    def __init__(
+        self, life_cycle_state: str, state_message: str, result_state: str = None, *args, **kwargs

Review comment:
       Yes, it's safe to assume that it's an empty string.







[GitHub] [airflow] eskarimov commented on pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#issuecomment-981072864


   > Otherwise, it looks good. I need to test changes on the real Databricks instances
   
   Perfect, thanks a lot! I've just pushed the changes for the empty-string default for `result_state` and the refactored function for AAD headers retrieval.





[GitHub] [airflow] eskarimov commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r757880049



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -64,10 +66,12 @@
 class RunState:
     """Utility class for the run state concept of Databricks runs."""
 
-    def __init__(self, life_cycle_state: str, result_state: str, state_message: str) -> None:
+    def __init__(
+        self, life_cycle_state: str, state_message: str, result_state: str = None, *args, **kwargs

Review comment:
       Makes sense - when it's `None`, it might also be read as if the argument wasn't set at all, but we actually do set it in the init, so an empty string sounds like a better option. Plus we'd avoid type checks wherever it's used further on. Will change it, thanks! 👍 







[GitHub] [airflow] eskarimov commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r760906842



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -64,10 +66,12 @@
 class RunState:
     """Utility class for the run state concept of Databricks runs."""
 
-    def __init__(self, life_cycle_state: str, result_state: str, state_message: str) -> None:
+    def __init__(
+        self, life_cycle_state: str, state_message: str, result_state: str = None, *args, **kwargs

Review comment:
       Changed it back to `life_cycle_state`, `result_state`, `state_message`, with the latter two defaulting to empty strings.
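
    As a sketch of the resulting signature (assumed from the description above, not the exact merged code), keeping the original positional order preserves existing call sites:

```python
# Sketch of the final signature described above; not the exact merged code.
class RunState:
    def __init__(
        self, life_cycle_state: str, result_state: str = '',
        state_message: str = '', *args, **kwargs
    ) -> None:
        self.life_cycle_state = life_cycle_state
        self.result_state = result_state
        self.state_message = state_message

# Existing positional call sites keep working unchanged:
terminal = RunState('TERMINATED', 'SUCCESS', 'Completed')
# While non-terminal responses can omit result_state entirely:
running = RunState(life_cycle_state='RUNNING')
```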







[GitHub] [airflow] eskarimov commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r760825912



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -64,10 +66,12 @@
 class RunState:
     """Utility class for the run state concept of Databricks runs."""
 
-    def __init__(self, life_cycle_state: str, result_state: str, state_message: str) -> None:
+    def __init__(
+        self, life_cycle_state: str, state_message: str, result_state: str = None, *args, **kwargs

Review comment:
       Thinking out loud: does it make sense to assume that `state_message` is an empty string by default? Then we'd keep the order of the arguments the same.







[GitHub] [airflow] eskarimov commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r758619784



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -64,10 +66,12 @@
 class RunState:
     """Utility class for the run state concept of Databricks runs."""
 
-    def __init__(self, life_cycle_state: str, result_state: str, state_message: str) -> None:
+    def __init__(
+        self, life_cycle_state: str, state_message: str, result_state: str = '', *args, **kwargs
+    ) -> None:

Review comment:
       Please refer to [the comment above](https://github.com/apache/airflow/pull/19835#discussion_r757859224), replied there earlier about the same







[GitHub] [airflow] eskarimov commented on pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#issuecomment-979780541


   @alexott could you check this PR, please?





[GitHub] [airflow] alexott commented on pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
alexott commented on pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#issuecomment-981067180


   Otherwise, it looks good. I need to test changes on the real Databricks instances





[GitHub] [airflow] alexott commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
alexott commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r757665326



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -64,10 +66,12 @@
 class RunState:
     """Utility class for the run state concept of Databricks runs."""
 
-    def __init__(self, life_cycle_state: str, result_state: str, state_message: str) -> None:
+    def __init__(
+        self, life_cycle_state: str, state_message: str, result_state: str = None, *args, **kwargs

Review comment:
       Why do we need `*args, **kwargs`? Also, why change the order of the parameters? They are logical now: lifecycle -> result state -> state message.

##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -356,31 +353,31 @@ def _do_api_call(self, endpoint_info, json):
     def _log_request_error(self, attempt_num: int, error: str) -> None:
         self.log.error('Attempt %s API Request to Databricks failed with reason: %s', attempt_num, error)
 
-    def run_now(self, json: dict) -> str:
+    def run_now(self, json: dict) -> int:

Review comment:
       Can it break existing code? For example, if people use the result to concatenate with a log string without calling `str()`?

##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -235,21 +241,18 @@ def _get_aad_token(self, resource: str) -> str:
             attempt_num += 1
             sleep(self.retry_delay)
 
-    def _fill_aad_tokens(self, headers: dict) -> str:
+    def _fill_aad_headers(self, headers: dict) -> dict:
         """
-        Fills headers if necessary (SPN is outside of the workspace) and generates AAD token
+        Fills AAD headers if necessary (SPN is outside of the workspace)
         :param headers: dictionary with headers to fill-in
-        :return: AAD token
+        :return: dictionary with filled AAD headers
         """
-        # SP is outside of the workspace
-        if 'azure_resource_id' in self.databricks_conn.extra_dejson:

Review comment:
       I would keep this check inside the function, because it could be called by accident (in the future). Maybe call it `_fill_aad_headers_if_needed`?

##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -522,6 +515,20 @@ def uninstall(self, json: dict) -> None:
         """
         self._do_api_call(UNINSTALL_LIBS_ENDPOINT, json)
 
+    @staticmethod
+    def _is_aad_token_valid(aad_token: dict) -> bool:

Review comment:
       why do we need a separate function that is called from one place?







[GitHub] [airflow] eskarimov commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r757859242



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -522,6 +515,20 @@ def uninstall(self, json: dict) -> None:
         """
         self._do_api_call(UNINSTALL_LIBS_ENDPOINT, json)
 
+    @staticmethod
+    def _is_aad_token_valid(aad_token: dict) -> bool:

Review comment:
       - Mainly for readability: hiding the details of checking whether the token is valid in a separate function, because that's not the main purpose of the parent function `_get_aad_token`.
   - There's a mistake in the current implementation: it subtracts `TOKEN_REFRESH_LEAD_TIME` from the current time, while it should actually add it. With this change we fix the bug and cover the function with tests.
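
    A minimal standalone sketch of the corrected check (the function name, the `expires_on` field, and the lead-time value here are assumptions for illustration): a token counts as valid only if it won't expire within the refresh lead time, i.e. the comparison point is the current time *plus* the lead time:

```python
import time

TOKEN_REFRESH_LEAD_TIME = 120  # seconds; illustrative value

def is_aad_token_valid(aad_token: dict) -> bool:
    # The token must remain valid past now + lead time; subtracting the
    # lead time instead (the old bug) would keep handing out tokens that
    # are about to expire.
    now = int(time.time())
    return aad_token['expires_on'] > now + TOKEN_REFRESH_LEAD_TIME
```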







[GitHub] [airflow] uranusjr commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
uranusjr commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r758582330



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -64,10 +66,12 @@
 class RunState:
     """Utility class for the run state concept of Databricks runs."""
 
-    def __init__(self, life_cycle_state: str, result_state: str, state_message: str) -> None:
+    def __init__(
+        self, life_cycle_state: str, state_message: str, result_state: str = '', *args, **kwargs
+    ) -> None:

Review comment:
       Why flip the arguments? This feels like a source of unnecessary trouble. Also, why take `*args, **kwargs` and then ignore them? That's usually an anti-pattern.







[GitHub] [airflow] eskarimov commented on pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#issuecomment-979815608


   > Could you revert the unnecessary double-quote-to-single-quote changes? They make the patch very difficult to review.
   
   Reverted





[GitHub] [airflow] eskarimov commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r757859235



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -356,31 +353,31 @@ def _do_api_call(self, endpoint_info, json):
     def _log_request_error(self, attempt_num: int, error: str) -> None:
         self.log.error('Attempt %s API Request to Databricks failed with reason: %s', attempt_num, error)
 
-    def run_now(self, json: dict) -> str:
+    def run_now(self, json: dict) -> int:

Review comment:
       It won't break existing code - actually it's the opposite: if someone assumed the output was `str` because of the function signature, their code would already be broken, because the actual returned type is `int`.
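
    A toy illustration (not the hook itself) of which callers the type actually matters for: formatting handles `int` transparently, while plain concatenation already depended on the real runtime type:

```python
run_id = 42  # the Databricks API really returns an integer

# f-strings and %-style logging work with int just as with str:
message = f"started run {run_id}"

# Plain concatenation would only ever have worked with str:
try:
    _ = "started run " + run_id
    concat_failed = False
except TypeError:
    concat_failed = True
```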







[GitHub] [airflow] potiuk merged pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
potiuk merged pull request #19835:
URL: https://github.com/apache/airflow/pull/19835


   





[GitHub] [airflow] alexott commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
alexott commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r757866971



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -235,21 +241,18 @@ def _get_aad_token(self, resource: str) -> str:
             attempt_num += 1
             sleep(self.retry_delay)
 
-    def _fill_aad_tokens(self, headers: dict) -> str:
+    def _fill_aad_headers(self, headers: dict) -> dict:
         """
-        Fills headers if necessary (SPN is outside of the workspace) and generates AAD token
+        Fills AAD headers if necessary (SPN is outside of the workspace)
         :param headers: dictionary with headers to fill-in
-        :return: AAD token
+        :return: dictionary with filled AAD headers
         """
-        # SP is outside of the workspace
-        if 'azure_resource_id' in self.databricks_conn.extra_dejson:

Review comment:
       Yes, I thought of something like this. It's easier to use because the logic of adding headers is incorporated inside the function...







[GitHub] [airflow] alexott commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
alexott commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r757868042



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -64,10 +66,12 @@
 class RunState:
     """Utility class for the run state concept of Databricks runs."""
 
-    def __init__(self, life_cycle_state: str, result_state: str, state_message: str) -> None:
+    def __init__(
+        self, life_cycle_state: str, state_message: str, result_state: str = None, *args, **kwargs

Review comment:
       Just thinking - should we initialise `result_state` to an empty string? If we leave it as `None`, then we need to adjust `get_run_state_str` to use an empty string when `result_state` is `None`. WDYT?







[GitHub] [airflow] eskarimov commented on a change in pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on a change in pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#discussion_r757859224



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -235,21 +241,18 @@ def _get_aad_token(self, resource: str) -> str:
             attempt_num += 1
             sleep(self.retry_delay)
 
-    def _fill_aad_tokens(self, headers: dict) -> str:
+    def _fill_aad_headers(self, headers: dict) -> dict:
         """
-        Fills headers if necessary (SPN is outside of the workspace) and generates AAD token
+        Fills AAD headers if necessary (SPN is outside of the workspace)
         :param headers: dictionary with headers to fill-in
-        :return: AAD token
+        :return: dictionary with filled AAD headers
         """
-        # SP is outside of the workspace
-        if 'azure_resource_id' in self.databricks_conn.extra_dejson:

Review comment:
       What do you think about calling it `_get_aad_headers()`, which would return either an empty dict or a filled one? We also wouldn't need the input arg `headers` in that case.
   
   Then we could construct headers like:
   ```python
   aad_headers = self._get_aad_headers()
   headers = {**USER_AGENT_HEADER.copy(), **aad_headers}
   ```
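
    A standalone sketch of that shape (the header names and values here are illustrative; the real hook would fill them from `_get_aad_token()` and the connection config):

```python
USER_AGENT_HEADER = {'user-agent': 'airflow-example'}  # illustrative value

def get_aad_headers(extra: dict) -> dict:
    """Return AAD headers when the SPN is outside the workspace, else {}."""
    if 'azure_resource_id' not in extra:
        return {}
    # Stubbed values; the real hook would fill these from _get_aad_token().
    return {
        'X-Databricks-Azure-Workspace-Resource-Id': extra['azure_resource_id'],
        'X-Databricks-Azure-SP-Management-Token': '<management-token>',
    }

# Merging stays a one-liner regardless of which branch was taken:
headers = {**USER_AGENT_HEADER, **get_aad_headers({})}
```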
   







[GitHub] [airflow] uranusjr commented on pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
uranusjr commented on pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#issuecomment-979803009


   Could you revert the unnecessary double-quote-to-single-quote changes? They make the patch very difficult to review.





[GitHub] [airflow] eskarimov commented on pull request #19835: Refactor DatabricksHook

Posted by GitBox <gi...@apache.org>.
eskarimov commented on pull request #19835:
URL: https://github.com/apache/airflow/pull/19835#issuecomment-986181001


   @alexott may I kindly request your review please? (the button for re-requesting somehow doesn't work, nothing happens when I click on it)

