Posted to commits@airflow.apache.org by "Anthony Miyaguchi (JIRA)" <ji...@apache.org> on 2018/09/07 22:39:00 UTC

[jira] [Updated] (AIRFLOW-3027) Read credentials from a file in the Databricks operators and hook

     [ https://issues.apache.org/jira/browse/AIRFLOW-3027?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Anthony Miyaguchi updated AIRFLOW-3027:
---------------------------------------
    Description: 
The Databricks hook requires token-based authentication via the connections database. The token is stored in the connection's Extra field:
{code}
Extras: {"token": "<GENERATED_TOKEN>"}
{code}

 This means the token can be seen in plaintext in the Admin UI, which is undesirable for our setup. The AWS hook gets around this by either using boto's authentication mechanisms or by reading from a file.
{code:python}
elif 's3_config_file' in connection_object.extra_dejson:
    aws_access_key_id, aws_secret_access_key = \
        _parse_s3_config(
            connection_object.extra_dejson['s3_config_file'],
            connection_object.extra_dejson.get('s3_config_format'))
{code}
[source|https://github.com/apache/incubator-airflow/blob/08ecca47862f304dba548bcfc6c34406cdcf556f/airflow/contrib/hooks/aws_hook.py#L110-L114]
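For context, the s3_config_file above points at an INI-style credentials file on disk. A minimal sketch of reading such a boto-format file with the standard library (illustration only, with a placeholder path; this is not Airflow's actual _parse_s3_config implementation):
{code:python}
# Illustration only, not Airflow's actual _parse_s3_config implementation.
# A boto-format s3_config_file is an INI file with a [Credentials] section.
try:
    import configparser                      # Python 3
except ImportError:
    import ConfigParser as configparser      # Python 2

config = configparser.ConfigParser()
config.read('/path/to/s3_config_file')       # placeholder path
aws_access_key_id = config.get('Credentials', 'aws_access_key_id')
aws_secret_access_key = config.get('Credentials', 'aws_secret_access_key')
{code}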

 

The Databricks hook should also support reading the token from a file to avoid exposing sensitive tokens in plaintext, as sketched below.
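A minimal sketch of what that could look like, mirroring the AWS hook's s3_config_file handling; the "token_file" extras key and the helper below are hypothetical, not existing Airflow code:
{code:python}
# Hedged sketch only: one possible approach, not existing Airflow code.
# The "token_file" extras key and this helper are hypothetical.
def _resolve_databricks_token(connection_object):
    """Return the API token, preferring a file reference over plaintext extras."""
    extra = connection_object.extra_dejson
    if 'token_file' in extra:
        # Read the token from disk so it never appears in the Admin UI.
        with open(extra['token_file']) as token_file:
            return token_file.read().strip()
    # Fall back to the current behaviour: token stored directly in the extras field.
    return extra.get('token')
{code}
The hook's request path could then obtain the token through such a helper instead of reading extra_dejson['token'] directly, and the Databricks operators would inherit the behaviour automatically because they authenticate through the hook.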

 

  was:
The Databricks hook requires token-based authentication via the connections database. The token is passed into the connections field:
Extras: \{"token": "<GENERATED_TOKEN>"}
This means the token can be seen in plaintext in the Admin UI, which is undesirable for our setup. The AWS hook gets around this by either using boto's authentication mechanisms or by reading from a file.
{code:java}
elif 's3_config_file' in connection_object.extra_dejson:
    aws_access_key_id, aws_secret_access_key = \
        _parse_s3_config(
            connection_object.extra_dejson['s3_config_file'],
            connection_object.extra_dejson.get('s3_config_format')){code}
[[source] https://github.com/apache/incubator-airflow/blob/08ecca47862f304dba548bcfc6c34406cdcf556f/airflow/contrib/hooks/aws_hook.py#L110-L114|https://github.com/apache/incubator-airflow/blob/08ecca47862f304dba548bcfc6c34406cdcf556f/airflow/contrib/hooks/aws_hook.py#L110-L114]

 

The databricks hook should also support reading the token from a file to avoid exposing sensitive tokens in plaintext.

 


> Read credentials from a file in the Databricks operators and hook
> -----------------------------------------------------------------
>
>                 Key: AIRFLOW-3027
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-3027
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: authentication, hooks, operators
>    Affects Versions: 1.9.0
>            Reporter: Anthony Miyaguchi
>            Priority: Minor
>
> The Databricks hook requires token-based authentication via the connections database. The token is stored in the connection's Extra field:
> {code}
> Extras: {"token": "<GENERATED_TOKEN>"}
> {code}
>  This means the token can be seen in plaintext in the Admin UI, which is undesirable for our setup. The AWS hook gets around this by either using boto's authentication mechanisms or by reading from a file.
> {code:python}
> elif 's3_config_file' in connection_object.extra_dejson:
>     aws_access_key_id, aws_secret_access_key = \
>         _parse_s3_config(
>             connection_object.extra_dejson['s3_config_file'],
>             connection_object.extra_dejson.get('s3_config_format'))
> {code}
> [source|https://github.com/apache/incubator-airflow/blob/08ecca47862f304dba548bcfc6c34406cdcf556f/airflow/contrib/hooks/aws_hook.py#L110-L114]
>  
> The Databricks hook should also support reading the token from a file to avoid exposing sensitive tokens in plaintext.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)