Posted to commits@airflow.apache.org by "Sebastian Radloff (JIRA)" <ji...@apache.org> on 2018/03/01 17:18:00 UTC

[jira] [Created] (AIRFLOW-2162) Run DAG as user other than airflow does NOT have access to AIRFLOW_ environment variables

Sebastian Radloff created AIRFLOW-2162:
------------------------------------------

             Summary: Run DAG as user other than airflow does NOT have access to AIRFLOW_ environment variables
                 Key: AIRFLOW-2162
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-2162
             Project: Apache Airflow
          Issue Type: Bug
          Components: configuration
            Reporter: Sebastian Radloff


When running Airflow with the LocalExecutor, I inject AIRFLOW__ environment variables that are supposed to override what is in airflow.cfg, according to the documentation: https://airflow.apache.org/configuration.html

If you specify that your DAGs should run as another Linux user, root for example, this is what Airflow executes under the hood:
{code:bash}
['bash', '-c', u'sudo -H -u root airflow run docker_sample docker_op_tester 2018-03-01T15:14:55.699668 --job_id 2 --raw -sd DAGS_FOLDER/docker-operator.py --cfg_path /tmp/tmpignV9B']
{code}

It uses sudo to switch to the root Linux user; unfortunately, that new session won't have access to the environment variables that were injected to override the config. This is important for people who inject variables into a Docker container at run time while trying to maintain a level of security around database credentials.
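The symptom can be reproduced without root: sudo's default env_reset gives the child a scrubbed environment, and {{env -i}} does the same thing, so this sketch (the variable name is just an example, any AIRFLOW__ override behaves the same) shows the override silently disappearing:

{code:bash}
# Example override; with sudo's env_reset it never reaches the child process
export AIRFLOW__CORE__PARALLELISM=8

# Visible in the current shell:
echo "${AIRFLOW__CORE__PARALLELISM:-unset}"

# env -i mimics sudo's env_reset: the child sees a scrubbed environment,
# so Airflow would silently fall back to whatever airflow.cfg says
env -i bash -c 'echo "${AIRFLOW__CORE__PARALLELISM:-unset}"'
{code}

The first echo prints the override, the second prints "unset".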

I think a decent proposal, made by [~ashb] in Gitter, would be to automatically pass all environment variables starting with *AIRFLOW__* through to the target user. Please let me know if y'all want any help on the documentation, or point me in the right direction and I could create a PR.

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)