Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/11/21 21:13:20 UTC

[GitHub] [airflow] letyndr opened a new issue #12537: Mounting directories using docker operator on airflow is not working

letyndr opened a new issue #12537:
URL: https://github.com/apache/airflow/issues/12537


   
   **Apache Airflow version**: apache-airflow==1.10.12
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): Does not apply
   
   **Environment**: 
   
   - **Cloud provider or hardware configuration**:
   - **OS** (e.g. from /etc/os-release): Ubuntu 18.04.5 LTS bionic
   - **Kernel** (e.g. `uname -a`): Linux letyndr-letyndr 4.15.0-123-generic #126-Ubuntu SMP Wed Oct 21 09:40:11 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
   - **Install tools**:
   - **Others**:
   
   **What happened**:
   
   I'm trying to use the Docker operator to automate the execution of some scripts with Airflow.
   
   What I want to do is "copy" all of my project's files and folders into the container using this code.
   
   The following file, ml-intermediate.py, lives at ~/airflow/dags/ml-intermediate.py:
   
   
   
   ```
   """
   Template to convert a Ploomber DAG to Airflow
   """
   from airflow import DAG
   from airflow.operators.bash_operator import BashOperator
   from airflow.utils.dates import days_ago
   
   from ploomber.spec import DAGSpec
   from soopervisor.script.ScriptConfig import ScriptConfig
   
   script_cfg = ScriptConfig.from_path('/home/letyndr/airflow/dags/ml-intermediate')
   # Replace the project root to reflect the new location - or maybe just
   # write a soopervisor.yaml, then we can get rid of this line
   script_cfg.paths.project = '/home/letyndr/airflow/dags/ml-intermediate'
   
   # TODO: use lazy_import from script_cfg
   dag_ploomber = DAGSpec('/home/letyndr/airflow/dags/ml-intermediate/pipeline.yaml',
                          lazy_import=True).to_dag()
   dag_ploomber.name = "ML Intermediate"
   
   default_args = {
       'start_date': days_ago(0),
   }
   
   dag_airflow = DAG(
       dag_ploomber.name.replace(' ', '-'),
       default_args=default_args,
       description='Ploomber dag',
       schedule_interval=None,
   )
   
   script_cfg.save_script()
   
   from airflow.operators.docker_operator import DockerOperator
   # Create one DockerOperator per Ploomber task, sharing the project
   # directory with the container through bind mounts
   for task_name in dag_ploomber:
       DockerOperator(task_id=task_name,
           image="continuumio/miniconda3",
           api_version="auto",
           auto_remove=True,
           # command="sh /home/letyndr/airflow/dags/ml-intermediate/script.sh",
           command="sleep 600",
           docker_url="unix://var/run/docker.sock",
           volumes=[
               "/home/letyndr/airflow/dags/ml-intermediate:/home/letyndr/airflow/dags/ml-intermediate:rw",
               "/home/letyndr/airflow-data/ml-intermediate:/home/letyndr/airflow-data/ml-intermediate:rw"
           ],
           working_dir=script_cfg.paths.project,
           dag=dag_airflow,
           container_name=task_name,
       )
   
   
   
   # Replicate the Ploomber task dependencies in the Airflow DAG
   for task_name in dag_ploomber:
       task_ploomber = dag_ploomber[task_name]
       task_airflow = dag_airflow.get_task(task_name)
   
       for upstream in task_ploomber.upstream:
           task_airflow.set_upstream(dag_airflow.get_task(upstream))
   
   dag = dag_airflow
   ```
   
   When I execute this DAG using Airflow, I get an error saying that Docker cannot find the `/home/letyndr/airflow/dags/ml-intermediate/script.sh` script. I changed the Docker operator's command to `sleep 600` so I could enter the container and check whether the files were present at the correct paths.
   
   **What you expected to happen**: Basically, to share the host's files with the Docker container so that a shell script can be executed within the container.
   
   When I'm inside the container I can navigate to /home/letyndr/airflow/dags/ml-intermediate/, for example, but I don't see the files that are supposed to be there.
   
   **What do you think went wrong?** 
   
   I tried to replicate how Airflow uses the Docker SDK for Python.
   
   This is my replication of the Docker implementation:
   
   ```
   import docker
   
   client = docker.APIClient()
   
   # binds can also be expressed as a dict (equivalent to the list form below):
   # binds = {
   #     "/home/letyndr/airflow/dags": {
   #         "bind": "/home/letyndr/airflow/dags",
   #         "mode": "rw"
   #     },
   #     "/home/letyndr/airflow-data/ml-intermediate": {
   #         "bind": "/home/letyndr/airflow-data/ml-intermediate",
   #         "mode": "rw"
   #     }
   # }
   
   binds = [
       "/home/letyndr/airflow/dags:/home/letyndr/airflow/dags:rw",
       "/home/letyndr/airflow-data/ml-intermediate:/home/letyndr/airflow-data/ml-intermediate:rw",
   ]
   
   container = client.create_container(
       image="continuumio/miniconda3",
       command="sleep 600",
       volumes=["/home/letyndr/airflow/dags", "/home/letyndr/airflow-data/ml-intermediate"],
       host_config=client.create_host_config(binds=binds),
       working_dir="/home/letyndr/airflow/dags",
       name="simple_example",
   )
   
   client.start(container=container.get("Id"))
   
   ```
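   
   To double-check whether a bind actually took effect, inspecting the container is handy. A small sketch using the same low-level client as above (the `Mounts` key is part of the `docker inspect` output):
   
   ```
   # Print the mounts Docker actually attached to the container.
   info = client.inspect_container(container.get("Id"))
   for mount in info.get("Mounts", []):
       print(mount["Source"], "->", mount["Destination"], mount.get("Mode"))
   ```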
   What I found is that mounting volumes only works if both `host_config` and `volumes` are set; the problem is that the Airflow implementation sets `host_config` but not `volumes`. Once I added the `volumes` parameter to the `create_container` call, it worked.
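   
   To make the difference concrete, here is a self-contained sketch of the two variants (container names are placeholders; the behaviour in the comments is what I observed in my tests):
   
   ```
   import docker
   
   client = docker.APIClient()
   binds = ["/home/letyndr/airflow/dags:/home/letyndr/airflow/dags:rw"]
   
   # Variant 1: binds only, no `volumes` - the mount does not show up
   # inside the container.
   broken = client.create_container(
       image="continuumio/miniconda3",
       command="sleep 600",
       host_config=client.create_host_config(binds=binds),
       name="bind_without_volumes",
   )
   
   # Variant 2: binds plus `volumes` listing the container-side paths -
   # the mount works as expected.
   fixed = client.create_container(
       image="continuumio/miniconda3",
       command="sleep 600",
       volumes=[b.split(":")[1] for b in binds],
       host_config=client.create_host_config(binds=binds),
       name="bind_with_volumes",
   )
   ```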
   
   
   
   **How to reproduce it**:
   Mount a volume from the host with the `DockerOperator` and try to use the files inside that directory from within the Docker container; a minimal sketch follows below.
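   
   A minimal sketch of the reproduction (paths and the DAG id are placeholders; on my setup the bind declared in `volumes` is not visible inside the container):
   
   ```
   from airflow import DAG
   from airflow.operators.docker_operator import DockerOperator
   from airflow.utils.dates import days_ago
   
   dag = DAG('docker-mount-repro', start_date=days_ago(0), schedule_interval=None)
   
   # Expected: the container lists the host files; observed: the directory is empty.
   repro = DockerOperator(
       task_id='repro',
       image='continuumio/miniconda3',
       api_version='auto',
       command='ls /host-dir',
       docker_url='unix://var/run/docker.sock',
       volumes=['/tmp/host-dir:/host-dir:rw'],
       dag=dag,
   )
   ```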
   
   





[GitHub] [airflow] boring-cyborg[bot] commented on issue #12537: Mounting directories using docker operator on airflow is not working

boring-cyborg[bot] commented on issue #12537:
URL: https://github.com/apache/airflow/issues/12537#issuecomment-731637390


   Thanks for opening your first issue here! Be sure to follow the issue template!
   





[GitHub] [airflow] bgsthiago commented on issue #12537: Mounting directories using docker operator on airflow is not working

bgsthiago commented on issue #12537:
URL: https://github.com/apache/airflow/issues/12537#issuecomment-830190630


   Apache Airflow version: apache-airflow==2.0.1
   
   I was facing the same issue: I tried to mount a volume, but when I executed the Python script it didn't find the file.
   
   ```
   t_docker = DockerOperator(
       task_id='docker_command',
       image='pipeline:latest',
       api_version='auto',
       auto_remove=True,
       volumes=['/opt/airflow/pipelines/book/:/book/:rw'],
       command=['python', '/book/exec.py'],
       do_xcom_push=False,
       docker_url='unix://var/run/docker.sock',
       network_mode='bridge',
   )
   ```
   
   I tried passing `rw` in `volumes` too, and now exec.py exists, but Python says the file does not have a `__main__`; it seems like all the files in `book/` are directories.
   
   `[2021-04-30 15:27:20,858] {docker.py:263} INFO - /usr/local/bin/python: can't find 'main' module in '/book/exec.py'`





[GitHub] [airflow] potiuk closed issue #12537: Mounting directories using docker operator on airflow is not working

potiuk closed issue #12537:
URL: https://github.com/apache/airflow/issues/12537


   





[GitHub] [airflow] uranusjr commented on issue #12537: Mounting directories using docker operator on airflow is not working

uranusjr commented on issue #12537:
URL: https://github.com/apache/airflow/issues/12537#issuecomment-839655584


   @bgsthiago Try removing the trailing slashes? Docker really does not like them sometimes, see https://stackoverflow.com/a/38585245/1376863
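   
   For example (an untested guess at the fix), the same operator with the trailing slashes dropped from the bind string:
   
   ```
   t_docker = DockerOperator(
       task_id='docker_command',
       image='pipeline:latest',
       api_version='auto',
       auto_remove=True,
       volumes=['/opt/airflow/pipelines/book:/book:rw'],  # no trailing slashes
       command=['python', '/book/exec.py'],
       do_xcom_push=False,
       docker_url='unix://var/run/docker.sock',
       network_mode='bridge',
   )
   ```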
   
   As a meta-issue, maybe we should deprecate `volumes` altogether and migrate to the `mounts` API instead (i.e. [`docker --mount`](https://docs.docker.com/storage/bind-mounts/#choose-the--v-or---mount-flag)). It’s much easier for users to get those right than binds (`-v`).
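   
   For reference, a rough sketch of what the bind above could look like through the mounts API in docker-py (using the low-level client as in the replication earlier in this thread; paths are taken from the example above):
   
   ```
   import docker
   from docker.types import Mount
   
   client = docker.APIClient()
   
   # A bind mount expressed as a Mount object instead of a "-v"-style string;
   # `target` is the container path, `source` is the host path.
   book_mount = Mount(target="/book", source="/opt/airflow/pipelines/book",
                      type="bind", read_only=False)
   
   container = client.create_container(
       image="pipeline:latest",
       command=["python", "/book/exec.py"],
       host_config=client.create_host_config(mounts=[book_mount]),
   )
   client.start(container=container.get("Id"))
   ```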

