Posted to dev@airflow.apache.org by David Montgomery <da...@gmail.com> on 2016/08/21 21:46:13 UTC

airflow supervisord scripts do not work

Hi,

It's great that one can start the webserver from the command line; however, the
upstart scripts do not work.


This is the relevant line in
https://github.com/apache/incubator-airflow/blob/master/scripts/upstart/airflow-webserver.conf:

exec usr/local/bin/airflow webserver

But I want to use supervisor instead, and below is the terrible error I get:

Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 15, in <module>
    args.func(args)
  File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 423, in webserver
    'gunicorn', run_args
  File "/usr/lib/python2.7/os.py", line 344, in execvp
    _execvpe(file, args)
  File "/usr/lib/python2.7/os.py", line 380, in _execvpe
    func(fullname, *argrest)
OSError: [Errno 2] No such file or directory
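For context, the call that fails here is `os.execvp('gunicorn', run_args)` in
cli.py (line 423 in the traceback): `execvp` searches each directory on PATH
for the executable, so `[Errno 2]` means `gunicorn` could not be found on the
PATH the process inherited from supervisord. A minimal sketch of that failure
mode (the PATH value below is an assumption for illustration):

```python
import os

# os.execvp() resolves its first argument against the directories on
# PATH. Under supervisord the inherited PATH is often minimal, so the
# lookup can fail even though gunicorn is installed elsewhere.
os.environ["PATH"] = "/usr/bin:/bin"  # assumed supervisord-style PATH

try:
    os.execvp("gunicorn", ["gunicorn", "--version"])
    # (if gunicorn were on this PATH, the process would be replaced)
except OSError as err:
    print(err)  # [Errno 2] No such file or directory
```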


[program:airflow]
command = /usr/local/bin/airflow webserver -p 80

process_name=%(program_name)s
autostart=true
autorestart=true
stopsignal=KILL
stopasgroup = true
killasgroup = true


stdout_logfile=/tmp/airflow.log
stdout_logfile_maxbytes=1MB
stdout_logfile_backups=1

stderr_logfile = /tmp/airflow.err
stderr_logfile_maxbytes=1MB
stderr_logfile_backups=1


[group:airflow_server]
programs=airflow

Re: airflow supervisord scripts do not work

Posted by Wang Yajun <kw...@gmail.com>.
Hi David

Airflow needs `gunicorn` as its server, so you should first check whether
`gunicorn` exists on that machine.

If it is installed, you may need to add an environment setting to the
supervisord script, such as:

```ini
[program:airflow]
....
environment=PATH=your/gunicorn/path:%(ENV_PATH)s
```
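`which gunicorn` on the target machine shows the directory to substitute for
that placeholder. A slightly fuller sketch of the same idea, assuming gunicorn
lives in `/usr/local/bin` (adjust the path to your install):

```ini
[program:airflow]
command=/usr/local/bin/airflow webserver -p 80
; Prepend the directory that `which gunicorn` reports (assumed to be
; /usr/local/bin here) so the execvp() lookup succeeds.
environment=PATH="/usr/local/bin:%(ENV_PATH)s"
autostart=true
autorestart=true
```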





Re: airflow supervisord scripts do not work

Posted by siddharth anand <sa...@apache.org>.
Hi David,
Thanks for reporting this.

I don't use supervisor, but the assumption that airflow lives at
`/usr/local/bin/airflow` seems like the problem. Please file a bug if you
haven't already, and document how you set up your environment and what your
assumptions are. Feel free to propose a fix via a PR. I've checked my stage and
prod envs, and I do see airflow under that path; I'm not 100% sure whether that
comes from setup.py or pip or both, since at some point both were run.

I am redoing my deployment code to follow a more "immutable infrastructure"
approach: we will leverage Ansible, Packer, and Elastic Container Registry
(ECR). If you are testing via virtualenv, then airflow will not be installed at
the path above, so that is definitely one case not covered.
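For that virtualenv case, pointing supervisord directly at the environment's
`bin` directory should cover it; a sketch assuming a virtualenv at
`/srv/airflow/venv` (a hypothetical path):

```ini
[program:airflow]
; Hypothetical virtualenv location; putting its bin/ first on PATH
; also lets the airflow entrypoint find gunicorn from the same venv.
command=/srv/airflow/venv/bin/airflow webserver -p 80
environment=PATH="/srv/airflow/venv/bin:%(ENV_PATH)s"
```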

-s
