Posted to commits@airflow.apache.org by "Andreas Merkel (JIRA)" <ji...@apache.org> on 2016/05/10 07:03:12 UTC
[jira] [Updated] (AIRFLOW-90) `[celery]` section needed even if CeleryExecutor not used
[ https://issues.apache.org/jira/browse/AIRFLOW-90?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andreas Merkel updated AIRFLOW-90:
----------------------------------
Description:
Now that you know a little about me, let me tell you about the issue I am having:
* What did you expect to happen?
I expect that if I don't use the Celery Executor, I don't need a {{[celery]}} section in my {{airflow.cfg}}.
* What happened instead?
If I remove the section, Airflow does not start. At the very least I need
{code}
[celery]
celeryd_concurrency = 1
{code}
regardless of the executor configured.
* Stack trace, if appropriate:
{code}
Traceback (most recent call last):
File "/home/PHI-TPS/amerkel/venv/airflow/bin/airflow", line 13, in <module>
parser = get_parser()
File "/home/PHI-TPS/amerkel/venv/airflow/local/lib/python2.7/site-packages/airflow/bin/cli.py", line 751, in get_parser
default=configuration.get('celery', 'celeryd_concurrency'))
File "/home/PHI-TPS/amerkel/venv/airflow/local/lib/python2.7/site-packages/airflow/configuration.py", line 520, in get
return conf.get(section, key, **kwargs)
File "/home/PHI-TPS/amerkel/venv/airflow/local/lib/python2.7/site-packages/airflow/configuration.py", line 428, in get
"in config".format(**locals()))
airflow.configuration.AirflowConfigException: section/key [celery/celeryd_concurrency] not found in config
{code}
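The trace shows that {{cli.py}} reads {{celery/celeryd_concurrency}} unconditionally while building the argument parser, so the lookup fails before any executor logic runs. A minimal sketch of the failure mode, using plain {{ConfigParser}} rather than Airflow's actual configuration module (the {{fallback}} value here is illustrative, not Airflow's default):

```python
from configparser import ConfigParser, NoSectionError

# Stand-in for an airflow.cfg that has no [celery] section.
cfg = ConfigParser()
cfg.read_string("""
[core]
executor = SequentialExecutor
""")

# What the CLI effectively does: read the celery option unconditionally
# while constructing the parser. With the section absent, this raises.
try:
    cfg.get("celery", "celeryd_concurrency")
    raised = False
except NoSectionError:
    raised = True

# A fallback default would avoid the hard failure when the section is
# missing (hypothetical fix direction; "16" is an illustrative value).
concurrency = cfg.get("celery", "celeryd_concurrency", fallback="16")
# raised is True, concurrency is "16"
```

This is why the single {{celeryd_concurrency}} key is enough to make startup succeed: only that key is read at parser-construction time.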
h2. Reproducing the Issue
Here is how you can reproduce this issue on your machine.
* Example code that reproduces the issue, including a minimally illustrative DAG if necessary:
{{airflow.cfg}}:
{code}
[core]
airflow_home = $PWD
dags_folder = $PWD/dags
base_log_folder = $PWD/airflow_logs
plugins_folder = $PWD/plugins
executor = SequentialExecutor
sql_alchemy_conn = sqlite:///airflow.db
parallelism = 8
dag_concurrency = 4
max_active_runs_per_dag = 4
load_examples = False
donot_pickle = False
fernet_key = ; provided via environment
[webserver]
expose_config = true
authenticate = False
filter_by_owner = False
[scheduler]
job_heartbeat_sec = 5
scheduler_heartbeat_sec = 5
{code}
* Reproduction steps:
1. Configure {{airflow.cfg}} as above
2. run {{airflow}} from the console
3. see the stack trace
was:
Now that you know a little about me, let me tell you about the issue I am having:
* What did you expect to happen?
I expect that if I don't use the Celery Executor, I don't need a {{[celery]}} section in my {{airflow.cfg}}.
* What happened instead?
If I remove the section, Airflow does not start. At the very least I need
{code}
[celery]
celeryd_concurrency = 1
{code}
regardless of the executor configured.
* Stack trace, if appropriate:
{code}
Traceback (most recent call last):
File "/home/PHI-TPS/amerkel/venv/airflow/bin/airflow", line 13, in <module>
parser = get_parser()
File "/home/PHI-TPS/amerkel/venv/airflow/local/lib/python2.7/site-packages/airflow/bin/cli.py", line 751, in get_parser
default=configuration.get('celery', 'celeryd_concurrency'))
File "/home/PHI-TPS/amerkel/venv/airflow/local/lib/python2.7/site-packages/airflow/configuration.py", line 520, in get
return conf.get(section, key, **kwargs)
File "/home/PHI-TPS/amerkel/venv/airflow/local/lib/python2.7/site-packages/airflow/configuration.py", line 428, in get
"in config".format(**locals()))
airflow.configuration.AirflowConfigException: section/key [celery/celeryd_concurrency] not found in config
{code}
.h2 Reproducing the Issue
Here is how you can reproduce this issue on your machine.
* Example code that reproduces the issue, including a minimally illustrative DAG if necessary:
{{airflow.cfg}}:
{code}
[core]
airflow_home = $PWD
dags_folder = $PWD/dags
base_log_folder = $PWD/airflow_logs
plugins_folder = $PWD/plugins
executor = SequentialExecutor
sql_alchemy_conn = sqlite:///airflow.db
parallelism = 8
dag_concurrency = 4
max_active_runs_per_dag = 4
load_examples = False
donot_pickle = False
fernet_key = ; provided via environment
[webserver]
expose_config = true
authenticate = False
filter_by_owner = False
[scheduler]
job_heartbeat_sec = 5
scheduler_heartbeat_sec = 5
{code}
* Reproduction steps:
1. Configure {{airflow.cfg}} as above
2. run {{airflow}} from the console
3. see the stack trace
> `[celery]` section needed even if CeleryExecutor not used
> ---------------------------------------------------------
>
> Key: AIRFLOW-90
> URL: https://issues.apache.org/jira/browse/AIRFLOW-90
> Project: Apache Airflow
> Issue Type: Bug
> Components: celery
> Affects Versions: Airflow 1.7.0
> Environment: * Airflow version: 1.7.0
> * Airflow components: webserver and scheduler with a postgres database and LocalExecutor
> * Relevant {{airflow.cfg}} settings: no {{[celery]}} section
> * Python Version: Python 2.7.3
> * Operating System: Linux ubu91 3.13.0-85-generic #129~precise1-Ubuntu SMP Fri Mar 18 17:38:08 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
> * Python packages:
> {code}
> Babel==1.3
> Flask==0.10.1
> Flask-Admin==1.4.0
> Flask-Bcrypt==0.7.1
> Flask-Cache==0.13.1
> Flask-Login==0.2.11
> Flask-WTF==0.12
> JPype1==0.6.1
> JayDeBeApi==0.2.0
> Jinja2==2.8
> Mako==1.0.4
> Markdown==2.6.6
> MarkupSafe==0.23
> PyHive==0.1.7
> PySmbClient==0.1.3
> Pygments==2.1.3
> SQLAlchemy==1.0.12
> Sphinx==1.4
> Sphinx-PyPI-upload==0.2.1
> WTForms==2.1
> Werkzeug==0.11.5
> airflow==1.7.0
> alabaster==0.7.7
> alembic==0.8.5
> amqp==1.4.9
> anyjson==0.3.3
> argparse==1.2.1
> backports.ssl-match-hostname==3.5.0.1
> bcrypt==2.0.0
> billiard==3.3.0.23
> boto==2.39.0
> celery==3.1.23
> certifi==2016.2.28
> cffi==1.5.2
> chartkick==0.4.2
> check-manifest==0.31
> coverage==4.0.3
> coveralls==1.1
> croniter==0.3.12
> cryptography==1.3.1
> decorator==4.0.9
> devpi-client==2.5.0
> devpi-common==2.0.8
> dill==0.2.5
> docker-py==1.7.2
> docopt==0.6.2
> docutils==0.12
> enum34==1.1.2
> filechunkio==1.6
> flake8==2.5.4
> flower==0.9.0
> funcsigs==0.4
> future==0.15.2
> futures==3.0.5
> gunicorn==19.3.0
> hdfs==2.0.5
> hive-thrift-py==0.0.1
> idna==2.1
> imagesize==0.7.0
> ipaddress==1.0.16
> ipython==4.1.2
> ipython-genutils==0.1.0
> itsdangerous==0.24
> kombu==3.0.35
> ldap3==1.2.2
> lxml==3.6.0
> mccabe==0.4.0
> mock==1.3.0
> mysqlclient==1.3.7
> nose==1.3.7
> nose-exclude==0.4.1
> numpy==1.11.0
> pandas==0.18.0
> path.py==8.1.2
> pbr==1.8.1
> pep8==1.7.0
> pexpect==4.0.1
> pickleshare==0.6
> pkginfo==1.2.1
> pluggy==0.3.1
> psycopg2==2.6.1
> ptyprocess==0.5.1
> py==1.4.31
> pyOpenSSL==16.0.0
> pyasn1==0.1.9
> pycparser==2.14
> pydruid==0.2.3
> pyflakes==1.0.0
> pykerberos==1.1.10
> pytest==2.9.1
> pytest-cov==2.2.1
> python-dateutil==2.5.2
> python-editor==1.0
> pytz==2016.3
> redis==2.10.5
> requests==2.9.1
> setproctitle==1.1.9
> simplegeneric==0.8.1
> six==1.10.0
> slackclient==1.0.0
> snowballstemmer==1.2.1
> sphinx-argparse==0.1.15
> sphinx-rtd-theme==0.1.9
> statsd==3.2.1
> thrift==0.9.3
> tornado==4.2
> tox==2.3.1
> traitlets==4.2.1
> unicodecsv==0.14.1
> virtualenv==15.0.1
> websocket-client==0.35.0
> wheel==0.29.0
> wsgiref==0.1.2
> {code}
> Reporter: Andreas Merkel
> Priority: Trivial
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)