Posted to commits@airflow.apache.org by "Shivakumar Gopalakrishnan (JIRA)" <ji...@apache.org> on 2018/08/30 16:17:00 UTC

[jira] [Created] (AIRFLOW-2986) Airflow Worker does not reach sqs

Shivakumar Gopalakrishnan created AIRFLOW-2986:
--------------------------------------------------

             Summary: Airflow Worker does not reach sqs
                 Key: AIRFLOW-2986
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-2986
             Project: Apache Airflow
          Issue Type: Bug
         Environment: amazon linux
            Reporter: Shivakumar Gopalakrishnan


I am running the Airflow worker service. The service is not able to connect to SQS.

The scheduler is able to reach and write to the queue.

Proxies are fine; I have tried this with both Python 2.7 and 3.5 and hit the same issue.
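For reference, a minimal connectivity check along the same path the worker takes: kombu's SQS transport goes through pycurl (see the traceback below). This is only a sketch; the endpoint comes from the error message, and the proxy URL in the comment is a placeholder.

import io
import pycurl

# Minimal sketch: hit the SQS endpoint through pycurl, the same HTTP client
# kombu's SQS transport uses. Any HTTP status back (even 403) means the
# TCP/TLS path works; a connect timeout reproduces the failure below.
buf = io.BytesIO()
curl = pycurl.Curl()
curl.setopt(pycurl.URL, "https://eu-west-1.queue.amazonaws.com/")
curl.setopt(pycurl.WRITEFUNCTION, buf.write)
curl.setopt(pycurl.CONNECTTIMEOUT, 10)
# libcurl reads https_proxy/HTTPS_PROXY from the environment; a proxy can
# also be set explicitly (placeholder URL):
# curl.setopt(pycurl.PROXY, "http://proxy.example.com:8080")
curl.perform()
print("HTTP status:", curl.getinfo(pycurl.RESPONSE_CODE))
curl.close()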

A copy of the log is below:

starting airflow-worker...
/data/share/airflow
/data/share/airflow/airflow.cfg
[2018-08-30 15:41:44,367] {settings.py:146} DEBUG - Setting up DB connection pool (PID 12304)
[2018-08-30 15:41:44,367] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-08-30 15:41:44,468] {__init__.py:42} DEBUG - Cannot import due to doesn't look like a module path
[2018-08-30 15:41:44,875] {__init__.py:51} INFO - Using executor CeleryExecutor
[2018-08-30 15:41:44,886] {cli_action_loggers.py:40} DEBUG - Adding <function default_action_log at 0x7ff3a1dc2598> to pre execution callback
[2018-08-30 15:41:44,995] {cli_action_loggers.py:64} DEBUG - Calling callbacks: [<function default_action_log at 0x7ff3a1dc2598>]
[2018-08-30 15:41:45,768] {settings.py:146} DEBUG - Setting up DB connection pool (PID 12308)
[2018-08-30 15:41:45,768] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-08-30 15:41:45,883] {__init__.py:42} DEBUG - Cannot import due to doesn't look like a module path
[2018-08-30 15:41:46,345] {__init__.py:51} INFO - Using executor CeleryExecutor
[2018-08-30 15:41:46,358] {cli_action_loggers.py:40} DEBUG - Adding <function default_action_log at 0x7f9e7b62e598> to pre execution callback
[2018-08-30 15:41:46,476] {cli_action_loggers.py:64} DEBUG - Calling callbacks: [<function default_action_log at 0x7f9e7b62e598>]
Starting flask
[2018-08-30 15:41:46,519] {_internal.py:88} INFO - * Running on http://0.0.0.0:8793/ (Press CTRL+C to quit)
[2018-08-30 15:43:58,779: CRITICAL/MainProcess] Unrecoverable error: Exception('Request Empty body HTTP 599 Failed to connect to eu-west-1.queue.amazonaws.com port 443: Connection timed out (None)',)
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/celery/worker/worker.py", line 207, in start
    self.blueprint.start(self)
  File "/usr/local/lib/python3.5/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/usr/local/lib/python3.5/site-packages/celery/bootsteps.py", line 370, in start
    return self.obj.start()
  File "/usr/local/lib/python3.5/site-packages/celery/worker/consumer/consumer.py", line 316, in start
    blueprint.start(self)
  File "/usr/local/lib/python3.5/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/usr/local/lib/python3.5/site-packages/celery/worker/consumer/consumer.py", line 592, in start
    c.loop(*c.loop_args())
  File "/usr/local/lib/python3.5/site-packages/celery/worker/loops.py", line 91, in asynloop
    next(loop)
  File "/usr/local/lib/python3.5/site-packages/kombu/asynchronous/hub.py", line 354, in create_loop
    cb(*cbargs)
  File "/usr/local/lib/python3.5/site-packages/kombu/asynchronous/http/curl.py", line 114, in on_writable
    return self._on_event(fd, _pycurl.CSELECT_OUT)
  File "/usr/local/lib/python3.5/site-packages/kombu/asynchronous/http/curl.py", line 124, in _on_event
    self._process_pending_requests()
  File "/usr/local/lib/python3.5/site-packages/kombu/asynchronous/http/curl.py", line 132, in _process_pending_requests
    self._process(curl, errno, reason)
  File "/usr/local/lib/python3.5/site-packages/kombu/asynchronous/http/curl.py", line 178, in _process
    buffer=buffer, effective_url=effective_url, error=error,
  File "/usr/local/lib/python3.5/site-packages/vine/promises.py", line 150, in __call__
    svpending(*ca, **ck)
  File "/usr/local/lib/python3.5/site-packages/vine/promises.py", line 143, in __call__
    return self.throw()
  File "/usr/local/lib/python3.5/site-packages/vine/promises.py", line 140, in __call__
    retval = fun(*final_args, **final_kwargs)
  File "/usr/local/lib/python3.5/site-packages/vine/funtools.py", line 100, in _transback
    return callback(ret)
  File "/usr/local/lib/python3.5/site-packages/vine/promises.py", line 143, in __call__
    return self.throw()
  File "/usr/local/lib/python3.5/site-packages/vine/promises.py", line 140, in __call__
    retval = fun(*final_args, **final_kwargs)
  File "/usr/local/lib/python3.5/site-packages/vine/funtools.py", line 98, in _transback
    callback.throw()
  File "/usr/local/lib/python3.5/site-packages/vine/funtools.py", line 96, in _transback
    ret = filter_(*args + (ret,), **kwargs)
  File "/usr/local/lib/python3.5/site-packages/kombu/asynchronous/aws/connection.py", line 233, in _on_list_ready
    raise self._for_status(response, response.read())
Exception: Request Empty body HTTP 599 Failed to connect to eu-west-1.queue.amazonaws.com port 443: Connection timed out (None)

-------------- celery@ip-10-92-19-197 v4.1.1 (latentcall)
---- **** -----
--- * *** * -- Linux-4.9.76-3.78.amzn1.x86_64-x86_64-with-glibc2.3.4 2018-08-30 15:41:45
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: airflow.executors.celery_executor:0x7ff3a2ab4128
- ** ---------- .> transport: sqs://localhost//
- ** ---------- .> results: postgresql://airflowdbv4:**@airflowdb.tst.aegon.io/airflowserverdbv4
- *** --- * --- .> concurrency: 16 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
 .> anl-sqs-tst-dp-airflowkfdkf exchange=anl-sqs-tst-dp-airflowkfdkf(direct) key=anl-sqs-tst-dp-airflowkfdkf


starting airflow-worker...
/data/share/airflow
/data/share/airflow/airflow.cfg
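Since the scheduler can write to the same queue, comparing the pycurl check above with a boto3 call (which goes through botocore and its own proxy handling) may help narrow down where the two paths differ. This is only a diagnostic sketch; the region and queue name are taken from the log, and credentials are assumed to come from the instance role or environment.

import boto3

# Diagnostic sketch: resolve the queue URL via boto3/botocore.
# Region and queue name are taken from the log above.
sqs = boto3.client("sqs", region_name="eu-west-1")
resp = sqs.get_queue_url(QueueName="anl-sqs-tst-dp-airflowkfdkf")
print(resp["QueueUrl"])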


