Posted to commits@airflow.apache.org by "venkata Bonu (JIRA)" <ji...@apache.org> on 2019/08/14 14:12:00 UTC

[jira] [Created] (AIRFLOW-5213) DockerOperator failing when the docker default logging drivers are other than 'journald','json-file'

venkata Bonu created AIRFLOW-5213:
-------------------------------------

             Summary: DockerOperator failing when the docker default logging drivers are other than 'journald','json-file'
                 Key: AIRFLOW-5213
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5213
             Project: Apache Airflow
          Issue Type: Bug
          Components: DAG, operators
    Affects Versions: 1.10.4
            Reporter: venkata Bonu
            Assignee: venkata Bonu
         Attachments: Screen Shot 2019-08-14 at 7.10.01 AM.png

Background:

Docker can be configured with multiple logging drivers.
 * syslog
 * local
 * json-file
 * journald
 * gelf
 * fluentd
 * awslogs
 * splunk
 * etwlogs
 * gcplogs
 * logentries

However, reading Docker logs back is supported only with the local, json-file, and journald drivers.

Docker documentation: [https://docs.docker.com/config/containers/logging/configure/]
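For reference, the daemon-wide default driver is normally set in /etc/docker/daemon.json. A minimal illustrative example (the awslogs values below are placeholder assumptions, not from the reporter's environment) that would trigger the failure described here:

{code:json}
{
  "log-driver": "awslogs",
  "log-opts": {
    "awslogs-region": "us-east-1",
    "awslogs-group": "my-container-logs"
  }
}
{code}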

 

Description:

When the Docker daemon is configured with a logging driver other than local, json-file, or journald, Airflow tasks that use the DockerOperator fail with the error:

_docker.errors.APIError: 501 Server Error: Not Implemented ("configured logging driver does not support reading")_

The issue is in the following lines of code, where the operator tries to read the logs by attaching to the container:

{code:python}
line = ''
for line in self.cli.attach(container=self.container['Id'], stdout=True, stderr=True, stream=True):
    line = line.strip()
    if hasattr(line, 'decode'):
        line = line.decode('utf-8')
    self.log.info(line)
{code}
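One possible workaround (a sketch only, not the operator's current behaviour or a proposed patch) is to pin the container's own logging driver to json-file through docker-py's log_config host option, so the attach call can read output regardless of the daemon-wide default. The APIClient URL, image, and command below are placeholder assumptions:

{code:python}
import docker
from docker.types import LogConfig

# Low-level API client, as used internally by the DockerOperator.
cli = docker.APIClient(base_url='unix://var/run/docker.sock')

# Force json-file for this one container so its output can be read back,
# even if the daemon default (e.g. awslogs, splunk) does not support reading.
host_config = cli.create_host_config(
    log_config=LogConfig(type=LogConfig.types.JSON, config={})
)

container = cli.create_container(
    image='alpine:3.10',
    command='echo hello',
    host_config=host_config,
)
cli.start(container['Id'])

# With json-file in effect, attach no longer raises
# "configured logging driver does not support reading".
for line in cli.attach(container=container['Id'], stdout=True,
                       stderr=True, stream=True):
    if hasattr(line, 'decode'):
        line = line.decode('utf-8')
    print(line.strip())
{code}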

--
This message was sent by Atlassian JIRA
(v7.6.14#76016)