Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2019/07/29 14:21:06 UTC
[GitHub] [airflow] nuclearpinguin commented on issue #5681: [AIRFLOW-5065] Add colors to console log
URL: https://github.com/apache/airflow/pull/5681#issuecomment-516012256
In the UI you can spot some raw ANSI escape codes 😕
```
*** Reading local file: /airflow/logs/example_gcp_dataproc_create_cluster/create_cluster/2019-07-29T14:14:17.519593+00:00/1.log
[2019-07-29 14:16:35,052] {taskinstance.py:614} INFO - Dependencies all met for <TaskInstance: example_gcp_dataproc_create_cluster.create_cluster 2019-07-29T14:14:17.519593+00:00 [queued]>
[2019-07-29 14:16:35,061] {taskinstance.py:614} INFO - Dependencies all met for <TaskInstance: example_gcp_dataproc_create_cluster.create_cluster 2019-07-29T14:14:17.519593+00:00 [queued]>
[2019-07-29 14:16:35,061] {taskinstance.py:832} INFO -
--------------------------------------------------------------------------------
[2019-07-29 14:16:35,061] {taskinstance.py:833} INFO - Starting attempt 1 of 1
[2019-07-29 14:16:35,061] {taskinstance.py:834} INFO -
--------------------------------------------------------------------------------
[2019-07-29 14:16:35,079] {taskinstance.py:853} INFO - Executing <Task(DataprocClusterCreateOperator): create_cluster> on 2019-07-29T14:14:17.519593+00:00
[2019-07-29 14:16:35,079] {base_task_runner.py:127} INFO - Running: ['airflow', 'tasks', 'run', 'example_gcp_dataproc_create_cluster', 'create_cluster', '2019-07-29T14:14:17.519593+00:00', '--job_id', '4', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/example_gcp_dataproc_create_cluster.py', '--cfg_path', '/tmp/tmpjta0f2_e']
[2019-07-29 14:16:36,836] {base_task_runner.py:113} INFO - Job 4: Subtask create_cluster /workspace/airflow/models/dagbag.py:21: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
[2019-07-29 14:16:36,836] {base_task_runner.py:113} INFO - Job 4: Subtask create_cluster import imp
[2019-07-29 14:16:37,008] {base_task_runner.py:113} INFO - Job 4: Subtask create_cluster [[34m2019-07-29 14:16:37,008[0m] {[34msettings.py[0m:173} INFO - settings.configure_orm(): Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=78770
[2019-07-29 14:16:37,129] {base_task_runner.py:113} INFO - Job 4: Subtask create_cluster [[34m2019-07-29 14:16:37,128[0m] {[34m__init__.py[0m:51} INFO - Using executor [1mLocalExecutor[0m
[2019-07-29 14:16:37,743] {base_task_runner.py:113} INFO - Job 4: Subtask create_cluster [[34m2019-07-29 14:16:37,741[0m] {[34mdagbag.py[0m:86} INFO - Filling up the DagBag from [1m/airflow/dags/example_gcp_dataproc_create_cluster.py[0m
[2019-07-29 14:16:37,947] {base_task_runner.py:113} INFO - Job 4: Subtask create_cluster [[34m2019-07-29 14:16:37,946[0m] {[34mcli.py[0m:536} INFO - Running [1m<TaskInstance: example_gcp_dataproc_create_cluster.create_cluster 2019-07-29T14:14:17.519593+00:00 [running]>[0m on host [1md40eb3bb69b0[0m
[2019-07-29 14:16:37,966] {dataproc_operator.py:433} INFO - Creating cluster: cluster-tomasz36-build
[2019-07-29 14:16:37,967] {logging_mixin.py:98} INFO - [[34m2019-07-29 14:16:37,967[0m] {[34mgcp_api_base_hook.py[0m:98} INFO - Getting connection using `google.auth.default()` since no key file is defined for hook.
[2019-07-29 14:16:40,973] {logging_mixin.py:98} INFO - [[34m2019-07-29 14:16:40,973[0m] {[34m_metadata.py[0m:86} INFO - Compute Engine Metadata server unavailable on attempt 1 of 3
[2019-07-29 14:16:43,943] {logging_mixin.py:98} INFO - [[34m2019-07-29 14:16:43,943[0m] {[34m_metadata.py[0m:86} INFO - Compute Engine Metadata server unavailable on attempt 2 of 3
[2019-07-29 14:16:43,944] {logging_mixin.py:98} INFO - [[34m2019-07-29 14:16:43,944[0m] {[34m_metadata.py[0m:86} INFO - Compute Engine Metadata server unavailable on attempt 3 of 3
[2019-07-29 14:16:43,945] {taskinstance.py:1045} ERROR - Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
Traceback (most recent call last):
  File "/workspace/airflow/models/taskinstance.py", line 920, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/workspace/airflow/contrib/operators/dataproc_operator.py", line 66, in execute
    self.hook.wait(self.start())
  File "/workspace/airflow/contrib/operators/dataproc_operator.py", line 437, in start
    self.hook.get_conn().projects().regions().clusters().create( # pylint: disable=no-member
  File "/workspace/airflow/contrib/hooks/gcp_dataproc_hook.py", line 442, in get_conn
    http_authorized = self._authorize()
  File "/workspace/airflow/contrib/hooks/gcp_api_base_hook.py", line 145, in _authorize
    credentials = self._get_credentials()
  File "/workspace/airflow/contrib/hooks/gcp_api_base_hook.py", line 100, in _get_credentials
    credentials, _ = google.auth.default(scopes=scopes)
  File "/root/.virtualenvs/airflow36/lib/python3.6/site-packages/google/auth/_default.py", line 317, in default
    raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
```
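For context on the artifacts above: the bracketed sequences such as `[34m`, `[1m`, and `[0m` are the remnants of ANSI color escape codes (the leading ESC byte, `\x1b`, is not shown in the archived text). When a log file written with color codes is later rendered in the web UI, those codes should be stripped. A minimal sketch of such stripping in Python (the regex and helper name here are illustrative, not Airflow's actual implementation):

```python
import re

# Matches ANSI SGR (color/style) sequences like "\x1b[34m", "\x1b[1m", "\x1b[0m".
ANSI_SGR_RE = re.compile(r"\x1b\[[0-9;]*m")

def strip_ansi(line: str) -> str:
    """Remove ANSI color codes so a log line renders cleanly in a web UI."""
    return ANSI_SGR_RE.sub("", line)

colored = "\x1b[34m2019-07-29 14:16:37,128\x1b[0m INFO - Using executor \x1b[1mLocalExecutor\x1b[0m"
print(strip_ansi(colored))
# → 2019-07-29 14:16:37,128 INFO - Using executor LocalExecutor
```

This only matches SGR (styling) sequences; other escape sequences (cursor movement, etc.) would need a broader pattern, but color codes are the only kind emitted by a colored log formatter like the one in this PR.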
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
With regards,
Apache Git Services