Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/12/28 15:29:37 UTC

[GitHub] [airflow] Overbryd opened a new issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Overbryd opened a new issue #13343:
URL: https://github.com/apache/airflow/issues/13343


   **Apache Airflow version**: 2.0.0
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 
   
   ```console
   $ kubectl version
   Client Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.3", GitCommit:"1e11e4a2108024935ecfcb2912226cedeafd99df", GitTreeState:"clean", BuildDate:"2020-10-14T12:50:19Z", GoVersion:"go1.15.2", Compiler:"gc", Platform:"darwin/amd64"}
   Server Version: version.Info{Major:"1", Minor:"17+", GitVersion:"v1.17.14-gke.1600", GitCommit:"7c407f5cc8632f9af5a2657f220963aa7f1c46e7", GitTreeState:"clean", BuildDate:"2020-12-07T09:22:27Z", GoVersion:"go1.13.15b4", Compiler:"gc", Platform:"linux/amd64"}
   ```
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: GKE
   - **OS** (e.g. from /etc/os-release):
   - **Kernel** (e.g. `uname -a`):
   - **Install tools**:
   - **Others**:
   
   **What happened**:
   
   Remote logging is configured as follows:
   
   ```
   export AIRFLOW__LOGGING__REMOTE_LOGGING=True
   export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=gs://your-bucket-name
   export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=google_cloud_default
   ```
   
   It worked flawlessly before the upgrade to 2.0.0.
   Now it's utterly broken: it returns weird one-line logs, all prefixed with the same error message.
   
   All logs are broken in the same way:
   
   ```
   *** Reading remote log from gs://<your-bucket-name>/<dag_id>/<task_id>/2020-01-24T00:00:00+00:00/1.log.
   b'*** Previous log discarded: 404 GET https://storage.googleapis.com/download/storage/v1/b/<redacted>?alt=media: No such object: <redacted>/2020-01-24T00:00:00+00:00/1.log: (\'Request failed with status code\', 404, \'Expected one of\', <HTTPStatus.OK: 200>, <HTTPStatus.PARTIAL_CONTENT: 206>)\n\n[2020-12-28 14:57:51,263] {taskinstance.py:826} INFO - Dependencies all met for <TaskInstance: <redacted> 2020-01-24T00:00:00+00:00 [queued]>\n[2020-12-28 14:57:51,281] {taskinstance.py:826} INFO - Dependencies all met for <TaskInstance: <redacted> 2020-01-24T00:00:00+00:00 [queued]>\n[2020-12-28 14:57:51,281] {taskinstance.py:1017} INFO - \n--------------------------------------------------------------------------------\n[2020-12-28 14:57:51,281] {taskinstance.py:1018} INFO - Starting attempt 1 of 1\n[2020-12-28 14:57:51,281] {taskinstance.py:1019} INFO - \n--------------------------------------------------------------------------------\n[2020-12-28 14:57:51,305] {taskinstance.py:1038} <<
 <SNIP>>>\n'
   ```
   
   As you can see, the actual log is there, but it is rendered completely broken.
   At first I thought the log could not be written.
   But when I manually check the bucket, the log is actually there!
   
   I suspect the GCSTaskHandler / remote logging is broken in 2.0.0.
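For what it's worth, the symptom (the whole log collapsed into one `b'...'` line with literal `\n` escapes) looks like a bytes/str mix-up rather than a storage problem. A minimal illustration of that failure mode, not Airflow's actual code:

```python
# Minimal illustration of the suspected failure mode (not Airflow's actual
# code): interpolating raw bytes into a str renders their repr on one line,
# with newlines shown as literal \n escapes -- exactly the symptom above.
remote_log = b"line one\nline two\n"  # bytes, as returned by a blob download

broken = f"*** Previous log discarded: ...\n{remote_log}"
assert "b'line one\\nline two\\n'" in broken  # repr leaks into the output

fixed = remote_log.decode("utf-8")  # decoding first restores real newlines
assert fixed == "line one\nline two\n"
```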
   
   **What you expected to happen**:
   
   Remote logging to GCS works as advertised.
   
   **How to reproduce it**:
   
   Get a GCS bucket.
   
   Configure remote logging to GCS.
   
   ```
   export AIRFLOW__LOGGING__REMOTE_LOGGING=True
   export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=gs://your-bucket-name
   export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=google_cloud_default
   ```
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] SakuraSound commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
SakuraSound commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-755106802


   I am also seeing issues here
   
   **Apache Airflow version:** 2.0.0
   
   **Kubernetes version (if you are using kubernetes) (use kubectl version):**
   
   ```
   $ kubectl version
   Client Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.6", GitCommit:"dff82dc0de47299ab66c83c626e08b245ab19037", GitTreeState:"clean", BuildDate:"2020-07-16T00:04:31Z", GoVersion:"go1.14.4", Compiler:"gc", Platform:"darwin/amd64"}
   Server Version: version.Info{Major:"1", Minor:"16+", GitVersion:"v1.16.15-eks-ad4801", GitCommit:"ad4801fd44fe0f125c8d13f1b1d4827e8884476d", GitTreeState:"clean", BuildDate:"2020-10-20T23:27:12Z", GoVersion:"go1.13.15", Compiler:"gc", Platform:"linux/amd64"}
   ```
   
   **Environment:**
   
   **Cloud provider or hardware configuration:** AWS
   
   **What happened:**
   
   Remote logging is configured as follows:
   
   ```
   export AIRFLOW__LOGGING__REMOTE_LOGGING=True
   export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://arn:aws:s3:us-west-2:<AWS_ACCOUNT_ID>:accesspoint:<OUR_ACCESS_POINT>/logs
   export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=S3LogConn
   ```
   
   This was also working for us at 1.10.14, but stopped working in Airflow 2.0.0.
   
   What we noticed specifically is that if we run
   `airflow tasks run test_dag test now` from the web server or scheduler, we see logs in S3. But when we let Airflow run DAGs naturally, we see:
   
   ```
   [2021-01-05 19:12:41,205] {s3_task_handler.py:193} ERROR - Could not write logs to s3://arn:aws:s3:us-west-2:<AWS_ACCOUNT_ID>:accesspoint:<OUR_ACCESS_POINT>/logs/test_dag/test/2021-01-05T19:12:23.797669+00:00/1.log
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 186, in s3_write
       self.hook.load_string(
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper
       connection = self.get_connection(self.aws_conn_id)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/hooks/base.py", line 63, in get_connection
       conn = Connection.get_connection_from_secrets(conn_id)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets
       conn = secrets_backend.get_connection(conn_id=conn_id)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 64, in wrapper
       with create_session() as session:
     File "/usr/local/lib/python3.8/contextlib.py", line 113, in __enter__
       return next(self.gen)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 29, in create_session
       session = settings.Session()
   TypeError: 'NoneType' object is not callable
   ```
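The `TypeError: 'NoneType' object is not callable` at the bottom suggests `settings.Session` was still unset in the process doing the upload when the handler tried to open a DB session. A stripped-down sketch of that failure mode (hypothetical names, not Airflow's actual code):

```python
from contextlib import contextmanager

# Stands in for airflow.settings.Session before the ORM is configured:
Session = None

@contextmanager
def create_session():
    # Calling the factory while it is still None raises the TypeError
    # seen at the bottom of the traceback above.
    session = Session()
    try:
        yield session
        session.commit()
    except Exception:
        session.rollback()
        raise
    finally:
        session.close()

try:
    with create_session():
        pass
except TypeError as exc:
    print(exc)  # 'NoneType' object is not callable
```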
   
   


----------------------------------------------------------------



[GitHub] [airflow] Overbryd edited a comment on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
Overbryd edited a comment on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751751840






----------------------------------------------------------------



[GitHub] [airflow] Overbryd commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
Overbryd commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751862565


   @potiuk I still have the same problem. That `-` was just a typo in my issue; the problem persists. I have updated my issue accordingly.
   
   I do not have a permission problem; the logs do get written!
   I would appreciate some pointers on how to debug the problem instead of a blunt rejection of the problem at hand.
   
   Here is the log file written to the bucket:
   ![Screenshot 2020-12-28 at 9 55 07 PM](https://user-images.githubusercontent.com/21111/103242721-996d6d80-4957-11eb-9462-bb710d3e4b7d.png)
   
   Here is the same log output from Airflow, just scrambled:
   ![Screenshot 2020-12-28 at 9 55 19 PM](https://user-images.githubusercontent.com/21111/103242737-a7bb8980-4957-11eb-9c02-edd653a15eaf.png)
   
   Here is a screenshot showing the Airflow service account has "Storage Admin" on the same bucket:
   ![Screenshot 2020-12-28 at 9 56 02 PM](https://user-images.githubusercontent.com/21111/103242806-e05b6300-4957-11eb-9176-4d561718e578.png)
   
   What else do I have to do to get some proper attention? Like I said, I am very willing to help and debug the problem. However, I have no interest in helping out a project that does not take things seriously, like having a proper look at a bug report.


----------------------------------------------------------------



[GitHub] [airflow] mik-laj commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
mik-laj commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-763236585


   Today I had discussions with the Polidea team about this bug. @turbaszek  said that this problem is already fixed in the main branch. I can also confirm that this problem no longer occurs and that system tests are successful.  There are some more problems with the StackdriverTaskHandler, but I already have a patch ready and I'm testing it.
   
   ```
   root@f375df8a2fd5:/opt/airflow# pytest tests/providers/google/cloud/log/test_gcs_task_handler_system.py --system google -s
   ============================================================================================================================= test session starts ==============================================================================================================================
   platform linux -- Python 3.6.12, pytest-6.2.1, py-1.10.0, pluggy-0.13.1 -- /usr/local/bin/python
   cachedir: .pytest_cache
   rootdir: /opt/airflow, configfile: pytest.ini
   plugins: forked-1.3.0, timeouts-1.2.1, xdist-2.2.0, cov-2.10.1, instafail-0.4.2, flaky-3.7.0, requests-mock-1.8.0, rerunfailures-9.1.1, celery-4.4.7
   setup timeout: 0.0s, execution timeout: 0.0s, teardown timeout: 0.0s
   collected 1 item
   
   tests/providers/google/cloud/log/test_gcs_task_handler_system.py::TestGCSTaskHandlerSystemTest::test_should_read_logs ========================= AIRFLOW ==========================
   Home of the user: /root
   Airflow home /root/airflow
   Skipping initializing of the DB as it was initialized already.
   You can re-initialize the database by adding --with-db-init flag when running tests.
   [2021-01-20 00:17:42,536] {logging_command_executor.py:33} INFO - Executing: 'gcloud config set core/project polidea-airflow'
   [2021-01-20 00:17:44,288] {logging_command_executor.py:44} INFO - Stdout:
   [2021-01-20 00:17:44,288] {logging_command_executor.py:45} INFO - Stderr: Updated property [core/project].
   
   [2021-01-20 00:17:44,288] {logging_command_executor.py:33} INFO - Executing: 'gcloud auth activate-service-account --key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'
   [2021-01-20 00:17:46,231] {logging_command_executor.py:44} INFO - Stdout:
   [2021-01-20 00:17:46,231] {logging_command_executor.py:45} INFO - Stderr: Activated service account credentials for: [gcp-storage-account@polidea-airflow.iam.gserviceaccount.com]
   
   [2021-01-20 00:17:46,232] {logging_command_executor.py:33} INFO - Executing: 'gsutil mb gs://airflow-gcs-task-handler-tests-hiucqpbantyszrxv'
   [2021-01-20 00:17:51,253] {logging_command_executor.py:44} INFO - Stdout:
   [2021-01-20 00:17:51,254] {logging_command_executor.py:45} INFO - Stderr: Creating gs://airflow-gcs-task-handler-tests-hiucqpbantyszrxv/...
   
   [2021-01-20 00:17:55,495] {__init__.py:38} INFO - Loaded API auth backend: <module 'airflow.api.auth.backend.default' from '/opt/airflow/airflow/api/auth/backend/default.py'>
   [2021-01-20 00:17:58,331] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry. OpenTelemetry could not be imported; please add opentelemetry-api and opentelemetry-instrumentation packages in order to get BigQuery Tracing data.
   Created <DagRun example_complex @ 2021-01-20 00:18:00+00:00: manual__2021-01-20T00:18:00+00:00, externally triggered: True>
     ____________       _____________
    ____    |__( )_________  __/__  /________      __
   ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
   ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
    _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
   [2021-01-20 00:18:04,044] {scheduler_job.py:1239} INFO - Starting the scheduler
   [2021-01-20 00:18:04,044] {scheduler_job.py:1244} INFO - Processing each file at most -1 times
   [2021-01-20 00:18:04,232] {dag_processing.py:250} INFO - Launched DagFileProcessorManager with pid: 31561
   [2021-01-20 00:18:04,235] {scheduler_job.py:1748} INFO - Resetting orphaned tasks for active dag runs
   [2021-01-20 00:18:04,290] {settings.py:52} INFO - Configured default timezone Timezone('UTC')
   [2021-01-20 00:18:07,141] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry. OpenTelemetry could not be imported; please add opentelemetry-api and opentelemetry-instrumentation packages in order to get BigQuery Tracing data.
   [2021-01-20 00:18:08,796] {scheduler_job.py:936} INFO - 1 tasks up for execution:
   	<TaskInstance: example_complex.create_entry_group 2021-01-20 00:18:00+00:00 [scheduled]>
   [2021-01-20 00:18:08,800] {scheduler_job.py:970} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
   [2021-01-20 00:18:08,800] {scheduler_job.py:997} INFO - DAG example_complex has 0/16 running and queued tasks
   [2021-01-20 00:18:08,801] {scheduler_job.py:1058} INFO - Setting the following tasks to queued state:
   	<TaskInstance: example_complex.create_entry_group 2021-01-20 00:18:00+00:00 [scheduled]>
   [2021-01-20 00:18:08,809] {scheduler_job.py:1100} INFO - Sending TaskInstanceKey(dag_id='example_complex', task_id='create_entry_group', execution_date=datetime.datetime(2021, 1, 20, 0, 18, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 37 and queue default
   [2021-01-20 00:18:08,809] {base_executor.py:79} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_complex', 'create_entry_group', '2021-01-20T00:18:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/airflow/example_dags/example_complex.py']
   [2021-01-20 00:18:08,816] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', 'example_complex', 'create_entry_group', '2021-01-20T00:18:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/airflow/example_dags/example_complex.py']
   [2021-01-20 00:18:08,831] {scheduler_job.py:1401} INFO - Exiting scheduler loop as requested number of runs (1 - got to 1) has been reached
   [2021-01-20 00:18:08,832] {dag_processing.py:439} INFO - Sending termination message to manager.
   [2021-01-20 00:18:08,834] {scheduler_job.py:1282} INFO - Deactivating DAGs that haven't been touched since 2021-01-20T00:18:04.234269+00:00
   [2021-01-20 00:18:08,992] {dagbag.py:440} INFO - Filling up the DagBag from /opt/airflow/airflow/example_dags/example_complex.py
   [2021-01-20 00:18:11,716] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry. OpenTelemetry could not be imported; please add opentelemetry-api and opentelemetry-instrumentation packages in order to get BigQuery Tracing data.
   Running <TaskInstance: example_complex.create_entry_group 2021-01-20T00:18:00+00:00 [queued]> on host f375df8a2fd5
   [2021-01-20 00:18:14,325] {gcs_task_handler.py:182} INFO - Previous log discarded: 404 GET https://storage.googleapis.com/download/storage/v1/b/airflow-gcs-task-handler-tests-hiucqpbantyszrxv/o/path%2Fto%2Flogs%2Fexample_complex%2Fcreate_entry_group%2F2021-01-20T00%3A18%3A00%2B00%3A00%2F1.log?alt=media: No such object: airflow-gcs-task-handler-tests-hiucqpbantyszrxv/path/to/logs/example_complex/create_entry_group/2021-01-20T00:18:00+00:00/1.log: ('Request failed with status code', 404, 'Expected one of', <HTTPStatus.OK: 200>, <HTTPStatus.PARTIAL_CONTENT: 206>)
   [2021-01-20 00:18:14,735] {process_utils.py:95} INFO - Sending Signals.SIGTERM to GPID 31561
   [2021-01-20 00:18:14,736] {scheduler_job.py:1293} INFO - Exited execute loop
   [2021-01-20 00:18:15,245] {dagbag.py:440} INFO - Filling up the DagBag from /opt/airflow/airflow/example_dags/example_complex.py
   [2021-01-20 00:18:15,627] {logging_command_executor.py:33} INFO - Executing: 'gcloud config set core/project polidea-airflow'
   [2021-01-20 00:18:17,463] {logging_command_executor.py:44} INFO - Stdout:
   [2021-01-20 00:18:17,464] {logging_command_executor.py:45} INFO - Stderr: Updated property [core/project].
   
   [2021-01-20 00:18:17,464] {logging_command_executor.py:33} INFO - Executing: 'gcloud auth activate-service-account --key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'
   [2021-01-20 00:18:19,103] {logging_command_executor.py:44} INFO - Stdout:
   [2021-01-20 00:18:19,104] {logging_command_executor.py:45} INFO - Stderr: Activated service account credentials for: [gcp-storage-account@polidea-airflow.iam.gserviceaccount.com]
   
   [2021-01-20 00:18:21,811] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry. OpenTelemetry could not be imported; please add opentelemetry-api and opentelemetry-instrumentation packages in order to get BigQuery Tracing data.
   PASSED[2021-01-20 00:18:24,225] {logging_command_executor.py:33} INFO - Executing: 'gcloud config set core/project polidea-airflow'
   [2021-01-20 00:18:25,949] {logging_command_executor.py:44} INFO - Stdout:
   [2021-01-20 00:18:25,951] {logging_command_executor.py:45} INFO - Stderr: Updated property [core/project].
   
   [2021-01-20 00:18:25,952] {logging_command_executor.py:33} INFO - Executing: 'gcloud auth activate-service-account --key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'
   [2021-01-20 00:18:27,947] {logging_command_executor.py:44} INFO - Stdout:
   [2021-01-20 00:18:27,948] {logging_command_executor.py:45} INFO - Stderr: Activated service account credentials for: [gcp-storage-account@polidea-airflow.iam.gserviceaccount.com]
   
   [2021-01-20 00:18:27,948] {logging_command_executor.py:33} INFO - Executing: 'gsutil -m rm -r gs://airflow-gcs-task-handler-tests-hiucqpbantyszrxv'
   [2021-01-20 00:18:33,055] {logging_command_executor.py:44} INFO - Stdout:
   [2021-01-20 00:18:33,056] {logging_command_executor.py:45} INFO - Stderr: Removing gs://airflow-gcs-task-handler-tests-hiucqpbantyszrxv/path/to/logs/example_complex/create_entry_group/2021-01-20T00:18:00+00:00/1.log#1611101894698364...
   / [1/1 objects] 100% Done
   Operation completed over 1 objects.
   Removing gs://airflow-gcs-task-handler-tests-hiucqpbantyszrxv/...
   
   
   
   =============================================================================================================================== warnings summary ===============================================================================================================================
   tests/providers/google/cloud/log/test_gcs_task_handler_system.py::TestGCSTaskHandlerSystemTest::test_should_read_logs
     /usr/local/lib/python3.6/importlib/_bootstrap.py:219: RuntimeWarning: numpy.ufunc size changed, may indicate binary incompatibility. Expected 192 from C header, got 216 from PyObject
       return f(*args, **kwds)
   
   tests/providers/google/cloud/log/test_gcs_task_handler_system.py::TestGCSTaskHandlerSystemTest::test_should_read_logs
     /usr/local/lib/python3.6/site-packages/boto/plugin.py:40: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
       import imp
   
   -- Docs: https://docs.pytest.org/en/stable/warnings.html
   ======================================================================================================================== 1 passed, 2 warnings in 52.09s ========================================================================================================================
   root@f375df8a2fd5:/opt/airflow#
   ```


----------------------------------------------------------------



[GitHub] [airflow] potiuk commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-758530698


   The milestone is the target for the fix, so hopefully we will be able to investigate it, fix it, and release the fix in 2.0.1.


----------------------------------------------------------------



[GitHub] [airflow] SakuraSound commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
SakuraSound commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-758529471


   Is this ticket still considered invalid? I see it in the 2.0.1 milestone, but I am wondering if I should create a separate issue.


----------------------------------------------------------------



[GitHub] [airflow] boring-cyborg[bot] commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
boring-cyborg[bot] commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751751362


   Thanks for opening your first issue here! Be sure to follow the issue template!
   


----------------------------------------------------------------



[GitHub] [airflow] Overbryd edited a comment on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
Overbryd edited a comment on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751876986


   off topic:
   
   @potiuk I get it. Sorry if I put out some bad vibes here, no intent, I was just surprised.
   Also thanks for having a second look.


----------------------------------------------------------------



[GitHub] [airflow] Overbryd commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
Overbryd commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751876986


   off topic:
   
   @potiuk I get it. Sorry if I put out some bad vibes here, no intent, just startled.
   Also thanks for having a second look.


----------------------------------------------------------------



[GitHub] [airflow] SakuraSound commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
SakuraSound commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-763201945


   Is that necessary for the AWS setup as well?
   


----------------------------------------------------------------



[GitHub] [airflow] Overbryd commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
Overbryd commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751751840


   @boring-cyborg who put you in charge? I followed the issue template!


----------------------------------------------------------------



[GitHub] [airflow] Overbryd edited a comment on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
Overbryd edited a comment on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751862565


   @potiuk I still have the same problem. That `-` was just a typo in my issue; the problem persists. I have updated my issue accordingly.
   
   I do not have a permission problem; the logs do get written!
   I would appreciate some pointers on how to debug the problem instead of a blunt rejection of the problem at hand.
   
   Here is the log file written to the bucket:
   ![Screenshot 2020-12-28 at 9 55 07 PM](https://user-images.githubusercontent.com/21111/103242721-996d6d80-4957-11eb-9462-bb710d3e4b7d.png)
   
   Here is the same log output from Airflow, just scrambled:
   ![Screenshot 2020-12-28 at 9 55 19 PM](https://user-images.githubusercontent.com/21111/103242737-a7bb8980-4957-11eb-9c02-edd653a15eaf.png)
   
   Here is a screenshot showing the Airflow service account has "Storage Admin" on the same bucket:
   ![Screenshot 2020-12-28 at 9 56 02 PM](https://user-images.githubusercontent.com/21111/103242806-e05b6300-4957-11eb-9176-4d561718e578.png)
    
   Like I said, I am very willing to help and debug the problem, please just don't close the issue...


----------------------------------------------------------------



[GitHub] [airflow] mik-laj commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
mik-laj commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751870884


   > `AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID` 
   
   This parameter is not supported in Airflow 2.0 for this task handler. See: https://github.com/apache/airflow/blob/master/UPDATING.md#simplified-gcstaskhandler-configuration
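For anyone landing here: per the linked UPDATING.md note, the 2.0 `GCSTaskHandler` picks up credentials via Application Default Credentials rather than an Airflow connection, so a working setup would look roughly like the following (bucket name and key path are placeholders):

```shell
export AIRFLOW__LOGGING__REMOTE_LOGGING=True
export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=gs://your-bucket-name
# Credentials come from Application Default Credentials (e.g. a service
# account key file), not from AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID:
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
```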


----------------------------------------------------------------



[GitHub] [airflow] mik-laj closed issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
mik-laj closed issue #13343:
URL: https://github.com/apache/airflow/issues/13343


   


----------------------------------------------------------------



[GitHub] [airflow] Overbryd edited a comment on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
Overbryd edited a comment on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751862565


   @potiuk I still have the same problem. That `-` was just a typo in my issue; the problem persists. I have updated my issue accordingly.
   
   I do not have a permission problem; the logs do get written!
   I would appreciate some pointers on how to debug the problem instead of a blunt rejection of the problem at hand.
   
   Here is the log file written to the bucket:
   ![Screenshot 2020-12-28 at 9 55 07 PM](https://user-images.githubusercontent.com/21111/103242721-996d6d80-4957-11eb-9462-bb710d3e4b7d.png)
   
   Here is the same log output from Airflow, just scrambled:
   ![Screenshot 2020-12-28 at 9 55 19 PM](https://user-images.githubusercontent.com/21111/103242737-a7bb8980-4957-11eb-9c02-edd653a15eaf.png)
   
   Here is a screenshot showing the Airflow service account has "Storage Admin" on the same bucket:
   ![Screenshot 2020-12-28 at 9 56 02 PM](https://user-images.githubusercontent.com/21111/103242806-e05b6300-4957-11eb-9176-4d561718e578.png)
   
   What else do I have to do to get some attention? Like I said, I am very willing to help and debug the problem. However, I have no interest in helping out a project that does not take things seriously.


----------------------------------------------------------------



[GitHub] [airflow] potiuk commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751826162


   Are you sure your Airflow user is configured with the right read permissions on the bucket? I think this 404 error would also be returned when there is no permission to read the logs.


----------------------------------------------------------------



[GitHub] [airflow] potiuk closed issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
potiuk closed issue #13343:
URL: https://github.com/apache/airflow/issues/13343


   





[GitHub] [airflow] kaxil commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
kaxil commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-763127476


   Have you installed the google provider, @Overbryd @SakuraSound?
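   For GCS remote logging on Airflow 2.0, the Google provider package has to be present in the webserver/worker environment. A minimal install-and-verify sketch (package name as published on PyPI):
   
   ```shell
   # Install the Google provider (required for GCSTaskHandler in Airflow 2.0).
   pip install apache-airflow-providers-google
   
   # Confirm the provider is importable in the same environment
   # that runs the webserver/workers.
   python -c "import airflow.providers.google; print('google provider OK')"
   ```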





[GitHub] [airflow] potiuk commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751826775


   ```
   export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER-gs://your-bucket-name"
   ```
   
   I believe you have a typo here, and this might be the reason:
   
   ```
   export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=gs://your-bucket-name
   ```
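   The broken line is actually two problems at once: a `-` instead of `=` (so no assignment happens at all) and an unbalanced trailing quote. A corrected, quoted form that can be sanity-checked with `echo` (bucket name is a placeholder):
   
   ```shell
   # '=' between name and value; quotes balanced.
   export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER="gs://your-bucket-name"
   
   # Verify the variable is actually set to what you expect.
   echo "$AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER"
   ```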





[GitHub] [airflow] SakuraSound commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
SakuraSound commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-763229877


   Yes, we do have this installed.





[GitHub] [airflow] mik-laj commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
mik-laj commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751871335


   I can confirm that this is a problem. The system tests for this integration fail for me as well.
   ```
   root@ff47e5ab2751:/opt/airflow# pytest tests/providers/google/cloud/log/test_gcs_task_handler_system.py --system google -s
   ============================================================================================================================= test session starts ==============================================================================================================================
   platform linux -- Python 3.6.12, pytest-6.1.2, py-1.9.0, pluggy-0.13.1 -- /usr/local/bin/python
   cachedir: .pytest_cache
   rootdir: /opt/airflow, configfile: pytest.ini
   plugins: forked-1.3.0, timeouts-1.2.1, cov-2.10.1, xdist-2.1.0, instafail-0.4.2, flaky-3.7.0, requests-mock-1.8.0, rerunfailures-9.1.1, celery-4.4.7
   setup timeout: 0.0s, execution timeout: 0.0s, teardown timeout: 0.0s
   collected 1 item
   
   tests/providers/google/cloud/log/test_gcs_task_handler_system.py::TestGCSTaskHandlerSystemTest::test_should_read_logs ========================= AIRFLOW ==========================
   Home of the user: /root
   Airflow home /root/airflow
   Skipping initializing of the DB as it was initialized already.
   You can re-initialize the database by adding --with-db-init flag when running tests.
   [2020-12-28 21:30:47,952] {logging_command_executor.py:33} INFO - Executing: 'gcloud config set core/project polidea-airflow'
   [2020-12-28 21:30:48,885] {logging_command_executor.py:44} INFO - Stdout:
   [2020-12-28 21:30:48,887] {logging_command_executor.py:45} INFO - Stderr: Updated property [core/project].
   
   [2020-12-28 21:30:48,890] {logging_command_executor.py:33} INFO - Executing: 'gcloud auth activate-service-account --key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'
   [2020-12-28 21:30:50,129] {logging_command_executor.py:44} INFO - Stdout:
   [2020-12-28 21:30:50,131] {logging_command_executor.py:45} INFO - Stderr: Activated service account credentials for: [gcp-storage-account@polidea-airflow.iam.gserviceaccount.com]
   
   [2020-12-28 21:30:50,135] {logging_command_executor.py:33} INFO - Executing: 'gsutil mb gs://airflow-gcs-task-handler-tests-mxdcsqvutbgzhkpi'
   [2020-12-28 21:30:53,752] {logging_command_executor.py:44} INFO - Stdout:
   [2020-12-28 21:30:53,752] {logging_command_executor.py:45} INFO - Stderr: Creating gs://airflow-gcs-task-handler-tests-mxdcsqvutbgzhkpi/...
   
   [2020-12-28 21:30:57,743] {__init__.py:38} INFO - Loaded API auth backend: <module 'airflow.api.auth.backend.default' from '/opt/airflow/airflow/api/auth/backend/default.py'>
   [2020-12-28 21:31:04,250] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry. OpenTelemetry could not be imported; please add opentelemetry-api and opentelemetry-instrumentation packages in order to get BigQuery Tracing data.
   [2020-12-28 21:31:07,748] {plugins_manager.py:286} INFO - Loading 4 plugin(s) took 9.98 seconds
   Created <DagRun example_complex @ 2020-12-28 21:31:07+00:00: manual__2020-12-28T21:31:07+00:00, externally triggered: True>
     ____________       _____________
    ____    |__( )_________  __/__  /________      __
   ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
   ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
    _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
   [2020-12-28 21:31:12,040] {scheduler_job.py:1242} INFO - Starting the scheduler
   [2020-12-28 21:31:12,040] {scheduler_job.py:1247} INFO - Processing each file at most -1 times
   [2020-12-28 21:31:12,373] {dag_processing.py:250} INFO - Launched DagFileProcessorManager with pid: 13250
   [2020-12-28 21:31:12,375] {scheduler_job.py:1752} INFO - Resetting orphaned tasks for active dag runs
   [2020-12-28 21:31:12,410] {settings.py:52} INFO - Configured default timezone Timezone('UTC')
   [2020-12-28 21:31:12,542] {scheduler_job.py:1405} INFO - Exiting scheduler loop as requested number of runs (1 - got to 1) has been reached
   [2020-12-28 21:31:12,542] {dag_processing.py:439} INFO - Sending termination message to manager.
   [2020-12-28 21:31:12,544] {scheduler_job.py:1286} INFO - Deactivating DAGs that haven't been touched since 2020-12-28T21:31:12.375019+00:00
   [2020-12-28 21:31:12,912] {process_utils.py:95} INFO - Sending Signals.SIGTERM to GPID 13250
   [2020-12-28 21:31:12,913] {scheduler_job.py:1297} INFO - Exited execute loop
   [2020-12-28 21:31:13,215] {dagbag.py:440} INFO - Filling up the DagBag from /opt/airflow/airflow/example_dags/example_complex.py
   [2020-12-28 21:31:13,911] {logging_command_executor.py:33} INFO - Executing: 'gcloud config set core/project polidea-airflow'
   [2020-12-28 21:31:15,274] {logging_command_executor.py:44} INFO - Stdout:
   [2020-12-28 21:31:15,276] {logging_command_executor.py:45} INFO - Stderr: Updated property [core/project].
   
   [2020-12-28 21:31:15,277] {logging_command_executor.py:33} INFO - Executing: 'gcloud auth activate-service-account --key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'
   [2020-12-28 21:31:16,906] {logging_command_executor.py:44} INFO - Stdout:
   [2020-12-28 21:31:16,908] {logging_command_executor.py:45} INFO - Stderr: Activated service account credentials for: [gcp-storage-account@polidea-airflow.iam.gserviceaccount.com]
   
   FAILED[2020-12-28 21:31:17,320] {logging_command_executor.py:33} INFO - Executing: 'gcloud config set core/project polidea-airflow'
   [2020-12-28 21:31:18,432] {logging_command_executor.py:44} INFO - Stdout:
   [2020-12-28 21:31:18,433] {logging_command_executor.py:45} INFO - Stderr: Updated property [core/project].
   
   [2020-12-28 21:31:18,435] {logging_command_executor.py:33} INFO - Executing: 'gcloud auth activate-service-account --key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'
   [2020-12-28 21:31:20,133] {logging_command_executor.py:44} INFO - Stdout:
   [2020-12-28 21:31:20,135] {logging_command_executor.py:45} INFO - Stderr: Activated service account credentials for: [gcp-storage-account@polidea-airflow.iam.gserviceaccount.com]
   
   [2020-12-28 21:31:20,138] {logging_command_executor.py:33} INFO - Executing: 'gsutil -m rm -r gs://airflow-gcs-task-handler-tests-mxdcsqvutbgzhkpi'
   [2020-12-28 21:31:25,299] {logging_command_executor.py:44} INFO - Stdout:
   [2020-12-28 21:31:25,300] {logging_command_executor.py:45} INFO - Stderr: Removing gs://airflow-gcs-task-handler-tests-mxdcsqvutbgzhkpi/...
   
   
   
   =================================================================================================================================== FAILURES ===================================================================================================================================
   ______________________________________________________________________________________________________________ TestGCSTaskHandlerSystemTest.test_should_read_logs ______________________________________________________________________________________________________________
   
   self = <tests.providers.google.cloud.log.test_gcs_task_handler_system.TestGCSTaskHandlerSystemTest testMethod=test_should_read_logs>, session = <sqlalchemy.orm.session.Session object at 0x7fc942f0bcc0>
   
       @provide_session
       def test_should_read_logs(self, session):
           with mock.patch.dict(
               'os.environ',
               AIRFLOW__LOGGING__REMOTE_LOGGING="true",
               AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=f"gs://{self.bucket_name}/path/to/logs",
               AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID="google_cloud_default",
               AIRFLOW__CORE__LOAD_EXAMPLES="false",
               AIRFLOW__CORE__DAGS_FOLDER=example_complex.__file__,
               GOOGLE_APPLICATION_CREDENTIALS=resolve_full_gcp_key_path(GCP_GCS_KEY),
           ):
               self.assertEqual(0, subprocess.Popen(["airflow", "dags", "trigger", "example_complex"]).wait())
               self.assertEqual(0, subprocess.Popen(["airflow", "scheduler", "--num-runs", "1"]).wait())
   
           ti = session.query(TaskInstance).filter(TaskInstance.task_id == "create_entry_group").first()
           dag = DagBag(dag_folder=example_complex.__file__).dags['example_complex']
           task = dag.task_dict["create_entry_group"]
           ti.task = task
   >       self.assert_remote_logs("INFO - Task exited with return code 0", ti)
   
   tests/providers/google/cloud/log/test_gcs_task_handler_system.py:82:
   _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
   tests/providers/google/cloud/log/test_gcs_task_handler_system.py:99: in assert_remote_logs
       self.assertIn(expected_message, logs)
   E   AssertionError: 'INFO - Task exited with return code 0' not found in ''
   ------------------------------------------------------------------------------------------------------------------------------ Captured log setup ------------------------------------------------------------------------------------------------------------------------------
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:33 Executing: 'gcloud config set core/project polidea-airflow'
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:44 Stdout:
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:45 Stderr: Updated property [core/project].
   
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:33 Executing: 'gcloud auth activate-service-account --key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:44 Stdout:
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:45 Stderr: Activated service account credentials for: [gcp-storage-account@polidea-airflow.iam.gserviceaccount.com]
   
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:33 Executing: 'gsutil mb gs://airflow-gcs-task-handler-tests-mxdcsqvutbgzhkpi'
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:44 Stdout:
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:45 Stderr: Creating gs://airflow-gcs-task-handler-tests-mxdcsqvutbgzhkpi/...
   
   ------------------------------------------------------------------------------------------------------------------------------ Captured log call -------------------------------------------------------------------------------------------------------------------------------
   INFO     airflow.models.dagbag.DagBag:dagbag.py:440 Filling up the DagBag from /opt/airflow/airflow/example_dags/example_complex.py
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:33 Executing: 'gcloud config set core/project polidea-airflow'
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:44 Stdout:
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:45 Stderr: Updated property [core/project].
   
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:33 Executing: 'gcloud auth activate-service-account --key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:44 Stdout:
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:45 Stderr: Activated service account credentials for: [gcp-storage-account@polidea-airflow.iam.gserviceaccount.com]
   
   ---------------------------------------------------------------------------------------------------------------------------- Captured log teardown -----------------------------------------------------------------------------------------------------------------------------
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:33 Executing: 'gcloud config set core/project polidea-airflow'
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:44 Stdout:
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:45 Stderr: Updated property [core/project].
   
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:33 Executing: 'gcloud auth activate-service-account --key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:44 Stdout:
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:45 Stderr: Activated service account credentials for: [gcp-storage-account@polidea-airflow.iam.gserviceaccount.com]
   
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:33 Executing: 'gsutil -m rm -r gs://airflow-gcs-task-handler-tests-mxdcsqvutbgzhkpi'
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:44 Stdout:
   INFO     tests.test_utils.logging_command_executor.LoggingCommandExecutor:logging_command_executor.py:45 Stderr: Removing gs://airflow-gcs-task-handler-tests-mxdcsqvutbgzhkpi/...
   
   =========================================================================================================================== short test summary info ============================================================================================================================
   FAILED tests/providers/google/cloud/log/test_gcs_task_handler_system.py::TestGCSTaskHandlerSystemTest::test_should_read_logs - AssertionError: 'INFO - Task exited with return code 0' not found in ''
   ============================================================================================================================== 1 failed in 39.10s ==============================================================================================================================
   ```





[GitHub] [airflow] potiuk commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751873572


   @Overbryd - I simply asked for extra information and explained that we will reopen the issue when you come back with an answer about the suspected cause (which turned out to be a copy-and-paste mistake on your side).
   
   As you can see from @mik-laj's comment, we do take this seriously, and we are ready to re-open issues (exactly as I explained) when more information is provided :).
   
   Just try to exercise a little empathy - we have > 700 open issues and only a few people, mostly working on them in their free time, while I guess you have only one Airflow issue to deal with. Closing an issue when not enough information is given and we suspect a problem in your configuration is a very sensible thing to do.





[GitHub] [airflow] astleychen commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
astleychen commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-753781450


   You can refer to #13115 for the one-line logs issue on GCS logging.





[GitHub] [airflow] kaxil commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
kaxil commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-763203240


   > Is that necessary for the AWS setup as well?
   
   Yup - https://pypi.org/project/apache-airflow-providers-amazon/
   
   ```
   pip install apache-airflow-providers-amazon
   ```
   
   or
   
   ```
   pip install -U 'apache-airflow[amazon]'
   ```
   
   





[GitHub] [airflow] kaxil edited a comment on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
kaxil edited a comment on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-763203240


   > Is that necessary for the AWS setup as well?
   
   Yup - https://pypi.org/project/apache-airflow-providers-amazon/
   
   ```
   pip install apache-airflow-providers-amazon
   ```
   
   or
   
   ```
   pip install -U 'apache-airflow[aws]'
   ```
   
   





[GitHub] [airflow] potiuk commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751827913


   Closing as invalid unless you can provide answers / check the configuration. Please let us know if the configuration/permission changes do not solve the problem. It would also be great to get more logs from the Airflow webserver if you still see the issue.





[GitHub] [airflow] Overbryd commented on issue #13343: Remote logging broken (Airflow 2.0.0 on GCS)

Posted by GitBox <gi...@apache.org>.
Overbryd commented on issue #13343:
URL: https://github.com/apache/airflow/issues/13343#issuecomment-751872376


   > > `AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID`
   > 
   > This parameter is not supported in Airflow 2.0 for this task handler. See: https://github.com/apache/airflow/blob/master/UPDATING.md#simplified-gcstaskhandler-configuration
   
   Ok, I will remove `AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID` as I am using application default credentials anyway.
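   With the simplified Airflow 2.0 GCSTaskHandler configuration, the connection id can be dropped entirely and credentials come from Application Default Credentials. A minimal config fragment might look like this (the key file path is a placeholder; `GOOGLE_APPLICATION_CREDENTIALS` is only needed if ADC is not already available, e.g. via GKE Workload Identity):
   
   ```shell
   export AIRFLOW__LOGGING__REMOTE_LOGGING=True
   export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER="gs://your-bucket-name"
   # No AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID needed; credentials are taken
   # from Application Default Credentials, e.g. a mounted service account key:
   export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
   ```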

