Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/02/17 07:15:55 UTC

[GitHub] [airflow] damon09273 opened a new issue #14267: GoogleCloudStorageToBigQueryOperator generate duplicate data when network issue happen

damon09273 opened a new issue #14267:
URL: https://github.com/apache/airflow/issues/14267


   
   **Apache Airflow version**:
   1.10.10
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   v1.16.15-gke.6000
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: GCP
   - **OS** (e.g. from /etc/os-release): Debian GNU/Linux 10 (buster)
   - **Kernel** (e.g. `uname -a`): 4.19.112+
   - **Install tools**:
   - **Others**:
   
   **What happened**:
   I get duplicated data when running a task with `GoogleCloudStorageToBigQueryOperator`.
   The task log shows the error "Connection reset by peer" while the load job is running; the task fails and the retry succeeds, but the problem is that **the BigQuery job that was already submitted is not cancelled, so the original job and the retry job both append the same data to the destination table.**
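
   For context, this is roughly how the task is defined (bucket, source objects, and destination table below are placeholders; only the dag_id and task_id match the log further down). With `write_disposition='WRITE_APPEND'`, the retried task submits a second load job while the first one keeps running on the BigQuery side, so both loads append the same rows:

   ```python
   from datetime import datetime

   from airflow import DAG
   from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator

   with DAG(
       dag_id='core_data_postgres_hourly_v6',   # dag_id as it appears in the log below
       start_date=datetime(2021, 1, 1),
       schedule_interval='@hourly',
       catchup=False,
   ) as dag:
       load_to_bq = GoogleCloudStorageToBigQueryOperator(
           task_id='truncate_or_append_dcard_post_collections_with_new_records',
           bucket='example-bucket',                              # placeholder
           source_objects=['exports/post_collections/*.json'],   # placeholder
           destination_project_dataset_table='example_project.example_dataset.post_collections',  # placeholder
           source_format='NEWLINE_DELIMITED_JSON',               # placeholder
           write_disposition='WRITE_APPEND',                     # appending is what turns a retry into duplicate rows
           retries=2,
       )
   ```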
   
   **What you expected to happen**:
   The already-running BigQuery job should be cancelled when the task fails, so that the retry does not load the same data a second time.
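   To illustrate, cancelling would mean something like the following (a sketch using the plain google-cloud-bigquery client rather than the Airflow hook; the project name is a placeholder, the job id and location are taken from the log below):

   ```python
   from google.cloud import bigquery

   client = bigquery.Client(project='example-project')  # placeholder project
   # Cancel the load job that the failed task attempt had already submitted,
   # so the retry does not append the same data a second time.
   client.cancel_job('job_CmfL_czbPNebWFysuoCGYz6I6b31', location='US')
   ```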
   
   
   **How to reproduce it**: Hard to reproduce on demand; the network issue only happens occasionally.
   
   
   **Anything else we need to know**:
   The problem occurs about once per week.
   
   ```
   [2021-02-16 00:46:58,584] {{discovery.py:867}} INFO - URL being requested: GET https://bigquery.googleapis.com/bigquery/v2/projects/dcard-production/jobs/job_CmfL_czbPNebWFysuoCGYz6I6b31?location=US&alt=json
   [2021-02-16 00:46:58,587] {{taskinstance.py:1150}} ERROR - [Errno 104] Connection reset by peer
   Traceback (most recent call last):
     File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
       result = task_copy.execute(context=context)
     File "/usr/local/lib/python3.7/site-packages/airflow/contrib/operators/gcs_to_bq.py", line 288, in execute
       encryption_configuration=self.encryption_configuration)
     File "/usr/local/lib/python3.7/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1302, in run_load
       return self.run_with_configuration(configuration)
     File "/usr/local/lib/python3.7/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1338, in run_with_configuration
       location=location).execute(num_retries=self.num_retries)
     File "/usr/local/lib/python3.7/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
       return wrapped(*args, **kwargs)
     File "/usr/local/lib/python3.7/site-packages/googleapiclient/http.py", line 851, in execute
       method=str(self.method), body=self.body, headers=self.headers)
     File "/usr/local/lib/python3.7/site-packages/googleapiclient/http.py", line 165, in _retry_request
       resp, content = http.request(uri, method, *args, **kwargs)
     File "/usr/local/lib/python3.7/site-packages/google_auth_httplib2.py", line 198, in request
       uri, method, body=body, headers=request_headers, **kwargs)
     File "/usr/local/lib/python3.7/site-packages/googleapiclient/http.py", line 1729, in new_request
       redirections=redirections, connection_type=connection_type)
     File "/usr/local/lib/python3.7/site-packages/httplib2/__init__.py", line 1957, in request
       cachekey,
     File "/usr/local/lib/python3.7/site-packages/httplib2/__init__.py", line 1622, in _request
       conn, request_uri, method, body, headers
     File "/usr/local/lib/python3.7/site-packages/httplib2/__init__.py", line 1560, in _conn_request
       response = conn.getresponse()
     File "/usr/local/lib/python3.7/http/client.py", line 1369, in getresponse
       response.begin()
     File "/usr/local/lib/python3.7/http/client.py", line 310, in begin
       version, status, reason = self._read_status()
     File "/usr/local/lib/python3.7/http/client.py", line 271, in _read_status
       line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
     File "/usr/local/lib/python3.7/socket.py", line 589, in readinto
       return self._sock.recv_into(b)
     File "/usr/local/lib/python3.7/ssl.py", line 1071, in recv_into
       return self.read(nbytes, buffer)
     File "/usr/local/lib/python3.7/ssl.py", line 929, in read
       return self._sslobj.read(len, buffer)
   ConnectionResetError: [Errno 104] Connection reset by peer
   [2021-02-16 00:46:58,595] {{taskinstance.py:1194}} INFO - Marking task as UP_FOR_RETRY. dag_id=core_data_postgres_hourly_v6, task_id=truncate_or_append_dcard_post_collections_with_new_records, execution_date=20210215T230000, start_date=20210216T004651, end_date=20210216T004658
   [2021-02-16 00:47:01,011] {{local_task_job.py:102}} INFO - Task exited with return code 1
   ```
   





[GitHub] [airflow] boring-cyborg[bot] commented on issue #14267: GoogleCloudStorageToBigQueryOperator generate duplicate data when network issue happen

Posted by GitBox <gi...@apache.org>.
boring-cyborg[bot] commented on issue #14267:
URL: https://github.com/apache/airflow/issues/14267#issuecomment-780357996


   Thanks for opening your first issue here! Be sure to follow the issue template!
   





[GitHub] [airflow] eladkal commented on issue #14267: GoogleCloudStorageToBigQueryOperator generate duplicate data when network issue happen

Posted by GitBox <gi...@apache.org>.
eladkal commented on issue #14267:
URL: https://github.com/apache/airflow/issues/14267#issuecomment-924708275


   You are using an outdated version of the operator.
   Could you please test against the latest version of the Google provider?
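   For reference, in a current apache-airflow-providers-google release the operator lives under the providers package; roughly (all arguments below are placeholders):

   ```python
   from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

   load_to_bq = GCSToBigQueryOperator(
       task_id='gcs_to_bq_example',
       bucket='example-bucket',
       source_objects=['exports/post_collections/*.json'],
       destination_project_dataset_table='example_project.example_dataset.post_collections',
       source_format='NEWLINE_DELIMITED_JSON',
       write_disposition='WRITE_APPEND',
   )
   ```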





[GitHub] [airflow] damon09273 closed issue #14267: GoogleCloudStorageToBigQueryOperator generate duplicate data when network issue happen

Posted by GitBox <gi...@apache.org>.
damon09273 closed issue #14267:
URL: https://github.com/apache/airflow/issues/14267


   





[GitHub] [airflow] damon09273 commented on issue #14267: GoogleCloudStorageToBigQueryOperator generate duplicate data when network issue happen

Posted by GitBox <gi...@apache.org>.
damon09273 commented on issue #14267:
URL: https://github.com/apache/airflow/issues/14267#issuecomment-935281219


   @eladkal The problem was solved after upgrading to Airflow 2.0+ and the latest GCP provider packages.
   Thanks for the response.

