Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/02/24 06:51:46 UTC
[GitHub] [airflow] rajaths010494 opened a new issue #21783: BigQueryGetDataOperator: fails when the table has Date field
rajaths010494 opened a new issue #21783:
URL: https://github.com/apache/airflow/issues/21783
### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
apache-airflow-providers-google==6.3.0
### Apache Airflow version
2.2.3
### Operating System
any
### Deployment
Docker-Compose
### Deployment details
_No response_
### What happened
The operator `airflow.providers.google.cloud.operators.bigquery.BigQueryGetDataOperator` fails with the following error when the table being fetched contains a `DATE` column.
```
2022-02-24, 06:16:45 UTC] {warnings.py:109} WARNING - /usr/local/lib/python3.9/site-packages/***/providers/google/cloud/operators/bigquery.py:475: DeprecationWarning: The bigquery_conn_id parameter has been deprecated. You should pass the gcp_conn_id parameter.
hook = BigQueryHook(
[2022-02-24, 06:16:47 UTC] {bigquery.py:489} INFO - Total extracted rows: 10
[2022-02-24, 06:16:47 UTC] {xcom.py:333} ERROR - Could not serialize the XCom value into JSON. If you are using pickle instead of JSON for XCom, then you need to enable pickle support for XCom in your *** config.
[2022-02-24, 06:16:47 UTC] {taskinstance.py:1700} ERROR - Task failed with exception
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1329, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1455, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1514, in _execute_task
self.xcom_push(key=XCOM_RETURN_KEY, value=result)
File "/usr/local/lib/python3.9/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 2135, in xcom_push
XCom.set(
File "/usr/local/lib/python3.9/site-packages/airflow/utils/session.py", line 67, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/models/xcom.py", line 100, in set
value = XCom.serialize_value(value)
File "/usr/local/lib/python3.9/site-packages/airflow/models/xcom.py", line 331, in serialize_value
return json.dumps(value).encode('UTF-8')
File "/usr/local/lib/python3.9/json/__init__.py", line 231, in dumps
return _default_encoder.encode(obj)
File "/usr/local/lib/python3.9/json/encoder.py", line 199, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/usr/local/lib/python3.9/json/encoder.py", line 257, in iterencode
return _iterencode(o, 0)
File "/usr/local/lib/python3.9/json/encoder.py", line 179, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type date is not JSON serializable
[2022-02-24, 06:16:47 UTC] {taskinstance.py:1267} INFO - Marking task as FAILED. dag_id=example_async_bigquery_queries, task_id=get_data, execution_date=20220224T061606, start_date=20220224T061644, end_date=20220224T061647
[2022-02-24, 06:16:47 UTC] {standard_task_runner.py:89} ERROR - Failed to execute job 86 for task get_data
```
The rows are fetched successfully ("Total extracted rows: 10"), but the task fails when pushing the result to XCom because the value cannot be serialized to JSON.
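The last traceback frame shows the root cause: the default XCom backend serializes return values with `json.dumps`, which has no encoder for `datetime.date`. A minimal sketch reproducing the failure outside Airflow (the row values here are hypothetical stand-ins for what the operator returns):

```python
import datetime
import json

# A row roughly as BigQueryGetDataOperator would return it: a list of cell values,
# where a BigQuery DATE column comes back as a Python datetime.date.
row = [datetime.date(2022, 2, 24), "some_value"]

try:
    json.dumps(row)  # mirrors XCom.serialize_value in airflow/models/xcom.py
except TypeError as exc:
    print(exc)  # Object of type date is not JSON serializable
```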
### What you expected to happen
Expected the operator to return all records and push them to XCom successfully.
### How to reproduce
Create a table with a `DATE` column and fetch its records using `BigQueryGetDataOperator`.
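Until the operator (or a custom XCom backend using e.g. `json.dumps(value, default=str)`) handles this, one possible workaround is to convert date and datetime cells to ISO-8601 strings before the rows reach XCom. A sketch, assuming rows arrive as lists of cell values; the helper name `serialize_rows` is hypothetical:

```python
import datetime
import json

def serialize_rows(rows):
    """Convert date/datetime cells to ISO-8601 strings so rows are JSON-safe."""
    def convert(cell):
        # datetime.datetime is a subclass of datetime.date, so one check covers both
        if isinstance(cell, datetime.date):
            return cell.isoformat()
        return cell
    return [[convert(cell) for cell in row] for row in rows]

rows = [[datetime.date(2022, 2, 24), "a"], [datetime.date(2022, 2, 25), "b"]]
print(json.dumps(serialize_rows(rows)))
# [["2022-02-24", "a"], ["2022-02-25", "b"]]
```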
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
[GitHub] [airflow] boring-cyborg[bot] commented on issue #21783: BigQueryGetDataOperator: fails when the table has Date field
Posted by GitBox <gi...@apache.org>.
boring-cyborg[bot] commented on issue #21783:
URL: https://github.com/apache/airflow/issues/21783#issuecomment-1049547312
Thanks for opening your first issue here! Be sure to follow the issue template!