Posted to issues@beam.apache.org by "Beam JIRA Bot (Jira)" <ji...@apache.org> on 2022/03/20 16:59:00 UTC

[jira] [Commented] (BEAM-13959) Unable to write to BigQuery tables with column named 'f'

    [ https://issues.apache.org/jira/browse/BEAM-13959?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17509473#comment-17509473 ] 

Beam JIRA Bot commented on BEAM-13959:
--------------------------------------

This issue is assigned but has not received an update in 30 days so it has been labeled "stale-assigned". If you are still working on the issue, please give an update and remove the label. If you are no longer working on the issue, please unassign so someone else may work on it. In 7 days the issue will be automatically unassigned.

> Unable to write to BigQuery tables with column named 'f'
> --------------------------------------------------------
>
>                 Key: BEAM-13959
>                 URL: https://issues.apache.org/jira/browse/BEAM-13959
>             Project: Beam
>          Issue Type: Bug
>          Components: io-java-gcp
>    Affects Versions: 2.36.0
>            Reporter: Joel Weierman
>            Assignee: Reuven Lax
>            Priority: P1
>              Labels: stale-assigned
>          Time Spent: 2.5h
>  Remaining Estimate: 0h
>
> When using the BigQuery Storage Write API through the Java Beam SDK (both the latest release 2.35.0 and 2.36.0-SNAPSHOT), there appears to be a problem converting TableRow fields named 'f' into Storage API protos.
> Reproduction steps: write rows against the table schema fragment in [1] (a sample payload is shown in [2]); the field named 'f' cannot be written to BigQuery, and the error quoted below is thrown instead.
> [1]
> {
>   "name": "item3",
>   "type": "RECORD",
>   "mode": "NULLABLE",
>   "fields": [
>     {
>       "name": "data",
>       "mode": "NULLABLE",
>       "type": "RECORD",
>       "fields": [
>         { "mode": "NULLABLE", "name": "a", "type": "FLOAT" },
>         { "mode": "NULLABLE", "name": "b", "type": "FLOAT" },
>         { "mode": "NULLABLE", "name": "c", "type": "FLOAT" },
>         { "mode": "NULLABLE", "name": "d", "type": "FLOAT" },
>         { "mode": "NULLABLE", "name": "e", "type": "FLOAT" },
>         { "mode": "NULLABLE", "name": "f", "type": "FLOAT" }
>       ]
>     }
>   ]
> }
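> For reference, a roughly equivalent Java construction of this schema fragment (illustrative only; it covers just the 'item3' branch shown above and is not the actual pipeline code):
>     import com.google.api.services.bigquery.model.TableFieldSchema;
>     import com.google.api.services.bigquery.model.TableSchema;
>     import java.util.Arrays;
>
>     // Nested RECORD "data" with the six FLOAT leaves, including the problematic "f".
>     TableFieldSchema dataField =
>         new TableFieldSchema().setName("data").setType("RECORD").setMode("NULLABLE")
>             .setFields(Arrays.asList(
>                 new TableFieldSchema().setName("a").setType("FLOAT").setMode("NULLABLE"),
>                 new TableFieldSchema().setName("b").setType("FLOAT").setMode("NULLABLE"),
>                 new TableFieldSchema().setName("c").setType("FLOAT").setMode("NULLABLE"),
>                 new TableFieldSchema().setName("d").setType("FLOAT").setMode("NULLABLE"),
>                 new TableFieldSchema().setName("e").setType("FLOAT").setMode("NULLABLE"),
>                 new TableFieldSchema().setName("f").setType("FLOAT").setMode("NULLABLE")));
>     // Partial table schema containing only the fragment from [1].
>     TableSchema schema = new TableSchema().setFields(Arrays.asList(
>         new TableFieldSchema().setName("item3").setType("RECORD").setMode("NULLABLE")
>             .setFields(Arrays.asList(dataField))));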
> [2]
> {
>   ...
>   "item3": {
>     "data": {
>       "a": 1.627424812511E12,
>       "b": 3.0,
>       "c": 3.0,
>       "d": 530.0,
>       "e": 675.0
>     }
>   },
>   ...
> }
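> The write itself uses the Storage Write API method, along the lines of the following sketch (illustrative only; the table reference, stream count, and triggering frequency are placeholders, "rows" is assumed to be a PCollection<TableRow> shaped like [2], and "schema" is the TableSchema from [1]):
>     import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
>     import org.joda.time.Duration;
>
>     rows.apply("WriteToBigQuery",
>         BigQueryIO.writeTableRows()
>             .to("my-project:my_dataset.my_table")                   // placeholder table
>             .withSchema(schema)                                     // schema from [1]
>             .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API)  // leaving this as
>                                                                     // STREAMING_INSERTS avoids the error
>             .withTriggeringFrequency(Duration.standardSeconds(5))
>             .withNumStorageWriteApiStreams(1)
>             .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
>             .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));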
> The following error occurs:
> Exception in thread "main" org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.IllegalArgumentException: Can not set java.util.List field com.google.api.services.bigquery.model.TableRow.f to java.lang.Double
>     at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:373)
>     at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:341)
>     at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:218)
>     at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:67)
>     at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
>     at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
>     at com.google.cloud.teleport.templates.PubSubToBigQuery.run(PubSubToBigQuery.java:342)
>     at com.google.cloud.teleport.templates.PubSubToBigQuery.main(PubSubToBigQuery.java:223)
> Caused by: java.lang.IllegalArgumentException: Can not set java.util.List field com.google.api.services.bigquery.model.TableRow.f to java.lang.Double
>     at sun.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(UnsafeFieldAccessorImpl.java:167)
>     at sun.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(UnsafeFieldAccessorImpl.java:171)
>     at sun.reflect.UnsafeObjectFieldAccessorImpl.set(UnsafeObjectFieldAccessorImpl.java:81)
>     at java.lang.reflect.Field.set(Field.java:764)
>     at com.google.api.client.util.FieldInfo.setFieldValue(FieldInfo.java:275)
>     at com.google.api.client.util.FieldInfo.setValue(FieldInfo.java:231)
>     at com.google.api.client.util.GenericData.set(GenericData.java:118)
>     at com.google.api.client.json.GenericJson.set(GenericJson.java:91)
>     at com.google.api.services.bigquery.model.TableRow.set(TableRow.java:64)
>     at com.google.api.services.bigquery.model.TableRow.set(TableRow.java:29)
>     at com.google.api.client.util.GenericData.putAll(GenericData.java:131)
>     at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.toProtoValue(TableRowToStorageApiProto.java:206)
>     at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageValueFromFieldValue(TableRowToStorageApiProto.java:175)
>     at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageFromTableRow(TableRowToStorageApiProto.java:103)
>     at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.toProtoValue(TableRowToStorageApiProto.java:207)
>     at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageValueFromFieldValue(TableRowToStorageApiProto.java:175)
>     at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageFromTableRow(TableRowToStorageApiProto.java:103)
>     at org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow$1.toMessage(StorageApiDynamicDestinationsTableRow.java:95)
>     at org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages$ConvertMessagesDoFn.processElement(StorageApiConvertMessages.java:106)
> This error does not occur if the write method is left as Streaming Inserts.
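> The failure in the trace can be reproduced outside Beam with a small standalone snippet (hypothetical, not taken from the pipeline above): com.google.api.services.bigquery.model.TableRow declares a java.util.List<TableCell> member named "f" (the legacy "f"/"v" row encoding), so GenericData.set routes the key "f" to that declared field via reflection instead of the generic key/value map, and a Double value is rejected:
>     import com.google.api.services.bigquery.model.TableRow;
>
>     public class TableRowFieldF {
>       public static void main(String[] args) {
>         TableRow data = new TableRow();
>         data.set("e", 675.0); // fine: ordinary keys land in the underlying GenericData map
>         data.set("f", 1.0);   // throws IllegalArgumentException: "Can not set java.util.List
>                               //   field com.google.api.services.bigquery.model.TableRow.f
>                               //   to java.lang.Double"
>       }
>     }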


