Posted to issues@beam.apache.org by "ASF GitHub Bot (Jira)" <ji...@apache.org> on 2022/03/10 17:43:00 UTC

[jira] [Work logged] (BEAM-13990) BigQueryIO cannot write to DATE and TIMESTAMP columns when using Storage Write API

     [ https://issues.apache.org/jira/browse/BEAM-13990?focusedWorklogId=739636&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-739636 ]

ASF GitHub Bot logged work on BEAM-13990:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 10/Mar/22 17:42
            Start Date: 10/Mar/22 17:42
    Worklog Time Spent: 10m 
      Work Description: aaltay commented on pull request #16926:
URL: https://github.com/apache/beam/pull/16926#issuecomment-1064327307


   @liu-du - Could you please address the failing tests?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


Issue Time Tracking
-------------------

            Worklog Id:     (was: 739636)
    Remaining Estimate: 108h 40m  (was: 108h 50m)
            Time Spent: 11h 20m  (was: 11h 10m)

> BigQueryIO cannot write to DATE and TIMESTAMP columns when using Storage Write API 
> -----------------------------------------------------------------------------------
>
>                 Key: BEAM-13990
>                 URL: https://issues.apache.org/jira/browse/BEAM-13990
>             Project: Beam
>          Issue Type: Improvement
>          Components: io-java-gcp
>    Affects Versions: 2.36.0
>            Reporter: Du Liu
>            Assignee: Du Liu
>            Priority: P2
>   Original Estimate: 120h
>          Time Spent: 11h 20m
>  Remaining Estimate: 108h 40m
>
> When using the Storage Write API with BigQueryIO, DATE and TIMESTAMP values are currently converted to String fields in the protobuf message. This is incorrect: according to the Storage Write API [documentation|#data_type_conversions], DATE should be converted to int32 and TIMESTAMP should be converted to int64.
> Here's the error message:
> INFO: Stream finished with error com.google.api.gax.rpc.InvalidArgumentException: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: The proto field mismatched with BigQuery field at D6cbe536b_4dab_4292_8fda_ff2932dded49.datevalue, the proto field type string, BigQuery field type DATE Entity
> I have included an integration test here: [https://github.com/liu-du/beam/commit/b56823d1d213adf6ca5564ce1d244cc4ae8f0816]
>  
> The problem is that DATE and TIMESTAMP are converted to String in the protobuf message here:
> [https://github.com/apache/beam/blob/a78fec72d0d9198eef75144a7bdaf93ada5abf9b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/TableRowToStorageApiProto.java#L69]
>  
> The Storage Write API rejects the request because it expects int32/int64 values.
>  
> I've opened a PR here: https://github.com/apache/beam/pull/16926
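
The conversion the issue describes can be sketched as below. This is a hypothetical standalone helper (class and method names are illustrative, not Beam's actual `TableRowToStorageApiProto` code), showing the int32/int64 encodings the Storage Write API expects: DATE as days since the Unix epoch, TIMESTAMP as microseconds since the Unix epoch.

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

// Hypothetical sketch of the proto field encodings the Storage Write API
// expects for DATE and TIMESTAMP columns:
//   DATE      -> int32: days since the Unix epoch (1970-01-01)
//   TIMESTAMP -> int64: microseconds since the Unix epoch
public class StorageApiTypeConversion {

    // DATE column: encode as the number of epoch days.
    static int dateToInt32(LocalDate date) {
        return (int) date.toEpochDay();
    }

    // TIMESTAMP column: encode as microseconds since the epoch.
    static long timestampToInt64(Instant ts) {
        return ChronoUnit.MICROS.between(Instant.EPOCH, ts);
    }

    public static void main(String[] args) {
        System.out.println(dateToInt32(LocalDate.of(2022, 3, 10)));
        System.out.println(timestampToInt64(Instant.parse("2022-03-10T17:42:00Z")));
    }
}
```

Writing these integer values into the proto message, instead of formatted strings, avoids the INVALID_ARGUMENT type-mismatch error quoted above.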



--
This message was sent by Atlassian Jira
(v8.20.1#820001)