Posted to github@beam.apache.org by "bastewart (via GitHub)" <gi...@apache.org> on 2023/01/31 13:50:14 UTC

[GitHub] [beam] bastewart opened a new issue, #25227: [Bug]: BigQuery Storage Write API does not fully obey `ignoreUnknownValues`

bastewart opened a new issue, #25227:
URL: https://github.com/apache/beam/issues/25227

   ### What happened?
   
   Writing to BigQuery with `BigQueryIO` in `STORAGE_WRITE_API` mode leads to loss of rows if any individual datapoint in a row is invalid, even when using `ignoreUnknownValues`. The rows are instead piped to the failed-output collection and dropped entirely.
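   
   For reference, a rough sketch of the kind of write I'm doing (the table name and schema here are placeholders, and I'm writing this from memory, so treat the exact chain of options as approximate):
   
   ```
   import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
   import org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError;
   import org.apache.beam.sdk.io.gcp.bigquery.WriteResult;
   import org.apache.beam.sdk.values.PCollection;
   
   // Sketch only: "my_project:my_dataset.my_table" and tableSchema are placeholders.
   WriteResult result =
       rows.apply(
           BigQueryIO.writeTableRows()
               .to("my_project:my_dataset.my_table")
               .withSchema(tableSchema)
               .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API)
               .ignoreUnknownValues());
   
   // Rows that fail conversion are routed here and are otherwise dropped.
   PCollection<BigQueryStorageApiInsertError> failed = result.getFailedStorageApiInserts();
   ```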
   
   This is related to another issue; together they mean it's quite common (at least in my use-case!) for rows to be lost.
   
   As an example, I get the following error for an optional field even with `ignoreUnknownValues` set:
   
   ```
   Exception: org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto$SchemaDoesntMatchException: Unexpected value :36183.0, type: class java.lang.Double. Table field name: <snip>, type: INT64'}
   ```
   
   I think this could be a relatively easy fix: instead of throwing a `SchemaDoesntMatchException` at the end of `singularFieldToProtoValue`, we could return `null` if `FieldDescriptor.isOptional` is true.
   
   Similarly, some other `throw`s would need to change to check whether the field is optional, e.g.: https://github.com/apache/beam/blob/634b0453469b66ee4c135aca48b02d2425916f36/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/TableRowToStorageApiProto.java#L692-L697
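   
   Very roughly, something like this (illustrative only; `tryConvert` is a made-up helper and this isn't the real structure of the method):
   
   ```
   // Illustrative sketch only; tryConvert is a hypothetical helper, not the real code.
   Object converted = tryConvert(value, fieldDescriptor);
   if (converted == null) {
     if (fieldDescriptor.isOptional()) {
       return null; // leave the optional field unset rather than failing the whole row
     }
     // required field: keep the current behaviour and throw
     throw new SchemaDoesntMatchException(
         "Unexpected value " + value + " for required field " + fieldDescriptor.getName());
   }
   return converted;
   ```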
   
   Apologies if this is not strictly a bug, but I would classify it as "unexpected behaviour", having moved from the Streaming API, where this behaviour is supported.
   
   I'd be happy to open a PR, but I've never contributed here before and Java isn't my primary language, so it may be a little slow...
   
   ### Issue Priority
   
   Priority: 2 (default / most bugs should be filed as P2)
   
   ### Issue Components
   
   - [ ] Component: Python SDK
   - [X] Component: Java SDK
   - [ ] Component: Go SDK
   - [ ] Component: Typescript SDK
   - [X] Component: IO connector
   - [ ] Component: Beam examples
   - [ ] Component: Beam playground
   - [ ] Component: Beam katas
   - [ ] Component: Website
   - [ ] Component: Spark Runner
   - [ ] Component: Flink Runner
   - [ ] Component: Samza Runner
   - [ ] Component: Twister2 Runner
   - [ ] Component: Hazelcast Jet Runner
   - [ ] Component: Google Cloud Dataflow Runner



[GitHub] [beam] reuvenlax commented on issue #25227: [Bug]: BigQuery Storage Write API does not fully obey `ignoreUnknownValues`

Posted by "reuvenlax (via GitHub)" <gi...@apache.org>.
reuvenlax commented on issue #25227:
URL: https://github.com/apache/beam/issues/25227#issuecomment-1420113254

   ignoreUnknownValues is defined to trigger if an unknown field name is referenced, not if there is a type conversion error. 
   
   Is there an ask to have a mode that simply nulls out invalid fields (if the field is nullable) and continues to write the record? That seems like a reasonable feature request, though I think ignoreInvalidValues is a more appropriate name.
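   
   Roughly the distinction (schema and field names below are just for illustration):
   
   ```
   import com.google.api.services.bigquery.model.TableRow;
   
   // Suppose the table schema has a single INT64 column "count".
   TableRow unknownField = new TableRow()
       .set("count", 5L)
       .set("nickname", "bob"); // unknown column name: dropped when ignoreUnknownValues is set
   
   TableRow wrongType = new TableRow()
       .set("count", 36183.0);  // known column but wrong type: a conversion error, not covered
   ```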




[GitHub] [beam] Abacn commented on issue #25227: [Bug]: BigQuery Storage Write API does not fully obey `ignoreUnknownValues`

Posted by "Abacn (via GitHub)" <gi...@apache.org>.
Abacn commented on issue #25227:
URL: https://github.com/apache/beam/issues/25227#issuecomment-1414578287

   Agreed that the behavior should be consistent among the different write methods. @slilichenko @apilloud @ahmedabu98 @lukecwik, who were involved in #24366, for opinions.




[GitHub] [beam] bastewart closed issue #25227: [Bug]: BigQuery Storage Write API does not fully obey `ignoreUnknownValues`

Posted by "bastewart (via GitHub)" <gi...@apache.org>.
bastewart closed issue #25227: [Bug]: BigQuery Storage Write API does not fully obey `ignoreUnknownValues`
URL: https://github.com/apache/beam/issues/25227




[GitHub] [beam] lukecwik commented on issue #25227: [Bug]: BigQuery Storage Write API does not fully obey `ignoreUnknownValues`

Posted by "lukecwik (via GitHub)" <gi...@apache.org>.
lukecwik commented on issue #25227:
URL: https://github.com/apache/beam/issues/25227#issuecomment-1419511585

   Doing what you suggest makes sense, but we'll also need to fill in all the type conversions that we should be doing; otherwise users who had failures in the past will now have successes with missing data.
   
   @reuvenlax is there a type map that we need to support for storage write API conversions before we can enable `ignoreUnknownValues` fully here?
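   
   For the INT64 case in the original report, I mean conversions roughly like this (a sketch of the kind of coercion, not what the code does today):
   
   ```
   // Sketch of the kind of lenient coercion being discussed, not existing behaviour.
   Object value = 36183.0; // what the TableRow held for an INT64 column
   Long coerced = null;
   if (value instanceof Double && (Double) value == Math.floor((Double) value)) {
     coerced = ((Double) value).longValue(); // 36183.0 -> 36183L
   }
   ```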




[GitHub] [beam] bastewart commented on issue #25227: [Bug]: BigQuery Storage Write API does not fully obey `ignoreUnknownValues`

Posted by "bastewart (via GitHub)" <gi...@apache.org>.
bastewart commented on issue #25227:
URL: https://github.com/apache/beam/issues/25227#issuecomment-1415816086

   Thank you!
   
   I've got [a branch in my fork](https://github.com/bastewart/beam/tree/improve-bigquery-storage-write-api-type-conversions) which resolves this as well as #25228. It's probably not up to scratch, but I can open a PR if it would help?
   
   Fix for this issue: https://github.com/bastewart/beam/commit/ef09d344678c7e6b3dc21c2492147ce7fd327e55
   Fix for #25228: https://github.com/bastewart/beam/commit/38601213f81896444c60dd9e590f8a795358d09a
   
   




[GitHub] [beam] bastewart commented on issue #25227: [Bug]: BigQuery Storage Write API does not fully obey `ignoreUnknownValues`

Posted by "bastewart (via GitHub)" <gi...@apache.org>.
bastewart commented on issue #25227:
URL: https://github.com/apache/beam/issues/25227#issuecomment-1443603644

   > ignoreUnknownValues is defined to trigger if an unknown field name is referenced, not if there is a type conversion error.
   > 
   > Is there an ask to have a mode that simply nulls out invalid fields (if the field is nullable) and continues to write the record? That seems like a reasonable feature request, though I think ignoreInvalidValues is a more-appropriate name.
   
   Sorry, you're absolutely right! I was slightly confused about the behaviour of BigQuery in this instance: `ignoreUnknownValues` only skips unrecognised field names, and does not skip invalid data points.
   
   The real issue I'm running into is #25228. As @lukecwik says it's a type conversion problem.
   
   I'll close this issue since, as you say, asking for `ignoreInvalidValues` is a separate feature request. It's also not a feature that BQ itself offers 👍

