Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2021/11/17 18:25:46 UTC

[GitHub] [beam] kennknowles commented on a change in pull request #15987: [BEAM-13242] Allow values with smaller precision and scale for FixedP…

kennknowles commented on a change in pull request #15987:
URL: https://github.com/apache/beam/pull/15987#discussion_r751524811



##########
File path: sdks/java/io/jdbc/src/main/java/org/apache/beam/sdk/io/jdbc/LogicalTypes.java
##########
@@ -266,8 +266,8 @@ private FixedPrecisionNumeric(
     @Override
     public BigDecimal toInputType(BigDecimal base) {
       checkArgument(
-          base == null || (base.precision() == precision && base.scale() == scale),
-          "Expected BigDecimal base to be null or have precision = %s (was %s), scale = %s (was %s)",
+          base == null || (base.precision() <= precision && base.scale() <= scale),

Review comment:
      Incidentally, it took me a minute to understand what is happening here because I don't know why it is called `base`. This is converting the value returned by JDBC into the value expected by the Beam schema, right?
   
   I am having trouble wrapping my head around all the cases here, or coming up with the simple math. The inefficient but simple check here is `base.round(new MathContext(precision)).compareTo(base) == 0`.
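   
   For concreteness, here is a tiny standalone sketch of that round-trip check (the class and method names are just illustrative, not anything from this PR):
   
   ```java
   import java.math.BigDecimal;
   import java.math.MathContext;
   
   public class PrecisionCheckSketch {
     // A value "fits" the declared precision if rounding it to that many
     // significant digits does not change it.
     static boolean fitsPrecision(BigDecimal base, int precision) {
       return base.round(new MathContext(precision)).compareTo(base) == 0;
     }
   
     public static void main(String[] args) {
       System.out.println(fitsPrecision(new BigDecimal("123.45"), 5)); // true
       System.out.println(fitsPrecision(new BigDecimal("123.45"), 4)); // false: would round to 123.5
       System.out.println(fitsPrecision(new BigDecimal("1.000"), 1));  // true: compareTo ignores scale
     }
   }
   ```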
   
   If the scales are equal, then a lower precision is certainly fine.
   
   But you could even have "1000 x 10^-3" (precision 4, scale 3), and it can safely coerce to "1 x 10^0" (precision 1, scale 0). Am I getting this wrong?
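   
   To make that concrete (nothing here is from the PR; `stripTrailingZeros` is just one way to express that coercion):
   
   ```java
   import java.math.BigDecimal;
   
   public class CoercionSketch {
     public static void main(String[] args) {
       BigDecimal value = new BigDecimal("1.000"); // 1000 x 10^-3
       System.out.println(value.precision() + "/" + value.scale()); // prints 4/3
   
       BigDecimal coerced = value.stripTrailingZeros(); // 1 x 10^0
       System.out.println(coerced.precision() + "/" + coerced.scale()); // prints 1/0
   
       // Numerically nothing is lost:
       System.out.println(value.compareTo(coerced)); // prints 0
     }
   }
   ```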
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org