Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/03/17 22:33:31 UTC

[GitHub] [spark] huaxingao commented on pull request #29695: [SPARK-22390][SPARK-32833][SQL] [WIP]JDBC V2 Datasource aggregate push down

huaxingao commented on pull request #29695:
URL: https://github.com/apache/spark/pull/29695#issuecomment-801485332


   > For the simple PostgreSQL JDBC case, if Spark queries decimal values which have a large scale/precision in PostgreSQL, how does Spark handle this? Does it cast the values to fit Spark's scale/precision?
   
   I did a quick test using H2. Spark actually throws an exception if the underlying database returns a larger precision than Spark's MAX_PRECISION, which is 38.
   Is this the correct behavior?
   ```
   conn.prepareStatement(
     "CREATE TABLE \"test\".\"test_decimal\" (C1 DECIMAL(45, 30), C2 INTEGER NOT NULL)")
     .executeUpdate()
   conn.prepareStatement("INSERT INTO \"test\".\"test_decimal\" VALUES " +
     "(123456789012345.5432154321543215432154321, 1)").executeUpdate()

   sql("SELECT C1, C2 FROM h2.test.test_decimal").show(false)
   ```
   The test failed with:
   ```
   java.lang.ArithmeticException: Decimal precision 45 exceeds max precision 38
   ```
   
   Here is the relevant code. Spark first maps the JDBC DECIMAL to a Spark DecimalType via `DecimalType.bounded`:
   ```
     private def getCatalystType(
         sqlType: Int,
         precision: Int,
         scale: Int,
         signed: Boolean): DataType = {
   
      // ...
   
         case java.sql.Types.DECIMAL
           if precision != 0 || scale != 0 => DecimalType.bounded(precision, scale)
   ```

   `bounded` caps both precision and scale at the maximum of 38, so the column's DECIMAL(45, 30) becomes DecimalType(38, 30):
   ```
     private[sql] def bounded(precision: Int, scale: Int): DecimalType = {
       DecimalType(min(precision, MAX_PRECISION), min(scale, MAX_SCALE))
     }
   ```
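   As a quick illustration of the capping, here is a minimal standalone sketch (plain Scala mirroring `bounded` above, not calling Spark's classes) for the DECIMAL(45, 30) test column:
   ```
   import scala.math.min

   // Same bounds as Spark's DecimalType.MAX_PRECISION / MAX_SCALE.
   val MAX_PRECISION = 38
   val MAX_SCALE = 38

   // bounded(45, 30) caps the declared precision/scale of the column.
   val precision = min(45, MAX_PRECISION) // 38
   val scale = min(30, MAX_SCALE)         // 30
   println(s"DecimalType($precision, $scale)") // DecimalType(38, 30)
   ```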
   Only the declared type is capped, though; the value itself keeps its full precision. Spark then throws the exception in `Decimal.set`: after `setScale(30, ROUND_HALF_UP)` the value has 15 integer digits plus 30 fraction digits, i.e. precision 45, which fails the check below:
   ```
     def set(decimal: BigDecimal, precision: Int, scale: Int): Decimal = {
       DecimalType.checkNegativeScale(scale)
       this.decimalVal = decimal.setScale(scale, ROUND_HALF_UP)
       if (decimalVal.precision > precision) {
         throw new ArithmeticException(
           s"Decimal precision ${decimalVal.precision} exceeds max precision $precision")
       }
   ```
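   For reference, the failing check can be reproduced outside Spark with plain `java.math.BigDecimal` (a minimal sketch assuming the capped DecimalType(38, 30) from above; the variable names are illustrative, not Spark internals):
   ```
   import java.math.{BigDecimal => JBigDecimal, RoundingMode}

   // The value inserted into the H2 table above.
   val value = new JBigDecimal("123456789012345.5432154321543215432154321")

   // Capped type from DecimalType.bounded(45, 30): precision 38, scale 30.
   val maxPrecision = 38
   val targetScale = 30

   // setScale pads the fraction to 30 digits, so the unscaled value now has
   // 15 integer digits + 30 fraction digits = precision 45.
   val scaled = value.setScale(targetScale, RoundingMode.HALF_UP)
   assert(scaled.precision == 45)

   // The same check as in Decimal.set above: 45 > 38, so this throws.
   if (scaled.precision > maxPrecision) {
     throw new ArithmeticException(
       s"Decimal precision ${scaled.precision} exceeds max precision $maxPrecision")
   }
   ```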
   
   

