Posted to reviews@spark.apache.org by "MaxGekk (via GitHub)" <gi...@apache.org> on 2024/02/16 12:42:06 UTC

[PR] [SPARK-47072][SQL][3.5] Fix supported interval formats in error messages [spark]

MaxGekk opened a new pull request, #45139:
URL: https://github.com/apache/spark/pull/45139

   ### What changes were proposed in this pull request?
   In the PR, I propose to add one more field to the keys of `supportedFormat` in `IntervalUtils`, because the current implementation has duplicate keys that overwrite each other. For instance, the following keys are identical:
   ```
   (YM.YEAR, YM.MONTH)
   ...
   (DT.DAY, DT.HOUR)
   ```
   because `YM.YEAR = DT.DAY = 0` and `YM.MONTH = DT.HOUR = 1`.
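
   The collision described above can be sketched outside of Spark. This is a minimal Python illustration (not Spark's actual Scala code, which lives in `IntervalUtils`): fields of the year-month and day-time enums reuse the same ordinal values, so two-field `(startField, endField)` keys hash and compare equal across the two interval styles, and the later map entry silently replaces the earlier one. The `"YM"`/`"DT"` style tag in the fixed map is a stand-in for the extra field the patch adds.
   ```python
   from enum import IntEnum

   class YM(IntEnum):   # year-month interval fields
       YEAR = 0
       MONTH = 1

   class DT(IntEnum):   # day-time interval fields
       DAY = 0
       HOUR = 1

   # Old-style keys: only (startField, endField). IntEnum members hash and
   # compare by their int value, so (YM.YEAR, YM.MONTH) == (DT.DAY, DT.HOUR)
   # and the second entry overwrites the first.
   supported_format = {
       (YM.YEAR, YM.MONTH): "`[+|-]y-m`, INTERVAL [+|-]'[+|-]y-m' YEAR TO MONTH",
       (DT.DAY, DT.HOUR): "`[+|-]d h`, INTERVAL [+|-]'[+|-]d h' DAY TO HOUR",
   }
   print(len(supported_format))                  # 1, not 2
   # Looking up the year-month key now yields the day-time formats,
   # which is exactly the wrong error message shown below.

   # Fixed-style keys: a third field disambiguates the interval style.
   fixed_format = {
       ("YM", YM.YEAR, YM.MONTH): "[+|-]y-m",
       ("DT", DT.DAY, DT.HOUR): "[+|-]d h",
   }
   print(len(fixed_format))                      # 2
   ```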
   
   This is a backport of https://github.com/apache/spark/pull/45127.
   
   ### Why are the changes needed?
   To fix the incorrect error message shown when Spark cannot parse an ANSI interval string. For example, the expected format should be a year-month format, but Spark outputs a day-time one:
   ```sql
   spark-sql (default)> select interval '-\t2-2\t' year to month;
   
   Interval string does not match year-month format of `[+|-]d h`, `INTERVAL [+|-]'[+|-]d h' DAY TO HOUR` when cast to interval year to month: -	2-2	. (line 1, pos 16)
   
   == SQL ==
   select interval '-\t2-2\t' year to month
   ----------------^^^
   ```
   
   ### Does this PR introduce _any_ user-facing change?
   Yes, the error message for an unparsable ANSI interval string now shows the supported formats of the correct interval style.
   
   ### How was this patch tested?
   By running the existing test suite:
   ```
   $ build/sbt "test:testOnly *IntervalUtilsSuite"
   ```
   and regenerating the golden files:
   ```
   $ SPARK_GENERATE_GOLDEN_FILES=1 PYSPARK_PYTHON=python3 build/sbt "sql/testOnly org.apache.spark.sql.SQLQueryTestSuite"
   ```
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No.
   
   Authored-by: Max Gekk <ma...@gmail.com>
   (cherry picked from commit 074fcf2807000d342831379de0fafc1e49a6bf19)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


Re: [PR] [SPARK-47072][SQL][3.5] Fix supported interval formats in error messages [spark]

Posted by "MaxGekk (via GitHub)" <gi...@apache.org>.
MaxGekk commented on PR #45139:
URL: https://github.com/apache/spark/pull/45139#issuecomment-1951851380

   Merging to 3.5. Thank you, @cloud-fan for review.




Re: [PR] [SPARK-47072][SQL][3.5] Fix supported interval formats in error messages [spark]

Posted by "MaxGekk (via GitHub)" <gi...@apache.org>.
MaxGekk closed pull request #45139: [SPARK-47072][SQL][3.5] Fix supported interval formats in error messages
URL: https://github.com/apache/spark/pull/45139

