Posted to reviews@spark.apache.org by "MaxGekk (via GitHub)" <gi...@apache.org> on 2023/05/05 07:22:04 UTC

[GitHub] [spark] MaxGekk commented on a diff in pull request #41059: [SPARK-42842][SQL] Update the error class _LEGACY_ERROR_TEMP_2006 to NEGATIVE_REGEX_GROUP_INDEX

MaxGekk commented on code in PR #41059:
URL: https://github.com/apache/spark/pull/41059#discussion_r1185771279


##########
core/src/main/resources/error/error-classes.json:
##########
@@ -1009,6 +1009,11 @@
           "Initial bytes from input <saltedMagic> do not match 'Salted__' (0x53616C7465645F5F)."
         ]
       },
+      "NEGATIVE_REGEX_GROUP_INDEX" : {

Review Comment:
   Can't you re-use the existing error class `REGEX_GROUP_INDEX`?



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/regexpExpressions.scala:
##########
@@ -748,7 +748,7 @@ object RegExpReplace {
 object RegExpExtractBase {
   def checkGroupIndex(prettyName: String, groupCount: Int, groupIndex: Int): Unit = {
     if (groupIndex < 0) {
-      throw QueryExecutionErrors.regexGroupIndexLessThanZeroError
+      throw QueryExecutionErrors.regexGroupIndexLessThanZeroError(prettyName)
     } else if (groupCount < groupIndex) {
       throw QueryExecutionErrors.regexGroupIndexExceedGroupCountError(
         prettyName, groupCount, groupIndex)

Review Comment:
   How about:
   ```scala
       if (groupIndex < 0 || groupCount < groupIndex) {
         throw QueryExecutionErrors.invalidRegexGroupIndexError(
           prettyName, groupCount, groupIndex)
       }
   ```
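   For illustration, a rough sketch of how the single `invalidRegexGroupIndexError` helper from the suggestion above might look in `QueryExecutionErrors`. The exception type, error-class name, and parameter keys below are assumptions for the sake of the example, not the actual Spark definitions:
   ```scala
   import org.apache.spark.SparkRuntimeException

   // Hypothetical sketch only: one helper covering both the negative-index and
   // the index-greater-than-group-count cases under a single error class.
   def invalidRegexGroupIndexError(
       funcName: String,
       groupCount: Int,
       groupIndex: Int): RuntimeException = {
     new SparkRuntimeException(
       // Assumed error-class name; the existing REGEX_GROUP_INDEX class mentioned
       // in the first comment could be reused here instead of adding a new one.
       errorClass = "INVALID_PARAMETER_VALUE.REGEX_GROUP_INDEX",
       messageParameters = Map(
         "parameter" -> "idx",
         "functionName" -> funcName,
         "groupCount" -> groupCount.toString,
         "groupIndex" -> groupIndex.toString))
   }
   ```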


