Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/03/16 19:09:46 UTC

[GitHub] [spark] amaliujia edited a comment on pull request #35352: [SPARK-38063][SQL] Support split_part Function

amaliujia edited a comment on pull request #35352:
URL: https://github.com/apache/spark/pull/35352#issuecomment-1069509046


   > Do we need a special function for this rather than simply composing as you say? I get that some other DB does it, but is it at all standard?
   
   The value of this special function is:
   
   The current `split` treats the split pattern as a regex, while a common case is to use the pattern as a literal delimiter. So when people compose with `split`, they always have to remember to quote the whole pattern (so that no character in it carries special regex meaning).
   
   With `split_part`, they are free to pass any delimiter without worrying whether it contains `*`, `.`, etc.
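   To make the escaping pitfall concrete, here is a minimal sketch in plain Python (not Spark itself); the `split_part` function below is a hypothetical stand-in for the proposed SQL function, shown only to illustrate the regex-vs-literal difference:
   
   ```python
   import re
   
   s = "2022.03.16"
   
   # Regex-based split (analogous to Spark's `split`): "." is a regex
   # metacharacter, so every character matches and all fields come back empty.
   naive = re.split(".", s)
   
   # The caller must escape the pattern to get the intended behavior.
   escaped = re.split(re.escape("."), s)  # ["2022", "03", "16"]
   
   # Literal-delimiter split (the convenience `split_part` offers):
   # no escaping to think about. This helper is a hypothetical sketch.
   def split_part(value, delimiter, part_num):
       """Return the part_num-th (1-based) field after splitting on a literal delimiter."""
       return value.split(delimiter)[part_num - 1]
   ```
   
   With the literal-delimiter version, `split_part(s, ".", 2)` simply returns `"03"`, while the regex route silently misbehaves until the pattern is quoted.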


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org