Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2022/03/24 06:30:00 UTC

[jira] [Assigned] (SPARK-38063) Support SQL split_part function

     [ https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan reassigned SPARK-38063:
-----------------------------------

    Assignee: Rui Wang

> Support SQL split_part function
> -------------------------------
>
>                 Key: SPARK-38063
>                 URL: https://issues.apache.org/jira/browse/SPARK-38063
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Rui Wang
>            Assignee: Rui Wang
>            Priority: Major
>
> `split_part()` is a function commonly supported by other systems such as Postgres. The closest Spark equivalent today is `element_at(split(arg, delim), part)`.
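> A quick sketch of that existing workaround (note that Spark's `split` treats its second argument as a Java regular expression, so a literal `.` delimiter must be escaped):
> {code:java}
> > SELECT element_at(split('11.12.13', '\\.'), 3);
> 13
> {code}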
> h5. Function Specification
> h6. Syntax
> {code:java}
> split_part(str, delimiter, partNum)
> {code}
> h6. Arguments
> {code:java}
> str: string type
> delimiter: string type
> partNum: integer type
> {code}
> h6. Note
> {code:java}
> 1. This function splits `str` by `delimiter` and returns the requested part of the split (1-based).
> 2. If any input parameter is NULL, returns NULL.
> 3. If the requested part index is out of the range of split parts, returns an empty string.
> 4. If `partNum` is 0, throws an error.
> 5. If `partNum` is negative, the parts are counted backward from the end of the string.
> 6. If `delimiter` is empty, `str` is not split; there is just one part, the whole of `str`.
> {code}
> h6. Examples
> {code:java}
> > SELECT _FUNC_('11.12.13', '.', 3);
> 13
> > SELECT _FUNC_(NULL, '.', 3);
> NULL
> > SELECT _FUNC_('11.12.13', '', 1);
> '11.12.13'
> {code}
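> For the remaining rules in the Note above, the expected behavior would look like the following (a sketch derived from rules 3-5, not from an existing implementation):
> {code:java}
> > SELECT _FUNC_('11.12.13', '.', -1);
> 13
> > SELECT _FUNC_('11.12.13', '.', 5);
> ''
> > SELECT _FUNC_('11.12.13', '.', 0);
> Error
> {code}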


