Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/02/10 20:53:00 UTC

[jira] [Assigned] (SPARK-30759) The cache in StringRegexExpression is not initialized for foldable patterns

     [ https://issues.apache.org/jira/browse/SPARK-30759?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-30759:
-------------------------------------

    Assignee: Maxim Gekk

> The cache in StringRegexExpression is not initialized for foldable patterns
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-30759
>                 URL: https://issues.apache.org/jira/browse/SPARK-30759
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Maxim Gekk
>            Assignee: Maxim Gekk
>            Priority: Minor
>         Attachments: Screen Shot 2020-02-08 at 22.45.50.png
>
>
> For foldable patterns, the cache in StringRegexExpression should be initialized once, but in fact the pattern is compiled on every evaluation. Here is an example:
> {code:sql}
> SELECT '%SystemDrive%\Users\John' _FUNC_ '%SystemDrive%\\Users.*';
> {code}
> The code at https://github.com/apache/spark/blob/8aebc80e0e67bcb1aa300b8c8b1a209159237632/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/regexpExpressions.scala#L45-L48:
> {code:scala}
>   // try cache the pattern for Literal
>   private lazy val cache: Pattern = pattern match {
>     case Literal(value: String, StringType) => compile(value)
>     case _ => null
>   }
> {code}
> The attached screenshot shows that a foldable (but non-Literal) pattern expression doesn't fall into the first case, so the cache stays null and the pattern is recompiled for every row.
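
A minimal standalone sketch of the caching idea behind this issue (not the actual Spark fix): when the pattern is known to be constant (foldable), compile the regex once via a lazy val and reuse it; otherwise compile per call. The LikeMatcher class, its patternIsConstant flag, and the compilations counter are hypothetical names introduced here for illustration only.

{code:scala}
import java.util.regex.Pattern

// Hypothetical illustration: cache the compiled Pattern when the
// pattern string is constant, instead of recompiling on every match.
class LikeMatcher(patternIsConstant: Boolean, patternString: String) {
  private var compileCount = 0

  // Lazily compiled cache, evaluated at most once on first use.
  private lazy val cached: Pattern = compile(patternString)

  private def compile(s: String): Pattern = {
    compileCount += 1 // track compilations to make the caching visible
    Pattern.compile(s)
  }

  def matches(input: String): Boolean = {
    // Constant pattern: hit the cache. Non-constant: recompile each time,
    // which is the per-row cost the issue describes.
    val p = if (patternIsConstant) cached else compile(patternString)
    p.matcher(input).matches()
  }

  def compilations: Int = compileCount
}
{code}

With a constant pattern the regex is compiled exactly once no matter how many rows are matched; with a non-constant pattern every call pays the compilation cost again.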



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org