Posted to issues@spark.apache.org by "Huaxin Gao (Jira)" <ji...@apache.org> on 2021/08/24 06:03:00 UTC

[jira] [Updated] (SPARK-36555) DS V2 Filter support

     [ https://issues.apache.org/jira/browse/SPARK-36555?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Huaxin Gao updated SPARK-36555:
-------------------------------
    Description: 
The motivation for adding DSV2 filters:
The values in V1 filters are Scala types. When translating a Catalyst Expression to a V1 filter, we have to call convertToScala to convert from the Catalyst types used internally in rows to standard Scala types, and later convert those Scala types back to Catalyst types. This is very inefficient. In V2 filters, we use Expression for filter values, so the conversions from Catalyst types to Scala types and back are avoided.

  was:Currently, DS V2 still uses V1 filters. We need to add V2 filters and use them in the V2 code path.


> DS V2 Filter support
> --------------------
>
>                 Key: SPARK-36555
>                 URL: https://issues.apache.org/jira/browse/SPARK-36555
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Huaxin Gao
>            Priority: Major
>
> The motivation for adding DSV2 filters:
> The values in V1 filters are Scala types. When translating a Catalyst Expression to a V1 filter, we have to call convertToScala to convert from the Catalyst types used internally in rows to standard Scala types, and later convert those Scala types back to Catalyst types. This is very inefficient. In V2 filters, we use Expression for filter values, so the conversions from Catalyst types to Scala types and back are avoided.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org