Posted to issues@spark.apache.org by "Corey J. Nolet (JIRA)" <ji...@apache.org> on 2015/01/18 02:16:34 UTC

[jira] [Commented] (SPARK-5296) Predicate Pushdown (BaseRelation) to have an interface that will accept OR filters

    [ https://issues.apache.org/jira/browse/SPARK-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14281621#comment-14281621 ] 

Corey J. Nolet commented on SPARK-5296:
---------------------------------------

The more I think about this, the more it would be nice if a tree of filters were pushed down instead of an Array. This is a significant change to the API, so it would probably be easiest to create a new class (PrunedFilteredTreeScan?).

It would probably be easiest to have AndFilter and OrFilter parent nodes that can be arbitrarily nested, with the leaf nodes being the filters that are already in use (hopefully with the addition of the NotEqualsFilter from SPARK-5306).
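
Roughly, something like the following. This is only a sketch of the idea, not working code: aside from Filter, RDD, and Row, none of these types exist in Spark today, and the node names simply follow the proposal above.

    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql.Row
    import org.apache.spark.sql.sources.Filter

    // Sketch only. Parent nodes can nest arbitrarily deep; leaves wrap the
    // filters the sources API already pushes down (EqualTo, GreaterThan, ...).
    sealed trait FilterNode
    case class AndFilter(children: Seq[FilterNode]) extends FilterNode
    case class OrFilter(children: Seq[FilterNode]) extends FilterNode
    case class LeafFilter(filter: Filter) extends FilterNode

    // A relation that receives the whole tree instead of an implicitly
    // AND-ed Array[Filter].
    trait PrunedFilteredTreeScan {
      def buildScan(requiredColumns: Array[String], filterTree: FilterNode): RDD[Row]
    }

A predicate like a = 1 OR (b > 2 AND c = 3) could then be pushed down as OrFilter(Seq(LeafFilter(EqualTo("a", 1)), AndFilter(Seq(LeafFilter(GreaterThan("b", 2)), LeafFilter(EqualTo("c", 3)))))), which a flat Array[Filter] cannot express today.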

> Predicate Pushdown (BaseRelation) to have an interface that will accept OR filters
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-5296
>                 URL: https://issues.apache.org/jira/browse/SPARK-5296
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Corey J. Nolet
>
> Currently, the BaseRelation API allows a FilteredRelation to handle an Array[Filter], which represents filter expressions that are combined with an AND operator.
> We should support OR operations in a BaseRelation as well. I'm not sure what this would look like in terms of API changes, but it almost seems like a FilteredUnionedScan BaseRelation (the name stinks, but you get the idea) would be useful.
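
For context, the limitation in the current API looks roughly like this. The sketch below follows the shape of the PrunedFilteredScan interface (signatures approximate); rowMatches and allRows are hypothetical helpers standing in for a real data source's scan logic, and sqlContext/schema are omitted by leaving the class abstract.

    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql.Row
    import org.apache.spark.sql.sources.{BaseRelation, Filter, PrunedFilteredScan}

    // Sketch only: the point is that the Array[Filter] handed to buildScan is an
    // implicit conjunction, so a source has no way to receive "a = 1 OR b = 2".
    abstract class ExampleRelation extends BaseRelation with PrunedFilteredScan {

      // Hypothetical helpers a concrete relation would implement.
      protected def allRows(requiredColumns: Array[String]): RDD[Row]
      protected def rowMatches(row: Row, filter: Filter): Boolean

      override def buildScan(requiredColumns: Array[String], filters: Array[Filter]): RDD[Row] = {
        // Every element must hold: the array is AND-ed by contract.
        allRows(requiredColumns).filter(row => filters.forall(f => rowMatches(row, f)))
      }
    }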


