Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2015/12/09 08:35:10 UTC

[jira] [Updated] (SPARK-12236) JDBC filter tests all pass if filters are not really pushed down

     [ https://issues.apache.org/jira/browse/SPARK-12236?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-12236:
---------------------------------
    Description: 
It is similar to https://issues.apache.org/jira/browse/SPARK-11676 and https://issues.apache.org/jira/browse/SPARK-11677.

Currently, the JDBC predicate tests all pass even if the filters are not actually pushed down.

This is because Spark applies its own Filter on top of the scan, so the query results are correct whether or not the predicate reached the JDBC source.

Moreover, {{Not(Equal)}} is also being tested, even though it is not actually pushed down to the JDBC datasource.
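
For illustration, here is a minimal sketch (not the actual patch) of how such a test could be made to fail when pushdown does not happen, by checking the physical plan for a leftover Spark-side {{Filter}} node. The table name {{foobar}}, the helper name, and the exact plan node class are assumptions and vary across Spark versions:

{code:scala}
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.execution

// Hypothetical test helper: pass the DataFrame through only if its
// physical plan contains no Spark-side Filter node, i.e. the predicate
// was handled by the JDBC datasource itself. The node class is
// execution.Filter in Spark 1.x (FilterExec in later versions).
def checkFilterPushedDown(df: DataFrame): DataFrame = {
  val sparkSideFilters = df.queryExecution.executedPlan.collect {
    case f: execution.Filter => f
  }
  assert(sparkSideFilters.isEmpty,
    "Predicate was evaluated on the Spark side, not pushed down to JDBC")
  df
}

// Usage inside a test suite where sql(...) is in scope and "foobar"
// is a registered JDBC test table (both assumptions):
assert(checkFilterPushedDown(sql("SELECT * FROM foobar WHERE THEID = 1")).collect().size == 1)
{code}

A check like this would also expose the {{Not(Equal)}} case above: a query such as {{SELECT * FROM foobar WHERE THEID != 1}} would keep a Spark-side {{Filter}} as long as {{Not(Equal)}} is not compiled to a JDBC predicate.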

  was:
It is similar to https://issues.apache.org/jira/browse/SPARK-11676 and https://issues.apache.org/jira/browse/SPARK-11677.

Currently, the JDBC predicate tests all pass even if the filters are not pushed down or the pushdown is disabled.

This is because of Spark-side filtering.

Moreover, {{Not(Equal)}} is also being tested, even though it is not actually pushed down to the JDBC datasource.


> JDBC filter tests all pass if filters are not really pushed down
> ----------------------------------------------------------------
>
>                 Key: SPARK-12236
>                 URL: https://issues.apache.org/jira/browse/SPARK-12236
>             Project: Spark
>          Issue Type: Test
>          Components: SQL
>            Reporter: Hyukjin Kwon
>
> It is similar to https://issues.apache.org/jira/browse/SPARK-11676 and https://issues.apache.org/jira/browse/SPARK-11677.
> Currently, the JDBC predicate tests all pass even if the filters are not actually pushed down.
> This is because Spark applies its own Filter on top of the scan, so the query results are correct whether or not the predicate reached the JDBC source.
> Moreover, {{Not(Equal)}} is also being tested, even though it is not actually pushed down to the JDBC datasource.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
