Posted to issues@spark.apache.org by "Huaxin Gao (Jira)" <ji...@apache.org> on 2019/10/07 18:55:00 UTC

[jira] [Comment Edited] (SPARK-22390) Aggregate push down

    [ https://issues.apache.org/jira/browse/SPARK-22390?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16946140#comment-16946140 ] 

Huaxin Gao edited comment on SPARK-22390 at 10/7/19 6:54 PM:
-------------------------------------------------------------

I am slowly catching up with the changes in Data Source V2 and trying to fit aggregate push down in there. It seems JDBC support is not in Data Source V2 yet, so if I put aggregate push down in V2, I don't have a data source to test the implementation against. I guess all I can do is follow the pushFilters pattern in DataSourceV2Suite and implement a simple aggregate push down in AdvancedDataSourceV2, similar to the GreaterThan filter. Any suggestions? [~smilegator] [~holden]
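For context, here is a rough, self-contained Scala sketch of the idea described above: a push-down mix-in modeled loosely on SupportsPushDownFilters, plus a toy in-memory source that honors a pushed aggregate the way AdvancedDataSourceV2 honors a pushed GreaterThan filter. The names below (PushedAggregate, SupportsPushDownAggregatesSketch, ToyAggregatingScan) are hypothetical illustrations, not Spark APIs, and nothing here extends the real Data Source V2 interfaces.

// Hypothetical sketch only; these types are not part of Spark.
// A pushed-down aggregate description.
sealed trait PushedAggregate
case class Count(column: String) extends PushedAggregate
case class Max(column: String) extends PushedAggregate

// Mix-in a scan builder could implement, by analogy with SupportsPushDownFilters:
// return the aggregates the source could NOT handle; the rest are considered pushed.
trait SupportsPushDownAggregatesSketch {
  def pushAggregates(aggregates: Seq[PushedAggregate]): Seq[PushedAggregate]
}

// Toy "source": rows are (i, -i), like the test data used in DataSourceV2Suite.
class ToyAggregatingScan extends SupportsPushDownAggregatesSketch {
  private val rows: Seq[(Int, Int)] = (0 until 10).map(i => (i, -i))
  private var pushed: Seq[PushedAggregate] = Nil

  override def pushAggregates(aggregates: Seq[PushedAggregate]): Seq[PushedAggregate] = {
    val (supported, unsupported) = aggregates.partition {
      case Count("i") | Max("i") => true
      case _                     => false
    }
    pushed = supported
    unsupported // Spark would still have to evaluate these on top of the scan.
  }

  // Instead of returning raw rows, return the pre-aggregated result when something was pushed.
  def scan(): Seq[Int] = pushed match {
    case Seq(Count("i")) => Seq(rows.size)
    case Seq(Max("i"))   => Seq(rows.map(_._1).max)
    case _               => rows.map(_._1) // nothing pushed: full scan
  }
}

object ToyAggregatePushDownDemo extends App {
  val scan = new ToyAggregatingScan
  val leftover = scan.pushAggregates(Seq(Max("i")))
  assert(leftover.isEmpty)   // the toy source accepted the whole aggregate
  println(scan.scan())       // List(9): the source computed max(i) itself
}

A real test in DataSourceV2Suite would presumably run a query such as select(max('i)) against the source and then inspect the physical plan to confirm the aggregate was pushed, mirroring how the existing filter push-down tests check pushed filters.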



> Aggregate push down
> -------------------
>
>                 Key: SPARK-22390
>                 URL: https://issues.apache.org/jira/browse/SPARK-22390
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Wenchen Fan
>            Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org