Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/05/17 21:36:00 UTC

[jira] [Commented] (SPARK-27217) Nested schema pruning doesn't work for aggregation e.g. `sum`.

    [ https://issues.apache.org/jira/browse/SPARK-27217?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17109669#comment-17109669 ] 

Apache Spark commented on SPARK-27217:
--------------------------------------

User 'viirya' has created a pull request for this issue:
https://github.com/apache/spark/pull/28560

> Nested schema pruning doesn't work for aggregation e.g. `sum`.
> --------------------------------------------------------------
>
>                 Key: SPARK-27217
>                 URL: https://issues.apache.org/jira/browse/SPARK-27217
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: colin fang
>            Priority: Major
>
> Now that SPARK-4502 is fixed, I would expect queries such as `select sum(b.x)` not to have to read the other nested fields.
> {code:python}
> # Build a small Parquet file with a nested struct column.
> rdd = spark.range(1000).rdd.map(lambda x: [x.id + 3, [x.id + 1, x.id - 1]])
> df = spark.createDataFrame(rdd, schema='a:int,b:struct<x:int,y:int>')
> df.repartition(1).write.mode('overwrite').parquet('test.parquet')
> df = spark.read.parquet('test.parquet')
>
> # With pruning enabled, a plain projection reads only the needed field.
> spark.conf.set('spark.sql.optimizer.nestedSchemaPruning.enabled', 'true')
> df.select('b.x').explain()
> # ReadSchema: struct<b:struct<x:int>>
>
> # With pruning disabled, the whole struct is read.
> spark.conf.set('spark.sql.optimizer.nestedSchemaPruning.enabled', 'false')
> df.select('b.x').explain()
> # ReadSchema: struct<b:struct<x:int,y:int>>
>
> # With pruning enabled, aggregating the nested field still reads the whole struct.
> spark.conf.set('spark.sql.optimizer.nestedSchemaPruning.enabled', 'true')
> df.selectExpr('sum(b.x)').explain()
> # ReadSchema: struct<b:struct<x:int,y:int>>
> {code}
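> A possible workaround sketch, not verified against this issue: project the nested field out to a top-level column first, then aggregate, and confirm the read schema with explain(). Whether the intermediate projection survives the optimizer (e.g. CollapseProject merging it back into the aggregate) is an assumption to check.
> {code:python}
> from pyspark.sql import functions as F
>
> df = spark.read.parquet('test.parquet')
>
> # Hypothetical workaround: pull b.x out as a top-level column before
> # aggregating; pruning is known to work for plain projections above.
> pruned = df.select(F.col('b.x').alias('x'))
> pruned.agg(F.sum('x')).explain()
> # If the workaround holds, the plan should show:
> #   ReadSchema: struct<b:struct<x:int>>
> # If the optimizer collapses the projection into the aggregate, the
> # full struct may still be read; check the explain() output.
> {code}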


