Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2015/04/26 10:00:45 UTC

[jira] [Commented] (SPARK-7133) Implement struct, array, and map field accessor using apply in Scala and __getitem__ in Python

    [ https://issues.apache.org/jira/browse/SPARK-7133?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14512943#comment-14512943 ] 

Reynold Xin commented on SPARK-7133:
------------------------------------

cc [~cloud_fan] 

I'm looking at some code you wrote. Looks like both GetItem and ArrayGetField can be used to get a field out of an array. Any reason why we don't just have ArrayGetField, MapGetField, StructGetField?

It seems to me it'd be easier if we generalize UnresolvedGetField to cover all of struct, array, and map, and then during analysis rewrite it to one of ArrayGetField, MapGetField, or StructGetField.
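
To make that concrete, here's a rough sketch of the shape I mean. Everything below is a toy stand-in rather than the actual Catalyst classes, and the field/ordinal/key signatures are only illustrative:

{code}
// Toy stand-ins only -- not the real Catalyst types.
sealed trait DataType
case object StructT extends DataType
case object ArrayT  extends DataType
case object MapT    extends DataType
case object FieldT  extends DataType   // stands in for the extracted value's type

trait Expression { def dataType: DataType }

// The single node the parser / DSL always emits, regardless of the child's type.
case class UnresolvedGetField(child: Expression, extraction: Any) extends Expression {
  def dataType: DataType = sys.error("unresolved")
}

// The concrete expressions the analyzer would rewrite to.
case class StructGetField(child: Expression, name: String) extends Expression { def dataType = FieldT }
case class ArrayGetField(child: Expression, ordinal: Int)  extends Expression { def dataType = FieldT }
case class MapGetField(child: Expression, key: Any)        extends Expression { def dataType = FieldT }

// Analysis-time rewrite: once the child is resolved we know which concrete node to pick.
object ResolveGetField {
  def apply(u: UnresolvedGetField): Expression = (u.child.dataType, u.extraction) match {
    case (StructT, name: String) => StructGetField(u.child, name)
    case (ArrayT, ordinal: Int)  => ArrayGetField(u.child, ordinal)
    case (MapT, key)             => MapGetField(u.child, key)
    case (t, e)                  => sys.error(s"Can't extract $e from a value of type $t")
  }
}
{code}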

> Implement struct, array, and map field accessor using apply in Scala and __getitem__ in Python
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-7133
>                 URL: https://issues.apache.org/jira/browse/SPARK-7133
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Reynold Xin
>              Labels: starter
>
> Typing 
> {code}
> df.col[1]
> {code}
> and
> {code}
> df.col['field']
> {code}
> is so much easier than
> {code}
> df.col.getField('field')
> df.col.getItem(1)
> {code}
> This would require us to define, in Column, an apply function in Scala and a __getitem__ function in Python.
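>
> For the Scala side, a rough sketch of that apply (the string-vs-other dispatch below is only illustrative; getField and getItem are the existing Column methods shown above):
> {code}
> import org.apache.spark.sql.Column
>
> // Sketch only: the real change would put this apply on Column itself.
> // Dispatching on String is fine for struct fields but too naive for map
> // columns with string keys; it is just to show the surface syntax.
> object ColumnApplySketch {
>   def apply(col: Column, extraction: Any): Column = extraction match {
>     case fieldName: String => col.getField(fieldName) // struct field by name
>     case other             => col.getItem(other)      // array ordinal or map key
>   }
> }
> {code}
> With apply defined on Column, df("a")("b") and df("a")(0) would mean the same thing as the getField / getItem calls above.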


