Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/11/21 01:30:00 UTC

[jira] [Resolved] (SPARK-29961) Implement typeof builtin function

     [ https://issues.apache.org/jira/browse/SPARK-29961?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-29961.
----------------------------------
    Fix Version/s: 3.0.0
       Resolution: Fixed

Issue resolved by pull request 26599
[https://github.com/apache/spark/pull/26599]

> Implement typeof builtin function
> ---------------------------------
>
>                 Key: SPARK-29961
>                 URL: https://issues.apache.org/jira/browse/SPARK-29961
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Kent Yao
>            Assignee: Kent Yao
>            Priority: Major
>             Fix For: 3.0.0
>
>
> Add a typeof builtin function in Spark to report the underlying data type of a value.
> {code:sql}
> -- !query 0
> select typeof(1)
> -- !query 0 schema
> struct<typeof(1):string>
> -- !query 0 output
> int
>
> -- !query 1
> select typeof(1.2)
> -- !query 1 schema
> struct<typeof(1.2):string>
> -- !query 1 output
> decimal(2,1)
>
> -- !query 2
> select typeof(array(1, 2))
> -- !query 2 schema
> struct<typeof(array(1, 2)):string>
> -- !query 2 output
> array<int>
>
> -- !query 3
> select typeof(a) from (values (1), (2), (3.1)) t(a)
> -- !query 3 schema
> struct<typeof(a):string>
> -- !query 3 output
> decimal(11,1)
> decimal(11,1)
> decimal(11,1)
> {code}
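> Note on the last query: all three rows report decimal(11,1) because the inline-table values are coerced to a least common type before typeof evaluates, so typeof reflects the resolved column type rather than each literal's original type. A minimal sketch of the same behavior (hypothetical query, assuming the same coercion rules):
> {code:sql}
> -- int and decimal literals are widened to a common decimal type
> -- before typeof sees the column, so both rows report the same type
> select typeof(a) from (values (1), (2.5)) t(a);
> {code}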
> presto
> {code:sql}
> presto> select typeof(array[1]);
>      _col0
> ----------------
>  array(integer)
> (1 row)
> {code}
> PostgreSQL
> {code:sql}
> postgres=# select pg_typeof(a) from (values (1), (2), (3.0)) t(a);
>  pg_typeof
> -----------
>  numeric
>  numeric
>  numeric
> (3 rows)
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org