Posted to issues@spark.apache.org by "Feng Yuan (JIRA)" <ji...@apache.org> on 2016/12/30 07:09:58 UTC
[jira] [Updated] (SPARK-19035) nested functions in a CASE WHEN
statement fail
[ https://issues.apache.org/jira/browse/SPARK-19035?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Feng Yuan updated SPARK-19035:
------------------------------
Description:
In this case:
select
case when a=1 then 1 else concat(a,cast(rand() as string)) end b,count(1)
from
yuanfeng1_a
group by
case when a=1 then 1 else concat(a,cast(rand() as string)) end;
This throws the error:
Error in query: expression 'yuanfeng1_a.`a`' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;;
Aggregate [CASE WHEN (a#2075 = 1) THEN cast(1 as string) ELSE concat(cast(a#2075 as string), cast(rand(519367429988179997) as string)) END], [CASE WHEN (a#2075 = 1) THEN cast(1 as string) ELSE concat(cast(a#2075 as string), cast(rand(8090243936131101651) as string)) END AS b#2074]
+- MetastoreRelation default, yuanfeng1_a
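Note the two different seeds for rand() in the plan above (519367429988179997 vs 8090243936131101651): the analyzer instantiates the non-deterministic rand() separately in the SELECT list and the GROUP BY clause, so the two CASE WHEN expressions are no longer semantically equal and the grouping check on `a` fails. A possible workaround (a sketch, not a confirmed fix) is to evaluate the expression once in a subquery and group by the materialized column:

```sql
-- Hypothetical workaround: compute the non-deterministic expression once
-- in an inner query, then group by the resulting column, so the analyzer
-- only sees a single rand() instance.
SELECT b, count(1)
FROM (
  SELECT case when a=1 then 1
              else concat(a, cast(rand() as string)) end AS b
  FROM yuanfeng1_a
) t
GROUP BY b;
```

This restructuring sidesteps the semantic-equality comparison between the SELECT and GROUP BY copies of the expression.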
was:
In this case:
select case when a=1 then 1 else concat(a,cast(rand() as string)) end b,count(1) from yuanfeng1_a group by case when a=1 then 1 else concat(a,cast(rand() as string)) end;
This throws the error:
Error in query: expression 'yuanfeng1_a.`a`' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;;
Aggregate [CASE WHEN (a#2075 = 1) THEN cast(1 as string) ELSE concat(cast(a#2075 as string), cast(rand(519367429988179997) as string)) END], [CASE WHEN (a#2075 = 1) THEN cast(1 as string) ELSE concat(cast(a#2075 as string), cast(rand(8090243936131101651) as string)) END AS b#2074]
+- MetastoreRelation default, yuanfeng1_a
spark-sql> select case when a=1 then 1 else concat(a,cast(rand() as string)) end b,count(1) from yuanfeng1_a group by case when a=1 then 1 else concat(a,cast(rand() as string)) end;
16/12/30 15:05:55 INFO execution.SparkSqlParser: Parsing command: select case when a=1 then 1 else concat(a,cast(rand() as string)) end b,count(1) from yuanfeng1_a group by case when a=1 then 1 else concat(a,cast(rand() as string)) end
16/12/30 15:05:55 INFO parser.CatalystSqlParser: Parsing command: int
Error in query: expression 'yuanfeng1_a.`a`' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;;
Aggregate [CASE WHEN (a#2077 = 1) THEN cast(1 as string) ELSE concat(cast(a#2077 as string), cast(rand(-8113865568189974672) as string)) END], [CASE WHEN (a#2077 = 1) THEN cast(1 as string) ELSE concat(cast(a#2077 as string), cast(rand(-824889479508647173) as string)) END AS b#2076, count(1) AS count(1)#2079L]
+- MetastoreRelation default, yuanfeng1_a
> nested functions in a CASE WHEN statement fail
> ----------------------------------------------
>
> Key: SPARK-19035
> URL: https://issues.apache.org/jira/browse/SPARK-19035
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0, 2.0.1, 2.0.2
> Reporter: Feng Yuan
>
> In this case:
> select
> case when a=1 then 1 else concat(a,cast(rand() as string)) end b,count(1)
> from
> yuanfeng1_a
> group by
> case when a=1 then 1 else concat(a,cast(rand() as string)) end;
> This throws the error:
> Error in query: expression 'yuanfeng1_a.`a`' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;;
> Aggregate [CASE WHEN (a#2075 = 1) THEN cast(1 as string) ELSE concat(cast(a#2075 as string), cast(rand(519367429988179997) as string)) END], [CASE WHEN (a#2075 = 1) THEN cast(1 as string) ELSE concat(cast(a#2075 as string), cast(rand(8090243936131101651) as string)) END AS b#2074]
> +- MetastoreRelation default, yuanfeng1_a
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org