Posted to issues@spark.apache.org by "Dominic Ricard (JIRA)" <ji...@apache.org> on 2016/04/15 21:33:25 UTC
[jira] [Comment Edited] (SPARK-14666) Using DISTINCT on a UDF (like CONCAT) is not supported
[ https://issues.apache.org/jira/browse/SPARK-14666?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15243455#comment-15243455 ]
Dominic Ricard edited comment on SPARK-14666 at 4/15/16 7:33 PM:
-----------------------------------------------------------------
It appears that the ORDER BY clause expects the column alias rather than the expression. The working query looks like this:
{noformat}
select
distinct concat(column_1, ' : ', column_2) as `column1`
from
table
order by
`column1`;
{noformat}
Why isn't concat(column_1, ' : ', column_2) resolved to the _c0 alias in ORDER BY when used in conjunction with DISTINCT?
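For reference, the alias-based workaround above is not Spark-specific; the same pattern can be sketched with SQLite as a stand-in engine (this is purely an illustration of the query shape, with string concatenation written as || instead of concat()):

```python
# Minimal sketch of the ORDER BY-on-alias workaround, using SQLite
# as a stand-in SQL engine (NOT Spark; concat() becomes || here).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE t (column_1 TEXT, column_2 TEXT);
    INSERT INTO t VALUES ('b', '2'), ('a', '1'), ('b', '2');
""")

# Alias the concatenated expression, then ORDER BY the alias,
# mirroring the working Spark query above.
rows = conn.execute("""
    SELECT DISTINCT (column_1 || ' : ' || column_2) AS column1
    FROM t
    ORDER BY column1
""").fetchall()

print(rows)  # [('a : 1',), ('b : 2',)]
```

Ordering by the alias sidesteps the problem in the original query: after DISTINCT, the projected output column (the alias, or _c0 when unnamed) is what remains in scope, so sorting by the raw input columns can no longer be resolved.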
> Using DISTINCT on a UDF (like CONCAT) is not supported
> ------------------------------------------------------
>
> Key: SPARK-14666
> URL: https://issues.apache.org/jira/browse/SPARK-14666
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.1
> Reporter: Dominic Ricard
> Priority: Minor
>
> The following query fails with:
> {noformat}
> Java::JavaSql::SQLException: org.apache.spark.sql.AnalysisException: cannot resolve 'column_1' given input columns: [_c0]; line # pos ##
> {noformat}
> Query:
> {noformat}
> select
> distinct concat(column_1, ' : ', column_2)
> from
> table
> order by
> concat(column_1, ' : ', column_2);
> {noformat}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)