Posted to user@spark.apache.org by Mick Davies <Mi...@gmail.com> on 2015/07/01 19:25:09 UTC

Custom order by in Spark SQL

Hi, 

Is there a way to specify a custom order by (Ordering) on a column in Spark
SQL?

In particular, I would like the order by on a currency column to be not
alphabetical, but a custom sequence such as USD, EUR, JPY, GBP, etc.

I saw an earlier post on UDTs and ordering (which I can't seem to find in
this archive,
http://mail-archives.us.apache.org/mod_mbox/spark-user/201503.mbox/%3CCAFGcCdWWCFCwVp7+BCaPQ=6UupmYjcBhQYJn9tXEu45HJg4iFg@mail.gmail.com%3E),
which is somewhat related to this question. 

Thanks
Mick



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Custom-order-by-in-Spark-SQL-tp23569.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Custom order by in Spark SQL

Posted by Michael Armbrust <mi...@databricks.com>.
The easiest way to do this today is to define a UDF that maps from string to
a number.
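A minimal sketch of that approach in PySpark (the table and column names below are illustrative, and the Spark 1.x SQLContext calls are shown commented out since they need a running Spark context):

```python
# Map each currency code to a custom sort rank; unknown codes sort last.
CURRENCY_ORDER = ["USD", "EUR", "JPY", "GBP"]

def currency_rank(code):
    """Return the position of a currency code in the custom sort order."""
    try:
        return CURRENCY_ORDER.index(code)
    except ValueError:
        # Unknown currencies land after all the known ones.
        return len(CURRENCY_ORDER)

# With a SQLContext (Spark 1.x API, current at the time of this thread):
# sqlContext.registerFunction("currency_rank", currency_rank)
# sqlContext.sql(
#     "SELECT * FROM trades ORDER BY currency_rank(currency), currency")
```

Ordering by the UDF first and the raw column second gives the custom sequence for known currencies and a stable alphabetical fallback for the rest.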
