Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2017/05/12 08:45:04 UTC
[jira] [Resolved] (SPARK-20639) Add single argument support for to_timestamp in SQL
[ https://issues.apache.org/jira/browse/SPARK-20639?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-20639.
---------------------------------
Resolution: Fixed
Fix Version/s: 2.3.0
Issue resolved by pull request 17901
[https://github.com/apache/spark/pull/17901]
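With this change, the single-argument form should presumably work in SQL too, defaulting to the pattern shown in the description below (a hedged sketch of the expected 2.3.0 behavior, not verified output):
{code}
spark-sql> SELECT to_timestamp('2016-12-31 00:12:00.00');
2016-12-31 00:12:00
{code}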
> Add single argument support for to_timestamp in SQL
> ---------------------------------------------------
>
> Key: SPARK-20639
> URL: https://issues.apache.org/jira/browse/SPARK-20639
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.2.0
> Reporter: Hyukjin Kwon
> Priority: Minor
> Fix For: 2.3.0
>
>
> Currently, it looks like we can omit the timestamp format in the Scala API, as below:
> {code}
> import org.apache.spark.sql.functions._
> import spark.implicits._  // for Seq(...).toDF; pre-imported in spark-shell
> Seq("2016-12-31 00:12:00.00").toDF("a").select(to_timestamp(col("a"))).show()
> {code}
> {code}
> +----------------------------------------+
> |to_timestamp(`a`, 'yyyy-MM-dd HH:mm:ss')|
> +----------------------------------------+
> | 2016-12-31 00:12:00|
> +----------------------------------------+
> {code}
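> The header in the output above shows the default pattern the single-argument form falls back to: 'yyyy-MM-dd HH:mm:ss'. Note the trailing ".00" still parses; assuming Spark 2.x parses via {{java.text.SimpleDateFormat}} under the hood, that is because {{parse}} only consumes the matching prefix and ignores trailing text. A minimal standalone sketch (hypothetical object name, not Spark code):
> {code}
> import java.text.SimpleDateFormat
> import java.util.Locale
>
> object DefaultFormatSketch extends App {
>   val fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss", Locale.US)
>   // parse() matches the "2016-12-31 00:12:00" prefix and ignores ".00"
>   val parsed = fmt.parse("2016-12-31 00:12:00.00")
>   println(parsed)  // fractional part dropped, matching the output above
> }
> {code}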
> whereas the single-argument form does not work in SQL, as below:
> {code}
> spark-sql> SELECT to_timestamp('2016-12-31 00:12:00.00');
> Error in query: Invalid number of arguments for function to_timestamp; line 1 pos 7
> {code}
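> The error says "invalid number of arguments" rather than "undefined function", so the two-argument form appears to be registered in SQL already; spelling out the default pattern should work as an interim workaround (a hedged sketch, output not verified):
> {code}
> spark-sql> SELECT to_timestamp('2016-12-31 00:12:00.00', 'yyyy-MM-dd HH:mm:ss');
> 2016-12-31 00:12:00
> {code}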
> It looks like we could support this too. {{to_date}} already works this way in SQL as well as in the other language APIs.
> {code}
> scala> Seq("2016-12-31").toDF("a").select(to_date(col("a"))).show()
> +----------+
> |to_date(a)|
> +----------+
> |2016-12-31|
> +----------+
> {code}
> {code}
> spark-sql> SELECT to_date('2016-12-31');
> 2016-12-31
> {code}