Posted to commits@spark.apache.org by gu...@apache.org on 2018/07/06 10:29:02 UTC
spark git commit: [SPARK-24673][SQL][PYTHON][FOLLOWUP] Support Column arguments in timezone of from_utc_timestamp/to_utc_timestamp
Repository: spark
Updated Branches:
refs/heads/master 141953f4c -> a381bce72
[SPARK-24673][SQL][PYTHON][FOLLOWUP] Support Column arguments in timezone of from_utc_timestamp/to_utc_timestamp
## What changes were proposed in this pull request?
This PR supports Column arguments for the timezone parameter of `from_utc_timestamp`/`to_utc_timestamp` (follow-up of #21693).
## How was this patch tested?
Added tests.
Author: Takeshi Yamamuro <ya...@apache.org>
Closes #21723 from maropu/SPARK-24673-FOLLOWUP.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/a381bce7
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/a381bce7
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/a381bce7
Branch: refs/heads/master
Commit: a381bce7285ec30f58f28f523dfcfe0c13221bbf
Parents: 141953f
Author: Takeshi Yamamuro <ya...@apache.org>
Authored: Fri Jul 6 18:28:54 2018 +0800
Committer: hyukjinkwon <gu...@apache.org>
Committed: Fri Jul 6 18:28:54 2018 +0800
----------------------------------------------------------------------
python/pyspark/sql/functions.py | 26 +++++++++++++++++++++++---
1 file changed, 23 insertions(+), 3 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/spark/blob/a381bce7/python/pyspark/sql/functions.py
----------------------------------------------------------------------
diff --git a/python/pyspark/sql/functions.py b/python/pyspark/sql/functions.py
index 4d37197..55e7d57 100644
--- a/python/pyspark/sql/functions.py
+++ b/python/pyspark/sql/functions.py
@@ -1285,11 +1285,21 @@ def from_utc_timestamp(timestamp, tz):
that time as a timestamp in the given time zone. For example, 'GMT+1' would yield
'2017-07-14 03:40:00.0'.
- >>> df = spark.createDataFrame([('1997-02-28 10:30:00',)], ['t'])
- >>> df.select(from_utc_timestamp(df.t, "PST").alias('local_time')).collect()
+ :param timestamp: the column that contains timestamps
+ :param tz: a string that has the ID of timezone, e.g. "GMT", "America/Los_Angeles", etc
+
+ .. versionchanged:: 2.4
+ `tz` can take a :class:`Column` containing timezone ID strings.
+
+ >>> df = spark.createDataFrame([('1997-02-28 10:30:00', 'JST')], ['ts', 'tz'])
+ >>> df.select(from_utc_timestamp(df.ts, "PST").alias('local_time')).collect()
[Row(local_time=datetime.datetime(1997, 2, 28, 2, 30))]
+ >>> df.select(from_utc_timestamp(df.ts, df.tz).alias('local_time')).collect()
+ [Row(local_time=datetime.datetime(1997, 2, 28, 19, 30))]
"""
sc = SparkContext._active_spark_context
+ if isinstance(tz, Column):
+ tz = _to_java_column(tz)
return Column(sc._jvm.functions.from_utc_timestamp(_to_java_column(timestamp), tz))
@@ -1300,11 +1310,21 @@ def to_utc_timestamp(timestamp, tz):
zone, and renders that time as a timestamp in UTC. For example, 'GMT+1' would yield
'2017-07-14 01:40:00.0'.
- >>> df = spark.createDataFrame([('1997-02-28 10:30:00',)], ['ts'])
+ :param timestamp: the column that contains timestamps
+ :param tz: a string that has the ID of timezone, e.g. "GMT", "America/Los_Angeles", etc
+
+ .. versionchanged:: 2.4
+ `tz` can take a :class:`Column` containing timezone ID strings.
+
+ >>> df = spark.createDataFrame([('1997-02-28 10:30:00', 'JST')], ['ts', 'tz'])
>>> df.select(to_utc_timestamp(df.ts, "PST").alias('utc_time')).collect()
[Row(utc_time=datetime.datetime(1997, 2, 28, 18, 30))]
+ >>> df.select(to_utc_timestamp(df.ts, df.tz).alias('utc_time')).collect()
+ [Row(utc_time=datetime.datetime(1997, 2, 28, 1, 30))]
"""
sc = SparkContext._active_spark_context
+ if isinstance(tz, Column):
+ tz = _to_java_column(tz)
return Column(sc._jvm.functions.to_utc_timestamp(_to_java_column(timestamp), tz))
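The added branch in both functions is a simple type-based dispatch: a plain timezone-ID string is passed through to the JVM unchanged, while a Column is first converted with `_to_java_column`. A minimal standalone sketch of that pattern, with `FakeColumn` and `to_java_column` as hypothetical stand-ins for pyspark's `Column` and `_to_java_column` (no Spark installation assumed):

```python
# Sketch of the dispatch pattern added by this patch. FakeColumn and
# to_java_column are illustrative stand-ins, not real pyspark APIs.

class FakeColumn:
    """Stand-in for pyspark.sql.Column."""
    def __init__(self, name):
        self.name = name

def to_java_column(col):
    """Stand-in for pyspark.sql.column._to_java_column."""
    return "jcol:" + col.name

def resolve_tz(tz):
    # Mirrors the `if isinstance(tz, Column)` check in the diff:
    # only Column arguments need conversion before the JVM call;
    # strings pass through unchanged.
    if isinstance(tz, FakeColumn):
        tz = to_java_column(tz)
    return tz

print(resolve_tz("PST"))             # string passes through: PST
print(resolve_tz(FakeColumn("tz")))  # Column is converted: jcol:tz
```

Because the conversion is conditional, existing callers that pass a timezone string keep their behavior, while per-row timezone Columns become possible without a new function signature.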