Posted to issues@spark.apache.org by "Eric Blanco (JIRA)" <ji...@apache.org> on 2018/06/13 10:40:00 UTC
[jira] [Updated] (SPARK-24545) Function hour not working as expected for hour 2 in PySpark
[ https://issues.apache.org/jira/browse/SPARK-24545?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Eric Blanco updated SPARK-24545:
--------------------------------
Description:
Hello,
I tried to extract the hour from a timestamp, and it works except when the hour is 2. It works correctly in Scala, but PySpark returns hour 3 instead of hour 2.
Example:
from pyspark.sql.functions import *
columns = ["id","date"]
vals = [(4,"2016-03-27 02:00:00")]
df = sqlContext.createDataFrame(vals, columns)
df.withColumn("hours", hour(col("date"))).show()
+---+------------------+-----+
| id|              date|hours|
+---+------------------+-----+
|  4|2016-03-27 2:00:00|    3|
+---+------------------+-----+
It works as expected for other hours.
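This looks like a daylight-saving-time gap: in CET/CEST zones the clocks jumped from 02:00 straight to 03:00 on 2016-03-27, so the local time 02:00:00 never existed and gets normalized forward. A minimal sketch of that normalization in plain Python (no Spark); the zone `Europe/Madrid` is an illustrative assumption, not something the report confirms:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumed zone for illustration: in Europe/Madrid the clocks jumped
# from 02:00 to 03:00 on 2016-03-27, so 02:00:00 never existed.
tz = ZoneInfo("Europe/Madrid")
nonexistent = datetime(2016, 3, 27, 2, 0, 0, tzinfo=tz)

# Round-tripping through UTC normalizes the nonexistent local time
# into the hour after the gap, which is where the 3 comes from.
normalized = nonexistent.astimezone(ZoneInfo("UTC")).astimezone(tz)
print(normalized.hour)  # prints 3
```

If the DST gap is indeed the cause, pinning the session to a gap-free zone should sidestep it; in Spark 2.3+ that can be done with the `spark.sql.session.timeZone` configuration (e.g. set to `UTC`).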
was:
Hello,
I tried to get the hour out of a date and it works except if the hour is 2.
Example:
from pyspark.sql.functions import *
columns = ["id","date"]
vals = [(4,"2016-03-27 02:00:00")]
df = sqlContext.createDataFrame(vals, columns)
df.withColumn("hours", hour(col("date"))).show()
+---+------------------+-----+
| id|              date|hours|
+---+------------------+-----+
|  4|2016-03-27 2:00:00|    3|
+---+------------------+-----+
It works as expected for other hours.
Summary: Function hour not working as expected for hour 2 in PySpark (was: Function hour not working as expected for hour 2)
> Function hour not working as expected for hour 2 in PySpark
> -----------------------------------------------------------
>
> Key: SPARK-24545
> URL: https://issues.apache.org/jira/browse/SPARK-24545
> Project: Spark
> Issue Type: Bug
> Components: Java API
> Affects Versions: 2.2.1
> Reporter: Eric Blanco
> Priority: Major
>
> Hello,
> I tried to extract the hour from a timestamp, and it works except when the hour is 2. It works correctly in Scala, but PySpark returns hour 3 instead of hour 2.
> Example:
> from pyspark.sql.functions import *
> columns = ["id","date"]
> vals = [(4,"2016-03-27 02:00:00")]
> df = sqlContext.createDataFrame(vals, columns)
> df.withColumn("hours", hour(col("date"))).show()
> +---+------------------+-----+
> | id|              date|hours|
> +---+------------------+-----+
> |  4|2016-03-27 2:00:00|    3|
> +---+------------------+-----+
>
> It works as expected for other hours.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org