Posted to issues@spark.apache.org by "Michael Styles (JIRA)" <ji...@apache.org> on 2016/08/12 11:16:20 UTC
[jira] [Created] (SPARK-17035) Conversion of datetime.max to microseconds produces incorrect value
Michael Styles created SPARK-17035:
--------------------------------------
Summary: Conversion of datetime.max to microseconds produces incorrect value
Key: SPARK-17035
URL: https://issues.apache.org/jira/browse/SPARK-17035
Project: Spark
Issue Type: Bug
Components: PySpark
Affects Versions: 2.0.0
Reporter: Michael Styles
Converting datetime.max to microseconds produces an incorrect value. For example:
from datetime import datetime
from pyspark.sql.types import StructType, StructField, TimestampType

schema = StructType([StructField("dt", TimestampType(), False)])
data = [{"dt": datetime.max}]

# Convert Python objects to Spark's internal SQL representation.
sql_data = [schema.toInternal(row) for row in data]

# The value is wrong: it comes back as a float and the microsecond
# component has been rounded away.
print(sql_data)  # [(2.534023188e+17,)]
This value should be [(253402318799999999,)].
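The loss comes from doing the conversion in floating point: a Python float is an IEEE-754 double with 53 bits of mantissa, so it cannot represent 253402318799999999 exactly. A minimal sketch of the rounding, using the seconds value implied by the report (which reflects the reporter's local-time offset):

```python
# datetime.max expressed as whole seconds since the epoch (value implied by
# the report) plus its 999999-microsecond component.
seconds = 253402318799

# Float arithmetic, as in the buggy conversion: the result exceeds 2**53,
# so the microseconds are rounded away.
float_us = seconds * 1e6 + 999999
print(float_us)  # 2.534023188e+17

# Integer arithmetic preserves every digit.
int_us = seconds * 1000000 + 999999
print(int_us)    # 253402318799999999
```

Presumably the fix is to keep `TimestampType.toInternal` in integer arithmetic, i.e. something along the lines of `int(seconds) * 1000000 + dt.microsecond` rather than scaling the seconds by 1e6 as a float.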
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)