Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/11/03 15:33:00 UTC

[jira] [Commented] (SPARK-22417) createDataFrame from a pandas.DataFrame reads datetime64 values as longs

    [ https://issues.apache.org/jira/browse/SPARK-22417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16237772#comment-16237772 ] 

Apache Spark commented on SPARK-22417:
--------------------------------------

User 'BryanCutler' has created a pull request for this issue:
https://github.com/apache/spark/pull/19646

> createDataFrame from a pandas.DataFrame reads datetime64 values as longs
> ------------------------------------------------------------------------
>
>                 Key: SPARK-22417
>                 URL: https://issues.apache.org/jira/browse/SPARK-22417
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.2.0
>            Reporter: Bryan Cutler
>            Priority: Normal
>
> When trying to create a Spark DataFrame from an existing Pandas DataFrame using {{createDataFrame}}, columns with datetime64 values are converted to long values.  This happens only when the schema is not specified.
> {code}
> In [2]: import pandas as pd
>    ...: from datetime import datetime
>    ...: 
> In [3]: pdf = pd.DataFrame({"ts": [datetime(2017, 10, 31, 1, 1, 1)]})
> In [4]: df = spark.createDataFrame(pdf)
> In [5]: df.show()
> +-------------------+
> |                 ts|
> +-------------------+
> |1509411661000000000|
> +-------------------+
> In [6]: df.schema
> Out[6]: StructType(List(StructField(ts,LongType,true)))
> {code}
> Spark should interpret a datetime64\[D\] value as DateType and other datetime64 values as TimestampType.
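>
> Until the inference is fixed, a workaround is to pass an explicit schema; as noted above, the bad conversion occurs only when the schema is not specified. A minimal sketch, assuming the Spark 2.2 PySpark API:
> {code}
> from pyspark.sql.types import StructType, StructField, TimestampType
>
> # With an explicit schema, the ts column is read as TimestampType
> # instead of being inferred as LongType.
> schema = StructType([StructField("ts", TimestampType(), True)])
> df = spark.createDataFrame(pdf, schema=schema)
> df.show()  # shows 2017-10-31 01:01:01 rather than a long value
> {code}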


