Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2023/02/28 23:34:00 UTC
[jira] [Assigned] (SPARK-41868) Support data type Duration(NANOSECOND)
[ https://issues.apache.org/jira/browse/SPARK-41868?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-41868:
------------------------------------
Assignee: (was: Apache Spark)
> Support data type Duration(NANOSECOND)
> --------------------------------------
>
> Key: SPARK-41868
> URL: https://issues.apache.org/jira/browse/SPARK-41868
> Project: Spark
> Issue Type: Sub-task
> Components: Connect
> Affects Versions: 3.4.0
> Reporter: Sandeep Singh
> Priority: Major
>
> {code:java}
> import pandas as pd
> from datetime import timedelta
> df = self.spark.createDataFrame(pd.DataFrame({"a": [timedelta(microseconds=123)]})) {code}
> {code:java}
> Traceback (most recent call last):
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/tests/test_dataframe.py", line 1291, in test_create_dataframe_from_pandas_with_day_time_interval
> self.assertEqual(df.toPandas().a.iloc[0], timedelta(microseconds=123))
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 1031, in toPandas
> return self._session.client.to_pandas(query)
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 413, in to_pandas
> return self._execute_and_fetch(req)
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 573, in _execute_and_fetch
> self._handle_error(rpc_error)
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 623, in _handle_error
> raise SparkConnectException(status.message, info.reason) from None
> pyspark.sql.connect.client.SparkConnectException: (org.apache.spark.SparkUnsupportedOperationException) Unsupported data type: Duration(NANOSECOND){code}
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org