Posted to reviews@spark.apache.org by felixcheung <gi...@git.apache.org> on 2018/02/01 05:54:24 UTC
[GitHub] spark issue #18933: [WIP][SPARK-21722][SQL][PYTHON] Enable timezone-aware ti...
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/18933
Ping. I ran into this exact issue with pandas_udf on a simple data set with a timestamp type column.
As far as I can tell, there is no way around this, since the pandas code is running deep inside pyspark; is the only workaround to make the column a string?
@BryanCutler @ueshin @icexelloss @HyukjinKwon any thoughts on how to fix this?
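(The essence of the string workaround above can be sketched without Spark at all: carry the timezone-aware timestamp as an ISO-8601 string so no implicit timezone conversion happens between the JVM and pandas. This is an illustrative stdlib-only sketch, not the actual pyspark/pandas_udf code path; the offset chosen is arbitrary.)

```python
from datetime import datetime, timezone, timedelta

# Hypothetical illustration: a tz-aware timestamp round-trips losslessly
# through its ISO-8601 string form, which is why casting the column to a
# string sidesteps any implicit timezone handling in the middle layers.
tz = timezone(timedelta(hours=-8))               # arbitrary fixed offset
ts = datetime(2018, 2, 1, 5, 54, 24, tzinfo=tz)
s = ts.isoformat()                               # '2018-02-01T05:54:24-08:00'
restored = datetime.fromisoformat(s)             # Python 3.7+

assert restored == ts
assert restored.utcoffset() == timedelta(hours=-8)
```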
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org