Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/06/10 09:18:54 UTC

[GitHub] [spark] HyukjinKwon commented on a change in pull request #28593: [SPARK-31710][SQL] Fail casting numeric to timestamp by default

HyukjinKwon commented on a change in pull request #28593:
URL: https://github.com/apache/spark/pull/28593#discussion_r437982378



##########
File path: python/pyspark/sql/functions.py
##########
@@ -1427,6 +1427,19 @@ def to_utc_timestamp(timestamp, tz):
     return Column(sc._jvm.functions.to_utc_timestamp(_to_java_column(timestamp), tz))
 
 
+@since(3.1)
+def timestamp_seconds(col):

Review comment:
       There are two ways:
   - Using `expr(...)`. I think we usually recommend this way when the SQL function doesn't exist in `functions.scala`.
   - We can simply add it to [this dictionary](https://github.com/apache/spark/blob/c7f2a9b323c5354c5dab1354c9a9bda19274dcdc/python/pyspark/sql/functions.py#L130-L136), but this way is currently discouraged. It was discussed on the mailing list before: we should convert that dictionary into explicit function definitions for better static analysis and IDE support. A sketch of both options is below.
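
   For reference, here is a minimal sketch of what the explicit definition could look like, following the same wrapper pattern as `to_utc_timestamp` above. It assumes the Scala side of this PR already exposes `functions.timestamp_seconds`; the docstring wording is illustrative:

   ```python
   # These are already imported at the top of python/pyspark/sql/functions.py.
   from pyspark import since, SparkContext
   from pyspark.sql.column import Column, _to_java_column


   @since(3.1)
   def timestamp_seconds(col):
       """
       Converts the number of seconds from the Unix epoch
       (1970-01-01 00:00:00 UTC) to a timestamp column.
       """
       # Same pattern as the other wrappers in this module: delegate to
       # the JVM-side function and wrap the result in a Python Column.
       sc = SparkContext._active_spark_context
       return Column(sc._jvm.functions.timestamp_seconds(_to_java_column(col)))
   ```

   The `expr(...)` route needs no Python change at all, as long as the SQL function is registered (this sketch assumes an active `spark` session and a column named `unix_time`):

   ```python
   from pyspark.sql.functions import expr

   df = spark.createDataFrame([(1230219000,)], ["unix_time"])
   df.select(expr("timestamp_seconds(unix_time)").alias("ts")).show()
   ```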




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org