Posted to user@spark.apache.org by Martin <bo...@gmx.de> on 2022/10/14 21:14:58 UTC

[Feature Request] make unix_micros() and unix_millis() available in PySpark (pyspark.sql.functions)

Hi everyone,

In *Spark SQL* there are several timestamp-related functions:

   - unix_micros(timestamp)
   Returns the number of microseconds since 1970-01-01 00:00:00 UTC.
   - unix_millis(timestamp)
   Returns the number of milliseconds since 1970-01-01 00:00:00 UTC.
   Truncates higher levels of precision.

See https://spark.apache.org/docs/latest/sql-ref-functions-builtin.html
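
As a quick illustration of what these return (a minimal sketch run from a
PySpark session; the session name `spark` and the literal timestamp are only
for illustration):

    # assuming an active SparkSession named `spark`
    spark.sql(
        "SELECT unix_millis(TIMESTAMP('1970-01-01 00:00:01Z')) AS ms, "
        "unix_micros(TIMESTAMP('1970-01-01 00:00:01Z')) AS us"
    ).show()
    # one second after the epoch -> ms = 1000, us = 1000000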

Currently these are *"missing" from pyspark.sql.functions*.
https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/functions.html#datetime-functions

I'd appreciate it if these were also available in PySpark.

Cheers,
Martin

Re: [Feature Request] make unix_micros() and unix_millis() available in PySpark (pyspark.sql.functions)

Posted by Hyukjin Kwon <gu...@gmail.com>.
You can work around it by leveraging expr, e.g., expr("unix_micros(col)"),
for now.
It would be better to have the Scala binding first before we add the Python
one, FWIW.
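
A minimal sketch of that workaround, assuming a DataFrame with a timestamp
column named "ts" (the column name and sample data below are made up for
illustration):

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    # toy DataFrame with a single timestamp column "ts" (illustrative only)
    df = spark.createDataFrame([("2022-10-14 21:14:58",)], ["ts_str"])
    df = df.withColumn("ts", F.to_timestamp("ts_str"))

    # expr() passes the SQL functions straight through until native
    # pyspark.sql.functions wrappers are available
    df.select(
        F.expr("unix_micros(ts)").alias("micros"),
        F.expr("unix_millis(ts)").alias("millis"),
    ).show()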
