Posted to user@spark.apache.org by Divya Gehlot <di...@gmail.com> on 2016/07/21 03:50:52 UTC
getting null when calculating time diff with unix_timestamp + spark 1.6
Hi,
val lags=sqlContext.sql("select *,(unix_timestamp(time1,'$timeFmt') -
lag(unix_timestamp(time2,'$timeFmt'))) as time_diff from df_table");
Instead of the time difference in seconds I am getting null.
Would really appreciate the help.
Thanks,
Divya
Re: getting null when calculating time diff with unix_timestamp + spark 1.6
Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,
It appears that lag didn't work properly, right? lag is a window
function, and I remember that in Scala you'd need to define a
WindowSpec for it. I don't see one (or an OVER clause) in your SQL
query.
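A minimal sketch of how the query might look with a window
specification added (column and table names taken from your mail; the
ORDER BY column and the time format are assumptions, so adjust them to
your data). Note two things: lag needs an OVER clause, and $timeFmt is
only substituted if the query string uses Scala's s-interpolator,
otherwise unix_timestamp is handed the literal text '$timeFmt' and
returns null. Also, as far as I remember, window functions in Spark 1.6
SQL require a HiveContext.

```scala
// Assumed format string; use whatever matches your actual data.
val timeFmt = "yyyy-MM-dd HH:mm:ss"

// s-interpolated string so $timeFmt is actually substituted;
// OVER (ORDER BY ...) gives lag the window it needs.
val lags = sqlContext.sql(
  s"""SELECT *,
     |       unix_timestamp(time1, '$timeFmt') -
     |       lag(unix_timestamp(time2, '$timeFmt')) OVER (ORDER BY time1)
     |         AS time_diff
     |FROM df_table""".stripMargin)
```

A PARTITION BY clause inside OVER would usually be added too, so the
lag is computed per group rather than over the whole table.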
Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
On Thu, Jul 21, 2016 at 5:50 AM, Divya Gehlot <di...@gmail.com> wrote:
> Hi,
>
> val lags=sqlContext.sql("select *,(unix_timestamp(time1,'$timeFmt') -
> lag(unix_timestamp(time2,'$timeFmt'))) as time_diff from df_table");
>
> Instead of the time difference in seconds I am getting null.
>
> Would really appreciate the help.
>
>
> Thanks,
> Divya
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org