Posted to user@spark.apache.org by Mich Talebzadeh <mi...@gmail.com> on 2016/09/13 11:28:22 UTC

Any viable DATEDIFF function in Spark/Scala

Hi,

This is a tricky bit.

I use the following to get the current date and time:

scala> val date = java.time.LocalDate.now.toString
date: String = 2016-09-13
scala> val hour = java.time.LocalTime.now.toString
hour: String = 11:49:13.577

I store a column called TIMECREATED as a String in HDFS. For now these values
look like this:

scala> df2.filter('security > " " && 'price > "10" && 'TIMECREATED > current_date()).
     |   select('TIMECREATED, current_date(), datediff(current_date(), 'TIMECREATED).as("datediff")).
     |   show(2)
+-------------------+--------------+--------+
|        TIMECREATED|current_date()|datediff|
+-------------------+--------------+--------+
|2016-09-13 08:49:31|    2016-09-13|       0|
|2016-09-13 08:49:54|    2016-09-13|       0|
+-------------------+--------------+--------+
This shows the rows created today.

datediff, however, only works at whole-day granularity. Now I want to find
all the rows that were created in the past 15 minutes.

In other words, something similar to this:

DATEDIFF( date-part, date-expression1, date-expression2 )

Is there any available implementation?
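
Something along these lines is what I have in mind (an untested sketch,
assuming TIMECREATED is stored in the "yyyy-MM-dd HH:mm:ss" format shown
above): convert both sides to epoch seconds with unix_timestamp, so the
comparison can work in minutes rather than whole days.

scala> import org.apache.spark.sql.functions._
scala> // unix_timestamp() gives the current time in epoch seconds;
scala> // unix_timestamp('TIMECREATED, "yyyy-MM-dd HH:mm:ss") parses the column the
scala> // same way (format is an assumption), so the difference is each row's age
scala> // in seconds. Keep rows younger than 15 minutes (900 seconds).
scala> df2.filter(unix_timestamp() - unix_timestamp('TIMECREATED, "yyyy-MM-dd HH:mm:ss") <= 15 * 60).
     |   select('TIMECREATED).
     |   show(2)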


Thanks

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.