Posted to user@spark.apache.org by anbu <an...@gmail.com> on 2019/03/14 03:59:47 UTC
Spark Scala Date Usage
Hi sir,
Could you please help me implement the scenario below in Spark Scala?
How do I convert a string date column to a date type so that I can check
whether agg_start_date is less than data_date? I want to take only the
2019-01-09 data for my aggregations.
data_date: 2019-01-10 (the data_date currently being processed)
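(Side note for readers: with "yyyy-MM-dd" strings the comparison can be checked outside Spark first. A minimal plain-Scala sketch using java.time; the variable names here are illustrative, not from the original code:)

```scala
import java.time.LocalDate

object DateCompareDemo {
  def main(args: Array[String]): Unit = {
    // The data_date currently being processed
    val dataDate = LocalDate.parse("2019-01-10")

    // agg_start_date values present in the data
    val aggStartDates = Seq("2019-01-09", "2019-01-10").map(LocalDate.parse)

    // Keep only dates strictly before data_date (the "previous" data)
    val previous = aggStartDates.filter(_.isBefore(dataDate))
    println(previous) // List(2019-01-09)
  }
}
```

Note also that ISO-8601 date strings sort in date order, so `"2019-01-09" < "2019-01-10"` already holds lexicographically.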
def process(spark: SparkSession, retail: Dataset[DataProcess],
            dataQa: Dataset[RetailModel], data_date: String): DataFrame = {
  // agg_start_date is a String field in the case class. I want to compare
  // the current date with the previous date to collect all of the previous
  // date's data. agg_start_date holds both "2019-01-10" and "2019-01-09"
  // rows, and I want to take the previous date's data for my aggregations.
  // My two attempts so far:
  retail.withColumn("calc_previous", retail("agg_start_date").lt(lit(data_date)))
  retail.withColumn("calc_previous",
    retail("agg_start_date").cast("date").lt(lit(data_date).cast("date")))
}
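(One way to answer the question above: parse the string column with to_date and filter on the result. This is a sketch only; the function name previousDateRows is hypothetical, and it assumes agg_start_date is a "yyyy-MM-dd" string column:)

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.{col, lit, to_date}

// Keep only the rows whose agg_start_date falls strictly before the
// data_date currently being processed.
def previousDateRows(retail: DataFrame, dataDate: String): DataFrame =
  retail.filter(to_date(col("agg_start_date")) < to_date(lit(dataDate)))

// Usage (assuming a DataFrame retailDf with an agg_start_date column):
// val previous = previousDateRows(retailDf, "2019-01-10")
// previous then contains only the 2019-01-09 and earlier rows.
```

Casting with .cast("date") would also work; the original attempt fails mainly because "DateType" is not a valid cast string (use "date") and because lit("data_date") compares against the literal text rather than the data_date variable.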
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/