Posted to issues@spark.apache.org by "Chongguang LIU (Jira)" <ji...@apache.org> on 2020/12/13 13:12:00 UTC
[jira] [Created] (SPARK-33769) improve the next-day function of the sql component to deal with Column type
Chongguang LIU created SPARK-33769:
--------------------------------------
Summary: improve the next-day function of the sql component to deal with Column type
Key: SPARK-33769
URL: https://issues.apache.org/jira/browse/SPARK-33769
Project: Spark
Issue Type: Improvement
Components: SQL
Affects Versions: 3.0.0
Reporter: Chongguang LIU
Hello all,
I used the next_day function in the Spark SQL component and loved it: [https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/functions.scala#L3077]
Currently the signature of this function is: def next_day(date: Column, dayOfWeek: String): Column.
It accepts the dayOfWeek parameter only as a String. However, in my case the dayOfWeek comes from a Column, with a different value for each row of the dataframe, so I had to use the NextDay expression directly, like this: NextDay(dateCol.expr, dayOfWeekCol.expr).
My proposition is to add another signature for this function: def next_day(date: Column, dayOfWeek: Column): Column
In fact, this is already the case for some other functions in this Scala object, for example:
def date_sub(start: Column, days: Int): Column = date_sub(start, lit(days))
def date_sub(start: Column, days: Column): Column = withExpr { DateSub(start.expr, days.expr) }
or
def add_months(startDate: Column, numMonths: Int): Column = add_months(startDate, lit(numMonths))
def add_months(startDate: Column, numMonths: Column): Column = withExpr {
AddMonths(startDate.expr, numMonths.expr)
}
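The delegation pattern above can be illustrated with a minimal, self-contained toy sketch (these are NOT Spark's real Column/Expression classes, just stand-ins for illustration): the existing String signature simply wraps its argument with lit() and forwards to the new Column signature, exactly as date_sub and add_months pair their overloads.

```scala
// Toy model of the proposed overload pair (hypothetical stand-in types,
// not Spark's actual API).
case class Expression(sql: String)
case class Column(expr: Expression)

// Stand-in for org.apache.spark.sql.functions.lit: wrap a literal value.
def lit(value: String): Column = Column(Expression(s"'$value'"))

// Existing-style signature: dayOfWeek as a literal String,
// delegating to the Column variant.
def next_day(date: Column, dayOfWeek: String): Column =
  next_day(date, lit(dayOfWeek))

// Proposed signature: dayOfWeek as a Column, so it can vary per row.
def next_day(date: Column, dayOfWeek: Column): Column =
  Column(Expression(s"next_day(${date.expr.sql}, ${dayOfWeek.expr.sql})"))

val dateCol = Column(Expression("date_col"))
val dowCol  = Column(Expression("dow_col"))

// The String form produces the same tree as the Column form with a literal.
println(next_day(dateCol, "Monday").expr.sql) // next_day(date_col, 'Monday')
println(next_day(dateCol, dowCol).expr.sql)   // next_day(date_col, dow_col)
```

With the second overload, the per-row case from above becomes next_day(dateCol, dayOfWeekCol) instead of dropping down to the NextDay expression manually.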
I hope I have explained my idea clearly. Let me know what your opinions are. If you agree, I can submit a pull request with the necessary change.
Kind regards,
Chongguang
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org