Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/12/28 15:10:03 UTC

[GitHub] [spark] zero323 commented on a change in pull request #35032: [SPARK-37738][PYTHON] Fix API skew in PySpark date functions

zero323 commented on a change in pull request #35032:
URL: https://github.com/apache/spark/pull/35032#discussion_r775952162



##########
File path: python/pyspark/sql/tests/test_functions.py
##########
@@ -286,6 +289,42 @@ def test_dayofweek(self):
         row = df.select(dayofweek(df.date)).first()
         self.assertEqual(row[0], 2)
 
+    # Test added for SPARK-37738; change Python API to accept both col & int as input
+    def test_date_add_function(self):
+        dt = datetime.date(2021, 12, 27)
+        df = self.spark.createDataFrame([Row(date=dt, add=2)])
+        # default number in Python gets converted to LongType column
+        df = df.withColumn("add", col("add").cast("integer"))
+
+        row_via_col_addition = df.select(date_add(df.date, df.add)).first()
+        self.assertEqual(row_via_col_addition[0], datetime.date(2021, 12, 29))
+        row_via_scalar_addition = df.select(date_add(df.date, 3)).first()
+        self.assertEqual(row_via_scalar_addition[0], datetime.date(2021, 12, 30))
+
+    # Test added for SPARK-37738; change Python API to accept both col & int as input
+    def test_date_sub_function(self):
+        dt = datetime.date(2021, 12, 27)
+        df = self.spark.createDataFrame([Row(date=dt, add=2)])
+        # a bare Python int is inferred as a LongType column by default
+        df = df.withColumn("add", col("add").cast("integer"))

Review comment:
       This seems to be obsolete.
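
      For reference, the behavior exercised by the tests above can be sketched with a minimal pure-Python stand-in (the `Column` class and `lit` helper here are toy illustrations, not the actual PySpark internals): a plain int for the `days` argument is normalized into a literal column before being applied, so both `date_add(df.date, df.add)` and `date_add(df.date, 3)` work.

      ```python
      import datetime
      from typing import Union

      class Column:
          """Toy stand-in for pyspark.sql.Column wrapping a literal value."""
          def __init__(self, value: int):
              self.value = value

      def lit(value: int) -> Column:
          # Stand-in for pyspark.sql.functions.lit
          return Column(value)

      def date_add(start: datetime.date, days: Union[Column, int]) -> datetime.date:
          # The pattern under review: wrap a plain int into a literal column,
          # then proceed as if a Column had been passed all along.
          if isinstance(days, int):
              days = lit(days)
          return start + datetime.timedelta(days=days.value)

      # Mirrors the expected values in the tests above
      assert date_add(datetime.date(2021, 12, 27), 3) == datetime.date(2021, 12, 30)
      assert date_add(datetime.date(2021, 12, 27), lit(2)) == datetime.date(2021, 12, 29)
      ```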




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org