Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/09/16 06:53:31 UTC

[GitHub] [spark] Yikun commented on pull request #37888: [SPARK-40196][PYTHON][PS] Consolidate `lit` function with NumPy scalar in sql and pandas module

Yikun commented on PR #37888:
URL: https://github.com/apache/spark/pull/37888#issuecomment-1248987419

   ```
     test_repeat (pyspark.pandas.tests.test_spark_functions.SparkFunctionsTests) ... FAIL (0.052s)
   
   ======================================================================
   FAIL [0.052s]: test_repeat (pyspark.pandas.tests.test_spark_functions.SparkFunctionsTests)
   ----------------------------------------------------------------------
   Traceback (most recent call last):
     File "/__w/spark/spark/python/pyspark/pandas/tests/test_spark_functions.py", line 28, in test_repeat
       self.assertTrue(spark_column_equals(SF.repeat(F.lit(1), 2), F.repeat(F.lit(1), 2)))
   AssertionError: False is not true
   
   ----------------------------------------------------------------------
   Ran 1 test in 8.471s
   ```
   
   Still failing due to `test_repeat`. Initial investigation:
   ```
   F.repeat(F.lit(1), 2).__dict__
   Out[3]: {'_jc': JavaObject id=o50}
   SF.repeat(F.lit(1), 2).__dict__
   Out[4]: {'_jc': JavaObject id=o54}
   ```
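   The `__dict__` dump above shows why the assertion fails: the two calls produce separate `Column` wrappers around distinct JVM handles (`o50` vs `o54`), so any comparison based on handle identity comes back `False` even when both columns represent the same expression. A minimal pure-Python sketch of that pitfall (the `JavaObject`/`Column` stand-ins here are hypothetical, not the real py4j/PySpark classes):
   
   ```python
   class JavaObject:
       """Stand-in for a py4j JavaObject handle (hypothetical)."""
       def __init__(self, target_id):
           self._target_id = target_id
   
   class Column:
       """Stand-in for PySpark's Column wrapper (hypothetical)."""
       def __init__(self, jc):
           self._jc = jc
   
   # Two calls building the "same" expression yield distinct JVM handles,
   # mirroring the o50 / o54 ids seen in the __dict__ dump above.
   c1 = Column(JavaObject("o50"))
   c2 = Column(JavaObject("o54"))
   
   # Identity-based comparison of the handles is False:
   print(c1._jc is c2._jc)  # False
   
   # A structural check would need to compare the underlying expression
   # (e.g. its string form), not the wrapper or handle identity.
   ```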
   
   According to https://github.com/apache/spark/pull/37888#discussion_r971408190 , it looks like we need to remove this test?
   
   ```python
   class SparkFunctionsTests(PandasOnSparkTestCase):
       def test_repeat(self):
           # TODO: Placeholder
           pass
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org
