Posted to reviews@spark.apache.org by "zhengruifeng (via GitHub)" <gi...@apache.org> on 2023/09/18 03:07:44 UTC

[GitHub] [spark] zhengruifeng commented on a diff in pull request #42956: [SPARK-43654][CONNECT][PS][TESTS] Enable `InternalFrameParityTests.test_from_pandas`

zhengruifeng commented on code in PR #42956:
URL: https://github.com/apache/spark/pull/42956#discussion_r1328212481


##########
python/pyspark/pandas/tests/connect/test_parity_internal.py:
##########
@@ -15,18 +15,86 @@
 # limitations under the License.
 #
 import unittest
+import pandas as pd
 
 from pyspark.pandas.tests.test_internal import InternalFrameTestsMixin
 from pyspark.testing.connectutils import ReusedConnectTestCase
 from pyspark.testing.pandasutils import PandasOnSparkTestUtils
+from pyspark.pandas.internal import (
+    InternalFrame,
+    SPARK_DEFAULT_INDEX_NAME,
+    SPARK_INDEX_NAME_FORMAT,
+)
+from pyspark.pandas.utils import spark_column_equals
 
 
 class InternalFrameParityTests(
     InternalFrameTestsMixin, PandasOnSparkTestUtils, ReusedConnectTestCase
 ):
-    @unittest.skip("TODO(SPARK-43654): Enable InternalFrameParityTests.test_from_pandas.")
     def test_from_pandas(self):
-        super().test_from_pandas()
+        pdf = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})

Review Comment:
   Instead of comparing the columns, what about simplifying the test by comparing the columns' string representations?
   
   Otherwise, you may have to add `__eq__` for Column and all the Expressions in Connect, and I am not sure whether that works in the 'resolved column' vs. 'unresolved column' case.
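   For illustration only, a minimal sketch of what the string-representation comparison could look like (assumptions: a running Spark session, and the `InternalFrame` / `SPARK_DEFAULT_INDEX_NAME` imports shown in the diff above; this assertion is hypothetical and not the PR's actual test code):
   
       import pandas as pd
       from pyspark.pandas.internal import InternalFrame, SPARK_DEFAULT_INDEX_NAME
   
       pdf = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
       internal = InternalFrame.from_pandas(pdf)
       sdf = internal.spark_frame
   
       # Compare the string form of the columns rather than the Column objects,
       # so Connect does not need __eq__ on Column or the expression classes.
       assert str(internal.index_spark_columns[0]) == str(sdf[SPARK_DEFAULT_INDEX_NAME])
   
   The idea being that a plain string comparison behaves the same on classic Spark and Spark Connect, sidestepping the resolved vs. unresolved column question.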



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

