Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/10/30 14:18:34 UTC

[GitHub] [spark] itholic commented on a diff in pull request #38437: [SPARK-40966][PS][TEST] Fix read_parquet test in pandas 1.5.1

itholic commented on code in PR #38437:
URL: https://github.com/apache/spark/pull/38437#discussion_r1008872504


##########
python/pyspark/pandas/tests/test_dataframe_spark_io.py:
##########
@@ -99,7 +99,7 @@ def test_parquet_read_with_pandas_metadata(self):
             expected3 = expected2.set_index("index", append=True)
             # There is a bug in `to_parquet` from pandas 1.5.0 when writing MultiIndex.
             # See https://github.com/pandas-dev/pandas/issues/48848 for the reported issue.
-            if LooseVersion(pd.__version__) == LooseVersion("1.5.0"):
+            if LooseVersion(pd.__version__) in (LooseVersion("1.5.0"), LooseVersion("1.5.1")):

Review Comment:
   I'm not sure it will be fixed soon, but it seems they are trying to fix it ASAP, so I just want to check each affected version explicitly for now.
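   A minimal sketch of the gating trade-off being discussed, assuming LooseVersion is distutils.version.LooseVersion as used elsewhere in the pyspark.pandas tests (the range-based variant below is hypothetical and not part of this PR):

       from distutils.version import LooseVersion
       import pandas as pd

       # Explicit per-version check, as in the diff above: only the pandas
       # releases known to carry the to_parquet MultiIndex bug
       # (pandas-dev/pandas#48848) take the workaround branch.
       AFFECTED = (LooseVersion("1.5.0"), LooseVersion("1.5.1"))
       if LooseVersion(pd.__version__) in AFFECTED:
           pass  # apply the MultiIndex workaround

       # Hypothetical open-ended alternative: shorter, but it would keep
       # applying the workaround to every future pandas release, even after
       # upstream ships a fix, which is why the explicit check is preferred here.
       if LooseVersion(pd.__version__) >= LooseVersion("1.5.0"):
           pass  # same workaround, applied too broadly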



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org