Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/27 16:24:01 UTC

[GitHub] viirya commented on a change in pull request #23904: [SPARK-27000][PYTHON] Upgrades cloudpickle to v0.8.0

URL: https://github.com/apache/spark/pull/23904#discussion_r260829270
 
 

 ##########
 File path: python/pyspark/tests/test_rdd.py
 ##########
 @@ -726,6 +729,13 @@ def stopit(*x):
         self.assertRaisesRegexp((Py4JJavaError, RuntimeError), msg,
                                 seq_rdd.aggregate, 0, lambda *x: 1, stopit)
 
+    def test_overwritten_global_func(self):
+        # Regression test for SPARK-27000
+        global global_func
+        self.assertEqual(self.sc.parallelize([1]).map(lambda _: global_func()).first(), "Hi")
+        global_func = lambda: "Yeah"
+        self.assertEqual(self.sc.parallelize([1]).map(lambda _: global_func()).first(), "Yeah")
 
 Review comment:
   I ran a test locally:
   
   ```python
   >>> global_func = lambda: "Hi"
   >>> sc.parallelize([1]).map(lambda _: global_func()).first()
   'Hi'
   >>> global_func = lambda: "Yeah"
   >>> sc.parallelize([1]).map(lambda _: global_func()).first()
   'Hi'
   >>> sc.parallelize([1]).map(lambda _: global_func()).first()
   'Yeah'
   >>> sc.parallelize([1]).map(lambda _: global_func()).first()
   'Hi'
   ```
   
   It seems to output `Hi` or `Yeah` at random. Is this caused by the cloudpickle issue?
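   For contrast, a minimal sketch (using only the stdlib `pickle`, not cloudpickle) of why a plain pickle round-trip does *not* show this staleness: stdlib `pickle` serializes a module-level function by reference (module and name only), so the name is re-resolved at load time and an overwritten global is always picked up. cloudpickle instead serializes function objects by value, which is where caching of a stale definition can surface. The function names below are illustrative only:

   ```python
   import pickle

   def global_func():
       return "Hi"

   # stdlib pickle records only the reference "__main__.global_func",
   # not the function body
   payload = pickle.dumps(global_func)

   def global_func():  # overwrite the global with a new definition
       return "Yeah"

   # the name is looked up again at load time, so the new definition wins
   restored = pickle.loads(payload)
   print(restored())  # prints "Yeah"
   ```

   Because Spark workers deserialize closures shipped by value via cloudpickle, they do not get this by-reference behavior, which is why the overwritten-global case needs an explicit regression test.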
   
   
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org