Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/10/02 04:36:38 UTC

[GitHub] [spark] sunchao commented on pull request #29843: [WIP][SPARK-29250][test-maven][test-hadoop2.7] Upgrade to Hadoop 3.2.1 and move to shaded client

sunchao commented on pull request #29843:
URL: https://github.com/apache/spark/pull/29843#issuecomment-702520516


   Just one last test failure:
   
   "Exception: Python in worker has different version 3.6 than that in driver 3.8, PySpark cannot run with different minor versions. Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set."
   
   @HyukjinKwon do you happen to know the reason for this? Looking at the CI script, it seems it should install either Python 3.6 or 3.8, but not both.
   
   I'm also not sure which part of this PR could affect the YARN/Python tests. I tried a dummy change in `YarnClusterSuite` on my own Spark fork (just to trigger the `ExtendedYarnTest` tests), and they all passed there, so it seems the failure is indeed related to this PR.
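   As an aside, the usual workaround for the error quoted above is to point both environment variables at the same interpreter before launching the job. A minimal sketch (the interpreter path below is illustrative; use whichever Python 3.8 the CI environment actually provides):

   ```shell
   # Make the driver and the workers use the same Python interpreter,
   # so the major.minor versions match on both sides.
   export PYSPARK_PYTHON=/usr/bin/python3.8
   export PYSPARK_DRIVER_PYTHON=/usr/bin/python3.8
   ```

   PySpark only compares the major.minor version (e.g. 3.6 vs 3.8), so any two interpreters of the same minor version would also satisfy the check.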


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org