Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/07/24 02:45:28 UTC

[GitHub] [spark] HyukjinKwon commented on pull request #29212: [SPARK-32419][PYTHON][BUILD] Avoid using subshell for Conda env (de)activation in pip packaging test

HyukjinKwon commented on pull request #29212:
URL: https://github.com/apache/spark/pull/29212#issuecomment-663323801


   Okay, it seems to properly use the activated Conda env now.
   
   ```
   Collecting py4j==0.10.9
     Downloading py4j-0.10.9-py2.py3-none-any.whl (198 kB)
   Building wheels for collected packages: pyspark
     Building wheel for pyspark (setup.py): started
     Building wheel for pyspark (setup.py): finished with status 'done'
     Created wheel for pyspark: filename=pyspark-3.1.0.dev0-py2.py3-none-any.whl size=301489400 sha256=d5ddda7599a48b31099641a4bc71ace4838cfc4f615cb133c413b67bf0eb6304
     Stored in directory: /tmp/pip-ephem-wheel-cache-5pwgtgf2/wheels/42/ef/92/48cc19770d32e29bd58811e58bc68ee45511755f8cc987fcfa
   Successfully built pyspark
   ...
   
   Installing dist into virtual env
   Obtaining file:///home/runner/work/spark/spark/python
   Collecting py4j==0.10.9
     Downloading py4j-0.10.9-py2.py3-none-any.whl (198 kB)
   Installing collected packages: py4j, pyspark
     Running setup.py develop for pyspark
   Successfully installed py4j-0.10.9 pyspark
   ```
   
   @dongjoon-hyun, sorry I missed this part; I believe this is the last touch for the pip packaging test for now. Could you take a look please?
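
   [Editor's note: a minimal sketch of the shell-scoping issue the PR title refers to. The variable name `PYSPARK_ENV` and the `conda.sh` path in the comments are illustrative, not taken from the actual test script.]

   ```sh
   #!/bin/sh
   # Commands grouped in parentheses run in a subshell; environment
   # changes they make never reach the parent shell.
   (export PYSPARK_ENV=active)
   echo "${PYSPARK_ENV:-unset}"   # the export was lost in the subshell

   # Exporting in the current shell does persist:
   export PYSPARK_ENV=active
   echo "${PYSPARK_ENV:-unset}"

   # This is why Conda (de)activation must happen in the current shell,
   # e.g. roughly:
   #   . "$CONDA_HOME/etc/profile.d/conda.sh"   # hypothetical path
   #   conda activate my-env
   # so that later python/pip invocations see the activated env.
   ```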


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org
