Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/05 11:14:34 UTC

[GitHub] soxofaan opened a new pull request #23736: bin/pyspark: eliminate use of hardcoded 'python' command and fix IPython version check

URL: https://github.com/apache/spark/pull/23736
 
 
   I was trying out pyspark on a system with only a `python3` command but no `python` command and got this error:
   ```
   /opt/spark/bin/pyspark: line 45: python: command not found
   ```
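   The failure is easy to reproduce without Spark. A minimal simulation, assuming `/bin/bash` exists (an empty `PATH` stands in for a python3-only system):
   ```shell
   # Simulate a system with no `python` on PATH: the hardcoded lookup
   # fails the same way as line 45 of bin/pyspark.
   env PATH=/nonexistent /bin/bash -c 'python -c "print(1)"' 2>&1 || true
   ```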
   
   While the `pyspark` script defines several variables for selecting a Python interpreter, it still uses a hardcoded `python` command for
   ```
   WORKS_WITH_IPYTHON=$(python -c 'import sys; print(sys.version_info >= (2, 7, 0))')
   ```
   
   While looking into this, I also noticed that the bash syntax for the IPython version check is wrong: `[[ ! $WORKS_WITH_IPYTHON ]]` tests whether the variable is empty, so it always evaluates to false here, because the command substitution above yields the non-empty string "True" or "False" either way.
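   A minimal illustration of the pitfall, outside of the pyspark script:
   ```shell
   # [[ ! $VAR ]] checks string emptiness, not truthiness: "False" is a
   # non-empty string, so the negated test is false for it too.
   WORKS_WITH_IPYTHON="False"
   if [[ ! $WORKS_WITH_IPYTHON ]]; then
     echo "check triggered"
   else
     echo "check skipped"   # this branch runs even though the value is "False"
   fi
   ```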
   
   ## What changes were proposed in this pull request?
   
   - replace the hardcoded `python` with `$PYSPARK_DRIVER_PYTHON` so that the IPython version check targets the correct interpreter
   - avoid calculating `WORKS_WITH_IPYTHON` when it is not used
   - fix the bash syntax to actually test whether `WORKS_WITH_IPYTHON` is "True" or "False"
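   Taken together, the corrected logic could look roughly like this (a hypothetical sketch, not the exact patch; the default interpreter name and error message are placeholders):
   ```shell
   # Probe with the configured driver interpreter, and only when IPython is used.
   PYSPARK_DRIVER_PYTHON="${PYSPARK_DRIVER_PYTHON:-python3}"
   if [[ $PYSPARK_DRIVER_PYTHON == *ipython* ]]; then
     WORKS_WITH_IPYTHON=$("$PYSPARK_DRIVER_PYTHON" -c 'import sys; print(sys.version_info >= (2, 7, 0))')
     if [[ $WORKS_WITH_IPYTHON != "True" ]]; then
       echo "IPython requires Python 2.7+" >&2
       exit 1
     fi
   fi
   echo "interpreter check passed"
   ```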
   
   ## How was this patch tested?
   
   I manually tested the `pyspark` launch script and the bash version-check logic.
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services
