Posted to reviews@spark.apache.org by HyukjinKwon <gi...@git.apache.org> on 2017/11/19 05:24:00 UTC

[GitHub] spark issue #19782: [SPARK-22554][PYTHON] Add a config to control if PySpark...

GitHub user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/19782
  
    This is also partly for running Python coverage without extra code changes. I know a hacky way to run it (see https://github.com/apache/spark/pull/19630#issuecomment-345490662 and https://github.com/apache/spark/pull/19630#issuecomment-345171997):
    
    Now we can run it, for example, as below:
    
    ```
    pip install coverage
    # Build Spark (http://spark.apache.org/docs/latest/building-spark.html)
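    # Remove the zipped PySpark so the tests import the plain sources under
    # python/pyspark, which coverage can map back to files when reporting.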
    rm python/lib/pyspark.zip
    rm -f .coverage .coverage.*
    rm -fr coverage_html
    
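    # Disable the PySpark worker daemon so each worker is launched directly
    # through PYSPARK_PYTHON (the wrapper created below) instead of being
    # forked from a daemon process that coverage cannot follow.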
    echo "spark.python.use.daemon false" >> conf/spark-defaults.conf
    
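    # Create a tiny wrapper so every Python process runs under coverage;
    # -p writes a separate .coverage.* data file per process.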
    echo "
    #!/usr/bin/env bash
    coverage run -p \$@
    " > coverage_python
    chmod 755 coverage_python
    
    # Run actual Python tests
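    # PYSPARK_PYTHON points Spark at the wrapper; SPARK_TESTING=1 makes
    # bin/pyspark run the named test module instead of starting a shell.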
    PATH=$(pwd):$PATH PYSPARK_PYTHON=coverage_python SPARK_TESTING=1 bin/pyspark pyspark.sql.tests VectorizedUDFTests
    
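    # Remove the config we appended; note this deletes the whole file, so
    # back it up first if you already had a spark-defaults.conf.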
    rm conf/spark-defaults.conf
    
    coverage combine
    coverage html -d coverage_html -i
    open coverage_html/index.html
    # "open" is macOS-specific; on other systems, open coverage_html/index.html in a browser.
    ```
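
    As a side note (not part of the original hack), measurement can be scoped to PySpark sources with a small `.coveragerc`; here is a minimal sketch, assuming the commands above are run from the Spark root directory:

    ```
    cat > .coveragerc <<'EOF'
    [run]
    # Merge the per-process .coverage.* files written by "coverage run -p".
    parallel = True
    # Measure only PySpark itself, not the test harness or third-party code.
    source = python/pyspark
    EOF
    ```

    `coverage combine` and `coverage html` pick .coveragerc up automatically from the working directory.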


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org