Posted to commits@spark.apache.org by gu...@apache.org on 2018/01/22 13:12:56 UTC

spark git commit: [SPARK-7721][PYTHON][TESTS] Adds PySpark coverage generation script

Repository: spark
Updated Branches:
  refs/heads/master 5d680cae4 -> 87ffe7add


[SPARK-7721][PYTHON][TESTS] Adds PySpark coverage generation script

## What changes were proposed in this pull request?

Note that this PR is based on top of https://github.com/apache/spark/pull/20151, so it leaves the main code almost intact.

This PR proposes to add a script in preparation for automatic PySpark coverage generation. Currently, it is difficult to check the actual coverage in the case of PySpark. This script lets us run the tests the same way we previously did via the `run-tests` script; its usage is exactly the same as `run-tests`, which it basically wraps.

This script should also be useful on its own. I have been asked how to do this before, and it seems some reviewers (including me) need it; it is also handy to run manually.

In a normal Python project this usually requires only a small diff, but PySpark is a bit different because we are apparently unable to track coverage after the worker process is forked. So, here, I made a custom worker that forces coverage on, built on top of https://github.com/apache/spark/pull/20151.
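
Conceptually, the custom worker wraps the regular worker's entry point with a coverage collector. A simplified sketch of the idea (the actual implementation is `coverage_daemon.py` in the diff below):

```
# Simplified sketch of the idea implemented in coverage_daemon.py below:
# start a coverage collector around each forked worker's main loop and
# save the data on exit.
import os
import coverage

def worker_main(*args, **kwargs):
    """Stand-in for the original pyspark.worker entry point."""

def _cov_wrapped(*args, **kwargs):
    cov = coverage.coverage(config_file=os.environ["COVERAGE_PROCESS_START"])
    cov.start()
    try:
        worker_main(*args, **kwargs)
    finally:
        cov.stop()
        cov.save()
```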

I made a simple demo. Please take a look - https://spark-test.github.io/pyspark-coverage-site.

To show the structure, this PR adds the files below:

```
python
├── .coveragerc  # Runtime configuration used when we run the script.
├── run-tests-with-coverage  # The script that adds coverage support and wraps the run-tests script.
└── test_coverage  # Directory containing the files required when running coverage.
    ├── conf
    │   └── spark-defaults.conf  # Contains the 'spark.python.daemon.module' configuration.
    ├── coverage_daemon.py  # A daemon with a custom fix, wrapping our daemon.py.
    └── sitecustomize.py  # Initiates coverage via COVERAGE_PROCESS_START.
```

Note that this PR has a minor nit:

[This scope](https://github.com/apache/spark/blob/04e44b37cc04f62fbf9e08c7076349e0a4d12ea8/python/pyspark/daemon.py#L148-L169) in `daemon.py` is not included in the coverage results, because I produce the coverage results in `worker.py` separately and then merge them. I believe it's not a big deal.

As a follow-up, I might set up a site hosting a single up-to-date PySpark coverage report from the master branch as the fallback/default, or a site hosting multiple PySpark coverage reports with a link left on each pull request.

## How was this patch tested?

Manually tested. Usage is the same as the existing Python test script, `./python/run-tests`. For example:

```
sh run-tests-with-coverage --python-executables=python3 --modules=pyspark-sql
```

Running this will generate HTML files under `./python/test_coverage/htmlcov`.
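
To browse the generated report afterwards, one convenient way (hypothetical, not part of this patch) is:

```
# Hypothetical convenience for opening the generated report in a browser;
# not part of this patch. Assumes it is run from the repository root.
import os
import webbrowser

webbrowser.open("file://" + os.path.abspath("python/test_coverage/htmlcov/index.html"))
```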

Console output example:

```
sh run-tests-with-coverage --python-executables=python3,python --modules=pyspark-core
Running PySpark tests. Output is in /.../spark/python/unit-tests.log
Will test against the following Python executables: ['python3', 'python']
Will test the following Python modules: ['pyspark-core']
Starting test(python): pyspark.tests
Starting test(python3): pyspark.tests
...
Tests passed in 231 seconds
Combining collected coverage data under /.../spark/python/test_coverage/coverage_data
Reporting the coverage data at /.../spark/python/test_coverage/coverage_data/coverage
Name                         Stmts   Miss Branch BrPart  Cover
--------------------------------------------------------------
pyspark/__init__.py             41      0      8      2    96%
...
pyspark/profiler.py             74     11     22      5    83%
pyspark/rdd.py                 871     40    303     32    93%
pyspark/rddsampler.py           68     10     32      2    82%
...
--------------------------------------------------------------
TOTAL                         8521   3077   2748    191    59%
Generating HTML files for PySpark coverage under /.../spark/python/test_coverage/htmlcov
```

Author: hyukjinkwon <gu...@gmail.com>

Closes #20204 from HyukjinKwon/python-coverage.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/87ffe7ad
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/87ffe7ad
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/87ffe7ad

Branch: refs/heads/master
Commit: 87ffe7adddf517541aac0d1e8536b02ad8881606
Parents: 5d680ca
Author: hyukjinkwon <gu...@gmail.com>
Authored: Mon Jan 22 22:12:50 2018 +0900
Committer: hyukjinkwon <gu...@gmail.com>
Committed: Mon Jan 22 22:12:50 2018 +0900

----------------------------------------------------------------------
 .gitignore                                    |  2 +
 python/.coveragerc                            | 21 +++++++
 python/run-tests-with-coverage                | 69 ++++++++++++++++++++++
 python/run-tests.py                           |  5 +-
 python/test_coverage/conf/spark-defaults.conf | 21 +++++++
 python/test_coverage/coverage_daemon.py       | 45 ++++++++++++++
 python/test_coverage/sitecustomize.py         | 23 ++++++++
 7 files changed, 185 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/87ffe7ad/.gitignore
----------------------------------------------------------------------
diff --git a/.gitignore b/.gitignore
index 903297d..3908590 100644
--- a/.gitignore
+++ b/.gitignore
@@ -62,6 +62,8 @@ project/plugins/src_managed/
 project/plugins/target/
 python/lib/pyspark.zip
 python/deps
+python/test_coverage/coverage_data
+python/test_coverage/htmlcov
 python/pyspark/python
 reports/
 scalastyle-on-compile.generated.xml

http://git-wip-us.apache.org/repos/asf/spark/blob/87ffe7ad/python/.coveragerc
----------------------------------------------------------------------
diff --git a/python/.coveragerc b/python/.coveragerc
new file mode 100644
index 0000000..b3339cd
--- /dev/null
+++ b/python/.coveragerc
@@ -0,0 +1,21 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+[run]
+branch = true
+parallel = true
+data_file = ${COVERAGE_DIR}/coverage_data/coverage
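
With `parallel = true`, each process writes its own data file (suffixed with the machine name, process id, and a random number), and the files are merged afterwards; this is why the `run-tests-with-coverage` script below runs `coverage combine` before reporting. The merge can also be done programmatically (a hedged sketch, assuming coverage.py 4.x+ and that `COVERAGE_DIR` is exported as the wrapper script does):

```
# Hedged sketch: programmatic equivalent of `coverage combine` followed by
# `coverage report`, assuming COVERAGE_DIR is set as run-tests-with-coverage does.
import coverage

cov = coverage.Coverage(config_file="python/.coveragerc")
cov.combine()                    # merge the per-process data files
cov.save()
cov.report(include="pyspark/*")  # print the summary table
```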

http://git-wip-us.apache.org/repos/asf/spark/blob/87ffe7ad/python/run-tests-with-coverage
----------------------------------------------------------------------
diff --git a/python/run-tests-with-coverage b/python/run-tests-with-coverage
new file mode 100755
index 0000000..6d74b56
--- /dev/null
+++ b/python/run-tests-with-coverage
@@ -0,0 +1,69 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+set -o pipefail
+set -e
+
+# This variable indicates which coverage executable to run when combining coverage
+# data and generating HTML, for example, 'coverage3' for Python 3.
+COV_EXEC="${COV_EXEC:-coverage}"
+FWDIR="$(cd "`dirname $0`"; pwd)"
+pushd "$FWDIR" > /dev/null
+
+# Ensure that the coverage executable is installed.
+if ! hash $COV_EXEC 2>/dev/null; then
+  echo "Missing coverage executable in your path, skipping PySpark coverage"
+  exit 1
+fi
+
+# Set up the directories for coverage results.
+export COVERAGE_DIR="$FWDIR/test_coverage"
+rm -fr "$COVERAGE_DIR/coverage_data"
+rm -fr "$COVERAGE_DIR/htmlcov"
+mkdir -p "$COVERAGE_DIR/coverage_data"
+
+# The current directory is added to the Python path so that it does not refer to the
+# built PySpark zip library first.
+export PYTHONPATH="$FWDIR:$PYTHONPATH"
+# Also, our sitecustomize.py and coverage_daemon.py are included in the path.
+export PYTHONPATH="$COVERAGE_DIR:$PYTHONPATH"
+
+# We use the 'spark.python.daemon.module' configuration to insert the coverage-enabled workers.
+export SPARK_CONF_DIR="$COVERAGE_DIR/conf"
+
+# This environment variable enables coverage.
+export COVERAGE_PROCESS_START="$FWDIR/.coveragerc"
+
+# If you'd like to run a specific unittest class, you can run, for example:
+# SPARK_TESTING=1 ../bin/pyspark pyspark.sql.tests VectorizedUDFTests
+./run-tests "$@"
+
+# Don't run coverage for the coverage command itself
+unset COVERAGE_PROCESS_START
+
+# Coverage can generate empty coverage data files. Remove them to avoid warnings when combining.
+find $COVERAGE_DIR/coverage_data -size 0 -print0 | xargs -0 rm
+echo "Combining collected coverage data under $COVERAGE_DIR/coverage_data"
+$COV_EXEC combine
+echo "Reporting the coverage data at $COVERAGE_DIR/coverage_data/coverage"
+$COV_EXEC report --include "pyspark/*"
+echo "Generating HTML files for PySpark coverage under $COVERAGE_DIR/htmlcov"
+$COV_EXEC html --ignore-errors --include "pyspark/*" --directory "$COVERAGE_DIR/htmlcov"
+
+popd

http://git-wip-us.apache.org/repos/asf/spark/blob/87ffe7ad/python/run-tests.py
----------------------------------------------------------------------
diff --git a/python/run-tests.py b/python/run-tests.py
index 1341086..f03284c 100755
--- a/python/run-tests.py
+++ b/python/run-tests.py
@@ -38,7 +38,7 @@ sys.path.append(os.path.join(os.path.dirname(os.path.realpath(__file__)), "../de
 
 
 from sparktestsupport import SPARK_HOME  # noqa (suppress pep8 warnings)
-from sparktestsupport.shellutils import which, subprocess_check_output  # noqa
+from sparktestsupport.shellutils import which, subprocess_check_output, run_cmd  # noqa
 from sparktestsupport.modules import all_modules  # noqa
 
 
@@ -175,6 +175,9 @@ def main():
 
     task_queue = Queue.PriorityQueue()
     for python_exec in python_execs:
+        if "COVERAGE_PROCESS_START" in os.environ:
+            # Make sure coverage is installed.
+            run_cmd([python_exec, "-c", "import coverage"])
         python_implementation = subprocess_check_output(
             [python_exec, "-c", "import platform; print(platform.python_implementation())"],
             universal_newlines=True).strip()

http://git-wip-us.apache.org/repos/asf/spark/blob/87ffe7ad/python/test_coverage/conf/spark-defaults.conf
----------------------------------------------------------------------
diff --git a/python/test_coverage/conf/spark-defaults.conf b/python/test_coverage/conf/spark-defaults.conf
new file mode 100644
index 0000000..bf44ea6
--- /dev/null
+++ b/python/test_coverage/conf/spark-defaults.conf
@@ -0,0 +1,21 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+# This is used to generate PySpark coverage results. There seems to be no other
+# way to add a configuration when the SPARK_TESTING environment variable is set,
+# because we directly execute modules via 'python -m'.
+spark.python.daemon.module coverage_daemon
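
Since the daemon module is executed directly via `python -m`, pointing `spark.python.daemon.module` at `coverage_daemon` (importable thanks to the `PYTHONPATH` set by the wrapper script) swaps in the instrumented daemon. Roughly, the launch amounts to the following (a hedged sketch of the mechanism, not Spark's actual code):

```
# Hedged sketch of the mechanism: Spark starts the Python daemon by executing
# the configured module with `python -m`, so any importable module can stand in
# for pyspark.daemon. Names here are illustrative only.
import subprocess

daemon_module = "coverage_daemon"  # the value of spark.python.daemon.module
subprocess.Popen(["python", "-m", daemon_module])
```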

http://git-wip-us.apache.org/repos/asf/spark/blob/87ffe7ad/python/test_coverage/coverage_daemon.py
----------------------------------------------------------------------
diff --git a/python/test_coverage/coverage_daemon.py b/python/test_coverage/coverage_daemon.py
new file mode 100644
index 0000000..c87366a
--- /dev/null
+++ b/python/test_coverage/coverage_daemon.py
@@ -0,0 +1,45 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import os
+import imp
+
+
+# This is a hack to always refer to the main code rather than the built zip.
+main_code_dir = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
+daemon = imp.load_source("daemon", "%s/pyspark/daemon.py" % main_code_dir)
+
+if "COVERAGE_PROCESS_START" in os.environ:
+    worker = imp.load_source("worker", "%s/pyspark/worker.py" % main_code_dir)
+
+    def _cov_wrapped(*args, **kwargs):
+        import coverage
+        cov = coverage.coverage(
+            config_file=os.environ["COVERAGE_PROCESS_START"])
+        cov.start()
+        try:
+            worker.main(*args, **kwargs)
+        finally:
+            cov.stop()
+            cov.save()
+    daemon.worker_main = _cov_wrapped
+else:
+    raise RuntimeError("COVERAGE_PROCESS_START environment variable is not set, exiting.")
+
+
+if __name__ == '__main__':
+    daemon.manager()

http://git-wip-us.apache.org/repos/asf/spark/blob/87ffe7ad/python/test_coverage/sitecustomize.py
----------------------------------------------------------------------
diff --git a/python/test_coverage/sitecustomize.py b/python/test_coverage/sitecustomize.py
new file mode 100644
index 0000000..630237a
--- /dev/null
+++ b/python/test_coverage/sitecustomize.py
@@ -0,0 +1,23 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+# Note that this 'sitecustomize' module is a built-in feature of Python.
+# If this module is importable, it is executed when the Python session begins.
+# `coverage.process_startup()` checks whether the COVERAGE_PROCESS_START
+# environment variable is set; if so, it starts collecting coverage.
+import coverage
+coverage.process_startup()


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org