Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/11/23 08:09:00 UTC

[jira] [Commented] (SPARK-7721) Generate test coverage report from Python

    [ https://issues.apache.org/jira/browse/SPARK-7721?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16263944#comment-16263944 ] 

Hyukjin Kwon commented on SPARK-7721:
-------------------------------------

[~joshrosen], ahh, it seems I happened to duplicate the effort here before ...

So, it seems the Jenkins <> Codecov integration is declined for now? Probably one easy workaround is just to use GitHub Pages - https://pages.github.com/. What we would need is probably just to push the coverage report into a repo if the tests pass, which will automatically update its page.
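To illustrate the idea (not something we run today), here is a rough Python sketch, assuming a hypothetical spark-test Pages repo for the report and assuming the coverage data has already been written by the test run; the repo name and paths are made up for illustration:

{code:python}
# Rough sketch: render the HTML coverage report and push it to a GitHub Pages
# repo so the published page updates automatically after a passing test run.
# The repo URL below is hypothetical.
import shutil
import subprocess

def publish_coverage(pages_repo="git@github.com:spark-test/pyspark-coverage-site.git"):
    # render the HTML report from the .coverage data produced by the tests
    subprocess.run(["coverage", "html", "-d", "htmlcov"], check=True)

    # clone the Pages repo, copy the report in, commit and push;
    # GitHub Pages then republishes the site on its own
    subprocess.run(["git", "clone", "--depth", "1", pages_repo, "site"], check=True)
    shutil.copytree("htmlcov", "site", dirs_exist_ok=True)
    subprocess.run(["git", "-C", "site", "add", "-A"], check=True)
    subprocess.run(["git", "-C", "site", "commit", "-m", "Update PySpark coverage report"], check=True)
    subprocess.run(["git", "-C", "site", "push"], check=True)

if __name__ == "__main__":
    publish_coverage()
{code}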

I did this before to demonstrate SQL function docs:

https://spark-test.github.io/sparksqldoc/
https://github.com/spark-test/sparksqldoc 

FWIW, I recently added a {{spark.python.use.daemon}} config, like SparkR, to disable os.fork, and this (of course) makes it possible to track the worker processes, although of course we should not disable it in Jenkins tests as it's extremely slow. It was good enough for small tests to verify a PR or changes, though.
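To make that concrete, a minimal sketch (not the exact setup I used; the job is just for illustration) of running a tiny PySpark job with that config turned off so the Python workers are plain child processes:

{code:python}
# Minimal sketch: disable the daemon/os.fork path via spark.python.use.daemon
# so that subprocess-aware coverage tooling can also follow the Python workers.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[2]")
    # launch workers directly instead of forking them from a daemon process
    .config("spark.python.use.daemon", "false")
    .getOrCreate()
)

# anything that exercises Python workers works here
print(spark.range(100).rdd.map(lambda x: x.id * 2).sum())

spark.stop()
{code}

Note that actually measuring those worker processes with coverage.py still needs its subprocess support (e.g. the {{COVERAGE_PROCESS_START}} hook); the snippet above only covers the Spark side of the setup.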

> Generate test coverage report from Python
> -----------------------------------------
>
>                 Key: SPARK-7721
>                 URL: https://issues.apache.org/jira/browse/SPARK-7721
>             Project: Spark
>          Issue Type: Test
>          Components: PySpark, Tests
>            Reporter: Reynold Xin
>
> Would be great to have a test coverage report for Python. Compared with Scala, it is trickier to understand the coverage in Python without coverage reports because we employ both docstring tests and unit tests in test files. 



