Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/01/04 12:39:00 UTC

[jira] [Updated] (SPARK-22959) Configuration to select the modules for daemon and worker in PySpark

     [ https://issues.apache.org/jira/browse/SPARK-22959?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-22959:
---------------------------------
    Description: 
We are now forced to use {{pyspark/daemon.py}} and {{pyspark/worker.py}} in PySpark tests.

This doesn't allow custom modifications, and it's sometimes hard to debug what happens in Python worker processes.
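For example (a minimal sketch, not part of this proposal's wording), a custom worker module could simply wrap the stock one and delegate to it; the module name and the configuration property mentioned afterwards are illustrative only and would be decided by the actual patch:

{code:python}
# custom_worker.py - an illustrative pluggable worker module.
# It only adds a hook around the stock worker loop and then delegates to it.
import pyspark.worker


def main(infile, outfile):
    # Custom instrumentation (logging, tracing, debugging aids, ...) goes here.
    pyspark.worker.main(infile, outfile)
{code}

Selecting it could then look roughly like {{--conf spark.python.worker.module=custom_worker}} (again, a hypothetical property name).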

This is actually related to SPARK-7721 as well, since the coverage tool is somehow unable to collect coverage across {{os.fork}}. With some custom fixes that force coverage collection in the worker, it works fine.
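Concretely, a custom daemon module could re-point the worker entry that the daemon imports so that coverage is started and saved inside each forked worker. A rough sketch, assuming coverage.py with the standard {{COVERAGE_PROCESS_START}} environment variable (the file name below is made up):

{code:python}
# coverage_daemon.py - illustrative daemon module forcing coverage collection
# inside the forked Python worker processes.
import os

import pyspark.daemon as daemon
import pyspark.worker as worker


def _cov_wrapped(*args, **kwargs):
    # Start a fresh coverage session in the forked worker, run the stock
    # worker loop, then write the coverage data out before the process exits.
    import coverage
    cov = coverage.Coverage(config_file=os.environ["COVERAGE_PROCESS_START"])
    cov.start()
    try:
        worker.main(*args, **kwargs)
    finally:
        cov.stop()
        cov.save()


if __name__ == "__main__":
    # Replace the worker entry point that pyspark/daemon.py imported as
    # worker_main, then run the normal daemon loop.
    daemon.worker_main = _cov_wrapped
    daemon.manager()
{code}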

This is also related to SPARK-20368, which describes Sentry support that (roughly) needs some changes on the worker side. With this simple workaround, advanced users would be able to plug in many such customizations.
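If the Sentry support mentioned above refers to the error-reporting service, the same mechanism would cover it too; the sketch below is purely illustrative (module name, SDK choice and DSN handling are assumptions rather than anything taken from SPARK-20368):

{code:python}
# sentry_worker.py - illustrative only.
import os

import sentry_sdk
import pyspark.worker


def main(infile, outfile):
    # Initialize the Sentry client in this worker process, then delegate
    # to the stock worker loop.
    sentry_sdk.init(dsn=os.environ.get("SENTRY_DSN"))
    pyspark.worker.main(infile, outfile)
{code}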

  was:
We are now forced to use {{pyspark/daemon.py}} and {{pyspark/worker.py}} in PySpark tests.

This doesn't allow custom modifications, and it's sometimes hard to debug what happens in Python worker processes.

This is actually related to SPARK-7721 as well, since the coverage tool is somehow unable to collect coverage across {{os.fork}}.

This is also related to SPARK-20368. With this simple workaround, advanced users would be able to plug in many such customizations.


> Configuration to select the modules for daemon and worker in PySpark
> --------------------------------------------------------------------
>
>                 Key: SPARK-22959
>                 URL: https://issues.apache.org/jira/browse/SPARK-22959
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.3.0
>            Reporter: Hyukjin Kwon
>
> We are now forced to use {{pyspark/daemon.py}} and {{pyspark/worker.py}} in PySpark tests.
> This doesn't allow custom modifications, and it's sometimes hard to debug what happens in Python worker processes.
> This is actually related to SPARK-7721 as well, since the coverage tool is somehow unable to collect coverage across {{os.fork}}. With some custom fixes that force coverage collection in the worker, it works fine.
> This is also related to SPARK-20368, which describes Sentry support that (roughly) needs some changes on the worker side. With this simple workaround, advanced users would be able to plug in many such customizations.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org