Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/11/19 05:07:01 UTC

[jira] [Created] (SPARK-22554) Add a config to control if PySpark should use daemon or not

Hyukjin Kwon created SPARK-22554:
------------------------------------

             Summary: Add a config to control if PySpark should use daemon or not
                 Key: SPARK-22554
                 URL: https://issues.apache.org/jira/browse/SPARK-22554
             Project: Spark
          Issue Type: Improvement
          Components: PySpark
    Affects Versions: 2.3.0
            Reporter: Hyukjin Kwon
            Priority: Trivial


Actually, SparkR already has a flag for {{useDaemon}}:

https://github.com/apache/spark/blob/478fbc866fbfdb4439788583281863ecea14e8af/core/src/main/scala/org/apache/spark/api/r/RRunner.scala#L362

It'd be great if we had this flag too. It would make it easier to test Windows-specific issues.
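
A minimal sketch of what this could look like on the PySpark side, assuming a SparkConf flag (the name {{spark.python.use.daemon}} is illustrative, not an agreed-upon name) read where the Python worker is launched, mirroring the SparkR check linked above:

{code:scala}
// Sketch only: the config name "spark.python.use.daemon" and the
// surrounding context (e.g. inside PythonWorkerFactory) are assumptions,
// mirroring how RRunner.scala gates SparkR's daemon.
private val useDaemon = {
  val useDaemonEnabled =
    SparkEnv.get.conf.getBoolean("spark.python.use.daemon", true)

  // The daemon relies on fork(), which is unavailable on Windows, so
  // fall back to launching plain worker processes there regardless of
  // the flag.
  !System.getProperty("os.name").startsWith("Windows") && useDaemonEnabled
}
{code}

With a flag like this, the daemon-less code path could be exercised on any OS simply by setting it to false in tests.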

This is also partly for running Python coverage without extra code changes. I know a hacky way to run it:

https://github.com/apache/spark/pull/19630#issuecomment-345490662

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org