Posted to dev@spark.apache.org by Patrick Wendell <pw...@gmail.com> on 2015/03/09 20:53:00 UTC

Cross-cutting internal changes to launch scripts

Hey All,

Marcelo Vanzin has been working for a few months on a patch that
performs cross-cutting clean-up and fixes to the way Spark's launch
scripts work (including PySpark, spark-submit, the daemon scripts,
etc.). The changes won't modify any public APIs: the way those
scripts are invoked stays the same.

Historically, such patches have been difficult to test due to the
number of interactions between components and interactions with
external environments. I'd like to invite people to test and/or
code-review this patch in their own environments. The patch is in the
very late stages of review and will likely be merged into master soon
(and eventually into 1.4).

https://github.com/apache/spark/pull/3916/files
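For anyone who wants to try the patch locally, a minimal sketch of fetching the PR branch follows. It assumes a clone of apache/spark whose "origin" remote points at the GitHub repository; the local branch name "pr-3916-review" and the example entry points are illustrative choices, not anything prescribed in this thread.

```shell
# Build the refspec for fetching GitHub PR #3916 into a local branch.
# (The branch name "pr-3916-review" is just an illustrative choice.)
PR=3916
REFSPEC="pull/${PR}/head:pr-${PR}-review"

# Print the commands to run from a clone of apache/spark:
echo "git fetch origin ${REFSPEC}"
echo "git checkout pr-${PR}-review"

# After checking out the branch, exercise the affected entry points
# in your own environment, for example:
#   ./bin/spark-submit --version
#   ./bin/pyspark
#   ./sbin/start-master.sh && ./sbin/stop-master.sh
```

The `pull/<N>/head` ref is the standard read-only ref GitHub exposes for every open pull request, so no fork remote is needed to test it.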

I'll ping this thread again once it is merged, and we can establish a
JIRA to track any issues. Just wanted to give a heads-up, as this is
one of the larger internal changes we've made to this infrastructure
since Spark 1.0.

- Patrick

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org