Posted to reviews@spark.apache.org by vanzin <gi...@git.apache.org> on 2015/04/01 00:30:27 UTC

[GitHub] spark pull request: [SPARK-1502][YARN]Add config option to not inc...

Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/5294#issuecomment-88271630
  
    So I was mostly interested in understanding what the use case was, since the bug report was a little short on details. Tom's explanation makes sense; the opposite case (a hadoopA built into the Spark assembly breaking when run against the cluster's hadoopB) already has workarounds, since Spark gives the user control over the app's classpath in several ways.
    
    Given that, the patch looks good; it should probably remain an undocumented option, though.
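
    For context, the existing classpath-control workarounds alluded to above can be sketched as a `spark-defaults.conf` fragment. The property names below are standard Spark options of that era; the paths are placeholders, and whether these settings fit a given deployment is an assumption of this sketch:

        # Prepend user jars to the driver / executor classpaths
        spark.driver.extraClassPath      /opt/app/lib/*
        spark.executor.extraClassPath    /opt/app/lib/*

        # Prefer user-provided classes over the ones shipped with Spark
        # (and, transitively, the cluster's Hadoop jars)
        spark.driver.userClassPathFirst    true
        spark.executor.userClassPathFirst  true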


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes to enable it, or if the feature is enabled but not working,
please contact infrastructure at infrastructure@apache.org or file a JIRA
ticket with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org