Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2015/10/26 00:27:27 UTC
[jira] [Commented] (SPARK-11305) Remove Third-Party Hadoop Distributions Doc Page
[ https://issues.apache.org/jira/browse/SPARK-11305?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14973493#comment-14973493 ]
Patrick Wendell commented on SPARK-11305:
-----------------------------------------
/cc [~srowen] for his thoughts.
> Remove Third-Party Hadoop Distributions Doc Page
> ------------------------------------------------
>
> Key: SPARK-11305
> URL: https://issues.apache.org/jira/browse/SPARK-11305
> Project: Spark
> Issue Type: Improvement
> Components: Documentation
> Reporter: Patrick Wendell
> Priority: Critical
>
> There is a fairly old page in our docs containing assorted information about running Spark on Hadoop clusters. I think this page should be removed and its content merged into other parts of the docs, because the information is largely redundant and somewhat outdated.
> http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html
> There are four sections:
> 1. Compile-time Hadoop version - I think this information can be removed in favor of the "building Spark" page. These days most advanced users are building without bundling Hadoop, so I'm not sure listing a bunch of different Hadoop versions sends the right message.
> 2. Linking against Hadoop - this doesn't add much beyond what is already in the programming guide.
> 3. Where to run Spark - redundant with the hardware provisioning guide.
> 4. Inheriting cluster configurations - this would be better as a section at the end of the configuration page.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org