Posted to commits@spark.apache.org by sr...@apache.org on 2016/11/23 11:17:15 UTC

[5/5] spark-website git commit: Port wiki Useful Developer Tools and Profiling Spark Apps to /developer-tools.html; port Spark Versioning Policy and main wiki to /versioning-policy.html; port Preparing Spark Releases to /release-process.html; rearrange m

Port wiki Useful Developer Tools and Profiling Spark Apps to /developer-tools.html; port Spark Versioning Policy and main wiki to /versioning-policy.html; port Preparing Spark Releases to /release-process.html; rearrange menu with new Developer menu


Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/cf21826b
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/cf21826b
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/cf21826b

Branch: refs/heads/asf-site
Commit: cf21826beab2b83cca0028c555e1008ae2f2ed93
Parents: 0744e8f
Author: Sean Owen <so...@cloudera.com>
Authored: Tue Nov 22 14:21:33 2016 +0000
Committer: Sean Owen <so...@cloudera.com>
Committed: Tue Nov 22 14:38:50 2016 +0000

----------------------------------------------------------------------
 _layouts/global.html                            |  18 +-
 developer-tools.md                              | 287 +++++++++++
 documentation.md                                |   7 -
 downloads.md                                    |   2 +-
 release-process.md                              | 263 ++++++++++
 site/committers.html                            |  18 +-
 site/community.html                             |  18 +-
 site/contributing.html                          |  18 +-
 site/developer-tools.html                       | 494 +++++++++++++++++++
 site/documentation.html                         |  25 +-
 site/downloads.html                             |  20 +-
 site/examples.html                              |  18 +-
 site/faq.html                                   |  18 +-
 site/graphx/index.html                          |  18 +-
 site/index.html                                 |  18 +-
 site/mailing-lists.html                         |  18 +-
 site/mllib/index.html                           |  18 +-
 site/news/amp-camp-2013-registration-ope.html   |  18 +-
 .../news/announcing-the-first-spark-summit.html |  18 +-
 .../news/fourth-spark-screencast-published.html |  18 +-
 site/news/index.html                            |  18 +-
 site/news/nsdi-paper.html                       |  18 +-
 site/news/one-month-to-spark-summit-2015.html   |  18 +-
 .../proposals-open-for-spark-summit-east.html   |  18 +-
 ...registration-open-for-spark-summit-east.html |  18 +-
 .../news/run-spark-and-shark-on-amazon-emr.html |  18 +-
 site/news/spark-0-6-1-and-0-5-2-released.html   |  18 +-
 site/news/spark-0-6-2-released.html             |  18 +-
 site/news/spark-0-7-0-released.html             |  18 +-
 site/news/spark-0-7-2-released.html             |  18 +-
 site/news/spark-0-7-3-released.html             |  18 +-
 site/news/spark-0-8-0-released.html             |  18 +-
 site/news/spark-0-8-1-released.html             |  18 +-
 site/news/spark-0-9-0-released.html             |  18 +-
 site/news/spark-0-9-1-released.html             |  18 +-
 site/news/spark-0-9-2-released.html             |  18 +-
 site/news/spark-1-0-0-released.html             |  18 +-
 site/news/spark-1-0-1-released.html             |  18 +-
 site/news/spark-1-0-2-released.html             |  18 +-
 site/news/spark-1-1-0-released.html             |  18 +-
 site/news/spark-1-1-1-released.html             |  18 +-
 site/news/spark-1-2-0-released.html             |  18 +-
 site/news/spark-1-2-1-released.html             |  18 +-
 site/news/spark-1-2-2-released.html             |  18 +-
 site/news/spark-1-3-0-released.html             |  18 +-
 site/news/spark-1-4-0-released.html             |  18 +-
 site/news/spark-1-4-1-released.html             |  18 +-
 site/news/spark-1-5-0-released.html             |  18 +-
 site/news/spark-1-5-1-released.html             |  18 +-
 site/news/spark-1-5-2-released.html             |  18 +-
 site/news/spark-1-6-0-released.html             |  18 +-
 site/news/spark-1-6-1-released.html             |  18 +-
 site/news/spark-1-6-2-released.html             |  18 +-
 site/news/spark-1-6-3-released.html             |  18 +-
 site/news/spark-2-0-0-released.html             |  18 +-
 site/news/spark-2-0-1-released.html             |  18 +-
 site/news/spark-2-0-2-released.html             |  18 +-
 site/news/spark-2.0.0-preview.html              |  18 +-
 .../spark-accepted-into-apache-incubator.html   |  18 +-
 site/news/spark-and-shark-in-the-news.html      |  18 +-
 site/news/spark-becomes-tlp.html                |  18 +-
 site/news/spark-featured-in-wired.html          |  18 +-
 .../spark-mailing-lists-moving-to-apache.html   |  18 +-
 site/news/spark-meetups.html                    |  18 +-
 site/news/spark-screencasts-published.html      |  18 +-
 site/news/spark-summit-2013-is-a-wrap.html      |  18 +-
 site/news/spark-summit-2014-videos-posted.html  |  18 +-
 site/news/spark-summit-2015-videos-posted.html  |  18 +-
 site/news/spark-summit-agenda-posted.html       |  18 +-
 .../spark-summit-east-2015-videos-posted.html   |  18 +-
 .../spark-summit-east-2016-cfp-closing.html     |  18 +-
 site/news/spark-summit-east-agenda-posted.html  |  18 +-
 .../news/spark-summit-europe-agenda-posted.html |  18 +-
 site/news/spark-summit-europe.html              |  18 +-
 .../spark-summit-june-2016-agenda-posted.html   |  18 +-
 site/news/spark-tips-from-quantifind.html       |  18 +-
 .../spark-user-survey-and-powered-by-page.html  |  18 +-
 site/news/spark-version-0-6-0-released.html     |  18 +-
 .../spark-wins-cloudsort-100tb-benchmark.html   |  18 +-
 ...-wins-daytona-gray-sort-100tb-benchmark.html |  18 +-
 .../strata-exercises-now-available-online.html  |  18 +-
 .../news/submit-talks-to-spark-summit-2014.html |  18 +-
 .../news/submit-talks-to-spark-summit-2016.html |  18 +-
 .../submit-talks-to-spark-summit-east-2016.html |  18 +-
 .../submit-talks-to-spark-summit-eu-2016.html   |  18 +-
 site/news/two-weeks-to-spark-summit-2014.html   |  18 +-
 ...deo-from-first-spark-development-meetup.html |  18 +-
 site/powered-by.html                            |  18 +-
 site/release-process.html                       | 475 ++++++++++++++++++
 site/releases/spark-release-0-3.html            |  18 +-
 site/releases/spark-release-0-5-0.html          |  18 +-
 site/releases/spark-release-0-5-1.html          |  18 +-
 site/releases/spark-release-0-5-2.html          |  18 +-
 site/releases/spark-release-0-6-0.html          |  18 +-
 site/releases/spark-release-0-6-1.html          |  18 +-
 site/releases/spark-release-0-6-2.html          |  18 +-
 site/releases/spark-release-0-7-0.html          |  18 +-
 site/releases/spark-release-0-7-2.html          |  18 +-
 site/releases/spark-release-0-7-3.html          |  18 +-
 site/releases/spark-release-0-8-0.html          |  18 +-
 site/releases/spark-release-0-8-1.html          |  18 +-
 site/releases/spark-release-0-9-0.html          |  18 +-
 site/releases/spark-release-0-9-1.html          |  18 +-
 site/releases/spark-release-0-9-2.html          |  18 +-
 site/releases/spark-release-1-0-0.html          |  18 +-
 site/releases/spark-release-1-0-1.html          |  18 +-
 site/releases/spark-release-1-0-2.html          |  18 +-
 site/releases/spark-release-1-1-0.html          |  18 +-
 site/releases/spark-release-1-1-1.html          |  18 +-
 site/releases/spark-release-1-2-0.html          |  18 +-
 site/releases/spark-release-1-2-1.html          |  18 +-
 site/releases/spark-release-1-2-2.html          |  18 +-
 site/releases/spark-release-1-3-0.html          |  18 +-
 site/releases/spark-release-1-3-1.html          |  18 +-
 site/releases/spark-release-1-4-0.html          |  18 +-
 site/releases/spark-release-1-4-1.html          |  18 +-
 site/releases/spark-release-1-5-0.html          |  18 +-
 site/releases/spark-release-1-5-1.html          |  18 +-
 site/releases/spark-release-1-5-2.html          |  18 +-
 site/releases/spark-release-1-6-0.html          |  18 +-
 site/releases/spark-release-1-6-1.html          |  18 +-
 site/releases/spark-release-1-6-2.html          |  18 +-
 site/releases/spark-release-1-6-3.html          |  18 +-
 site/releases/spark-release-2-0-0.html          |  18 +-
 site/releases/spark-release-2-0-1.html          |  18 +-
 site/releases/spark-release-2-0-2.html          |  18 +-
 site/research.html                              |  18 +-
 site/screencasts/1-first-steps-with-spark.html  |  18 +-
 .../2-spark-documentation-overview.html         |  18 +-
 .../3-transformations-and-caching.html          |  18 +-
 .../4-a-standalone-job-in-spark.html            |  18 +-
 site/screencasts/index.html                     |  18 +-
 site/sitemap.xml                                |  12 +
 site/sql/index.html                             |  18 +-
 site/streaming/index.html                       |  18 +-
 site/third-party-projects.html                  |  18 +-
 site/trademarks.html                            |  18 +-
 site/versioning-policy.html                     | 320 ++++++++++++
 versioning-policy.md                            |  75 +++
 139 files changed, 3618 insertions(+), 666 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/_layouts/global.html
----------------------------------------------------------------------
diff --git a/_layouts/global.html b/_layouts/global.html
index 662fb86..0fbfe5a 100644
--- a/_layouts/global.html
+++ b/_layouts/global.html
@@ -123,24 +123,32 @@
         <ul class="dropdown-menu">
           <li><a href="{{site.baseurl}}/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="{{site.baseurl}}/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="{{site.baseurl}}/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="{{site.baseurl}}/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="{{site.baseurl}}/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="{{site.baseurl}}/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="{{site.baseurl}}/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="{{site.baseurl}}/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="{{site.baseurl}}/community.html#events">Events and Meetups</a></li>
-          <li><a href="{{site.baseurl}}/community.html#history">Project History</a></li>
           <li><a href="{{site.baseurl}}/powered-by.html">Powered By</a></li>
           <li><a href="{{site.baseurl}}/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="{{site.baseurl}}/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="{{site.baseurl}}/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="{{site.baseurl}}/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="{{site.baseurl}}/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/developer-tools.md
----------------------------------------------------------------------
diff --git a/developer-tools.md b/developer-tools.md
new file mode 100644
index 0000000..77d225f
--- /dev/null
+++ b/developer-tools.md
@@ -0,0 +1,287 @@
+---
+layout: global
+title: Useful Developer Tools
+type: "page singular"
+navigation:
+  weight: 5
+  show: true
+---
+
+<h2>Useful Developer Tools</h2>
+
+<h3>Reducing Build Times</h3>
+
+Spark's default build strategy is to assemble a jar including all of its dependencies. This can 
+be cumbersome when doing iterative development. When developing locally, it is possible to create 
+an assembly jar including all of Spark's dependencies and then re-package only Spark itself 
+when making changes.
+
+```
+$ build/sbt clean package
+$ ./bin/spark-shell
+$ export SPARK_PREPEND_CLASSES=true
+$ ./bin/spark-shell # Now it's using compiled classes
+# ... do some local development ... #
+$ build/sbt compile
+# ... do some local development ... #
+$ build/sbt compile
+$ unset SPARK_PREPEND_CLASSES
+$ ./bin/spark-shell
+ 
+# You can also use ~ to let sbt do incremental builds on file changes without running a new sbt session every time
+$ build/sbt ~compile
+```
+
+<h3>Checking Out Pull Requests</h3>
+
+Git provides a mechanism for fetching remote pull requests into your own local repository. 
+This is useful when reviewing code or testing patches locally. If you haven't yet cloned the 
+Spark Git repository, use the following command:
+
+```
+$ git clone https://github.com/apache/spark.git
+$ cd spark
+```
+
+To enable this feature you'll need to configure the git remote repository to fetch pull request 
+data. Do this by modifying the `.git/config` file inside of your Spark directory. Note that the 
+remote may be named something other than "origin" if you've configured it differently:
+
+```
+[remote "origin"]
+  url = git@github.com:apache/spark.git
+  fetch = +refs/heads/*:refs/remotes/origin/*
+  fetch = +refs/pull/*/head:refs/remotes/origin/pr/*   # Add this line
+```
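Equivalently (a convenience sketch, not part of the original wiki page), the same refspec can be appended from the command line with `git config` instead of editing the file by hand:

```shell
# Append the pull-request refspec to the existing remote configuration.
# Run this from inside your Spark clone; adjust "origin" if your remote
# has a different name.
git config --add remote.origin.fetch '+refs/pull/*/head:refs/remotes/origin/pr/*'

# Show all configured fetch refspecs to confirm the addition
git config --get-all remote.origin.fetch
```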
+
+Once you've done this you can fetch remote pull requests:
+
+```
+# Fetch remote pull requests
+$ git fetch origin
+# Checkout a remote pull request
+$ git checkout origin/pr/112
+# Create a local branch from a remote pull request
+$ git checkout origin/pr/112 -b new-branch
+```
+
+<h3>Generating Dependency Graphs</h3>
+
+```
+$ # sbt
+$ build/sbt dependency-tree
+ 
+$ # Maven
+$ build/mvn -DskipTests install
+$ build/mvn dependency:tree
+```
+
+<a name="individual-tests"></a>
+<h3>Running Build Targets For Individual Projects</h3>
+
+```
+$ # sbt
+$ build/sbt package
+$ # Maven
+$ build/mvn package -DskipTests -pl assembly
+```
+
+<h3>ScalaTest Issues</h3>
+
+If the following error occurs when running ScalaTest
+
+```
+An internal error occurred during: "Launching XYZSuite.scala".
+java.lang.NullPointerException
+```
+It is due to an incorrect Scala library in the classpath. To fix it:
+
+- Right click on project
+- Select `Build Path | Configure Build Path`
+- `Add Library | Scala Library`
+- Remove `scala-library-2.10.4.jar - lib_managed\jars`
+
+If you see the error "Could not find resource path for Web UI: org/apache/spark/ui/static", 
+it is due to a classpath issue (some classes were probably not compiled). To fix this, it is 
+sufficient to run a test from the command line:
+
+```
+build/sbt "test-only org.apache.spark.rdd.SortingSuite"
+```
+
+<h3>Running Different Test Permutations on Jenkins</h3>
+
+When running tests for a pull request on Jenkins, you can add special phrases to the title of 
+your pull request to change testing behavior. This includes:
+
+- `[test-maven]` - signals to test the pull request using maven
+- `[test-hadoop1.0]` - signals to test using Spark's Hadoop 1.0 profile (other options include 
+Hadoop 2.0, 2.2, and 2.3)
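For example, a pull request title carrying one of these phrases might look like this (the JIRA number and summary here are made-up placeholders):

```
[SPARK-1234] [test-maven] Fix flaky scheduler test
```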
+
+<h3>Organizing Imports</h3>
+
+You can use an <a href="https://plugins.jetbrains.com/plugin/7350">IntelliJ Imports Organizer</a> 
+from Aaron Davidson to help you organize the imports in 
+your code.  It can be configured to match the import ordering from the style guide.
+
+<h3>IDE Setup</h3>
+
+<h4>IntelliJ</h4>
+
+While many of the Spark developers use SBT or Maven on the command line, the most common IDE we 
+use is IntelliJ IDEA. You can get the community edition for free (Apache committers can get 
+free IntelliJ Ultimate Edition licenses) and install the JetBrains Scala plugin from `Preferences > Plugins`.
+
+To create a Spark project for IntelliJ:
+
+- Download IntelliJ and install the 
+<a href="https://confluence.jetbrains.com/display/SCA/Scala+Plugin+for+IntelliJ+IDEA">Scala plug-in for IntelliJ</a>.
+- Go to `File -> Import Project`, locate the spark source directory, and select "Maven Project".
+- In the Import wizard, it's fine to leave settings at their default. However, it is usually useful 
+to enable "Import Maven projects automatically", since changes to the project structure will 
+automatically update the IntelliJ project.
+- As documented in <a href="http://spark.apache.org/docs/latest/building-spark.html">Building Spark</a>, 
+some build configurations require specific profiles to be 
+enabled. The same profiles that are enabled with `-P[profile name]` on the command line may be enabled on the 
+Profiles screen in the Import wizard. For example, if developing for Hadoop 2.4 with YARN support, 
+enable the `yarn` and `hadoop-2.4` profiles. These selections can be changed later by accessing the 
+"Maven Projects" tool window from the View menu, and expanding the Profiles section.
+
+Other tips:
+
+- "Rebuild Project" can fail the first time the project is compiled, because generate source files 
+are not automatically generated. Try clicking the "Generate Sources and Update Folders For All 
+Projects" button in the "Maven Projects" tool window to manually generate these sources.
+- Some of the modules have pluggable source directories based on Maven profiles (i.e. to support 
+both Scala 2.11 and 2.10 or to allow cross building against different versions of Hive). In some 
+cases IntelliJ does not correctly detect use of the maven-build-plugin to add source directories. 
+In these cases, you may need to add source locations explicitly to compile the entire project. If 
+so, open the "Project Settings" and select "Modules". Based on your selected Maven profiles, you 
+may need to add source folders to the following modules:
+    - spark-hive: add v0.13.1/src/main/scala
+    - spark-streaming-flume-sink: add target/scala-2.10/src_managed/main/compiled_avro
+- Compilation may fail with an error like "scalac: bad option: 
+-P:/home/jakub/.m2/repository/org/scalamacros/paradise_2.10.4/2.0.1/paradise_2.10.4-2.0.1.jar". 
+If so, go to Preferences > Build, Execution, Deployment > Scala Compiler and clear the "Additional 
+compiler options" field.  It will work then although the option will come back when the project 
+reimports.  If you try to build any of the projects using quasiquotes (e.g., sql) then you will 
+need to make that jar a compiler plugin (just below "Additional compiler options"). 
+Otherwise you will see errors like:
+```
+/Users/irashid/github/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/codegen/CodeGenerator.scala
+Error:(147, 9) value q is not a member of StringContext
+ Note: implicit class Evaluate2 is not applicable here because it comes after the application point and it lacks an explicit result type
+        q"""
+        ^ 
+```
+
+<h4>Eclipse</h4>
+
+Eclipse can be used to develop and test Spark. The following configuration is known to work:
+
+- Eclipse Juno
+- <a href="http://scala-ide.org/">Scala IDE 4.0</a>
+- Scala Test
+
+The easiest way is to download the Scala IDE bundle from the Scala IDE download page. It comes 
+pre-installed with ScalaTest. Alternatively, use the Scala IDE update site or Eclipse Marketplace.
+
+SBT can create Eclipse `.project` and `.classpath` files. To create these files for each Spark sub 
+project, use this command:
+
+```
+sbt/sbt eclipse
+```
+
+To import a specific project, e.g. spark-core, select `File | Import | Existing Projects into 
+Workspace`. Do not select "Copy projects into workspace".
+
+If you want to develop on Scala 2.10, you need to configure a Scala installation for the 
+exact Scala version that's used to compile Spark. 
+Since Scala IDE bundles the latest versions (2.10.5 and 2.11.8 at this point), you need to add one 
+in `Eclipse Preferences -> Scala -> Installations` by pointing to the `lib/` directory of your 
+Scala 2.10.5 distribution. Once this is done, select all Spark projects and right-click, 
+choose `Scala -> Set Scala Installation` and point to the 2.10.5 installation. 
+This should clear all errors about invalid cross-compiled libraries. A clean build should succeed now.
+
+ScalaTest can execute unit tests by right clicking a source file and selecting `Run As | Scala Test`.
+
+If Java memory errors occur, it might be necessary to increase the settings in `eclipse.ini` 
+in the Eclipse install directory. Increase the following setting as needed:
+
+```
+--launcher.XXMaxPermSize
+256M
+```
+
+<a name="nightly-builds"></a>
+<h3>Nightly Builds</h3>
+
+Packages are built regularly off of Spark's master branch and release branches. These provide 
+Spark developers access to the bleeding-edge of Spark master or the most recent fixes not yet 
+incorporated into a maintenance release. These should only be used by Spark developers, as they 
+may have bugs and have not undergone the same level of testing as releases. Spark nightly packages 
+are available at:
+
+- Latest master build: <a href="http://people.apache.org/~pwendell/spark-nightly/spark-master-bin/latest">http://people.apache.org/~pwendell/spark-nightly/spark-master-bin/latest</a>
+- All nightly builds: <a href="http://people.apache.org/~pwendell/spark-nightly/">http://people.apache.org/~pwendell/spark-nightly/</a>
+
+Spark also publishes SNAPSHOT releases of its Maven artifacts for both master and maintenance 
+branches on a nightly basis. To link to a SNAPSHOT you need to add the ASF snapshot repository at 
+<a href="http://repository.apache.org/snapshots/">http://repository.apache.org/snapshots/</a> 
+to your build. Note that SNAPSHOT artifacts are ephemeral and may change or be removed.
+
+```
+groupId: org.apache.spark
+artifactId: spark-core_2.10
+version: 1.5.0-SNAPSHOT
+```
+
+<a name="profiling"></a>
+<h3>Profiling Spark Applications Using YourKit</h3>
+
+Here are instructions on profiling Spark applications using YourKit Java Profiler.
+
+<h4>On Spark EC2 images</h4>
+
+- After logging into the master node, download the YourKit Java Profiler for Linux from the 
+<a href="https://www.yourkit.com/download/index.jsp">YourKit downloads page</a>. 
+This file is pretty big (~100 MB) and the YourKit download site is somewhat slow, so you may 
+consider mirroring this file or including it on a custom AMI.
+- Untar this file somewhere (in `/root` in our case): `tar xvjf yjp-12.0.5-linux.tar.bz2`
+- Copy the expanded YourKit files to each node using copy-dir: `~/spark-ec2/copy-dir /root/yjp-12.0.5`
+- Configure the Spark JVMs to use the YourKit profiling agent by editing `~/spark/conf/spark-env.sh` 
+and adding the lines
+```
+SPARK_DAEMON_JAVA_OPTS+=" -agentpath:/root/yjp-12.0.5/bin/linux-x86-64/libyjpagent.so=sampling"
+export SPARK_DAEMON_JAVA_OPTS
+SPARK_JAVA_OPTS+=" -agentpath:/root/yjp-12.0.5/bin/linux-x86-64/libyjpagent.so=sampling"
+export SPARK_JAVA_OPTS
+```
+- Copy the updated configuration to each node: `~/spark-ec2/copy-dir ~/spark/conf/spark-env.sh`
+- Restart your Spark cluster: `~/spark/bin/stop-all.sh` and `~/spark/bin/start-all.sh`
+- By default, the YourKit profiler agents use ports 10001-10010. To connect the YourKit desktop 
+application to the remote profiler agents, you'll have to open these ports in the cluster's EC2 
+security groups. To do this, sign into the AWS Management Console. Go to the EC2 section and 
+select `Security Groups` from the `Network & Security` section on the left side of the page. 
+Find the security groups corresponding to your cluster; if you launched a cluster named `test_cluster`, 
+then you will want to modify the settings for the `test_cluster-slaves` and `test_cluster-master` 
+security groups. For each group, select it from the list, click the `Inbound` tab, and create a 
+new `Custom TCP Rule` opening the port range `10001-10010`. Finally, click `Apply Rule Changes`. 
+Make sure to do this for both security groups.
+Note: by default, `spark-ec2` re-uses security groups: if you stop this cluster and launch another 
+cluster with the same name, your security group settings will be re-used.
+- Launch the YourKit profiler on your desktop.
+- Select "Connect to remote application..." from the welcome screen and enter the the address of your Spark master or worker machine, e.g. `ec2--.compute-1.amazonaws.com`
+- YourKit should now be connected to the remote profiling agent. It may take a few moments for profiling information to appear.
+
+Please see the YourKit documentation for the full list of profiler agent
+<a href="http://www.yourkit.com/docs/80/help/startup_options.jsp">startup options</a>.
+ 
+<h4>In Spark unit tests</h4>
+
+When running Spark tests through SBT, add `javaOptions in Test += "-agentpath:/path/to/yjp"`
+to `SparkBuild.scala` to launch the tests with the YourKit profiler agent enabled.  
+The platform-specific paths to the profiler agents are listed in the 
+<a href="http://www.yourkit.com/docs/80/help/agent.jsp">YourKit documentation</a>.

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/documentation.md
----------------------------------------------------------------------
diff --git a/documentation.md b/documentation.md
index 465f432..6234eac 100644
--- a/documentation.md
+++ b/documentation.md
@@ -174,13 +174,6 @@ Slides, videos and EC2-based exercises from each of these are available online:
   <li>The <a href="{{site.baseurl}}/examples.html">Spark examples page</a> shows the basic API in Scala, Java and Python.</li>
 </ul>
 
-<h3>Wiki</h3>
-
-<ul><li>
-The <a href="https://cwiki.apache.org/confluence/display/SPARK/Wiki+Homepage">Spark wiki</a> contains
-information for developers, such as architecture documents and how to <a href="{{site.baseurl}}/contributing.html">">contribute</a> to Spark.
-</li></ul>
-
 <h3>Research Papers</h3>
 
 <p>

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/downloads.md
----------------------------------------------------------------------
diff --git a/downloads.md b/downloads.md
index 17e1c7c..ddf9ba6 100644
--- a/downloads.md
+++ b/downloads.md
@@ -69,5 +69,5 @@ Once you've downloaded Spark, you can find instructions for installing and build
 <ul id="sparkReleaseNotes"></ul>
 
 ### Nightly Packages and Artifacts
-For developers, Spark maintains nightly builds and SNAPSHOT artifacts. More information is available on the [Spark developer Wiki](https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-NightlyBuilds).
+For developers, Spark maintains nightly builds and SNAPSHOT artifacts. More information is available on [the Developer Tools page](/developer-tools.html#nightly-builds).
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/release-process.md
----------------------------------------------------------------------
diff --git a/release-process.md b/release-process.md
new file mode 100644
index 0000000..0de2adc
--- /dev/null
+++ b/release-process.md
@@ -0,0 +1,263 @@
+---
+layout: global
+title: Release Process
+type: "page singular"
+navigation:
+  weight: 5
+  show: true
+---
+
+<h2>Preparing Spark Releases</h2>
+
+<h3>Background</h3>
+
+The release manager role in Spark means you are responsible for a few different things:
+
+1. Preparing for release candidates:
+    1. cutting a release branch
+    1. informing the community of timing
+    1. working with component leads to clean up JIRA
+    1. making code changes in that branch with necessary version updates
+1. Running the voting process for a release:
+    1. creating release candidates using automated tooling
+    1. calling votes and triaging issues
+1. Finalizing and posting a release:
+    1. updating the Spark website
+    1. writing release notes
+    1. announcing the release 
+
+<h2>Preparing Spark for Release</h2>
+
+The main step towards preparing a release is to create a release branch. This is done via the 
+standard Git branching mechanism and should be announced to the community once the branch is 
+created. It is also good to set up Jenkins jobs for the release branch once it is cut to 
+ensure tests are passing (consult Josh Rosen and Shane Knapp for help with this).
+
+Next, ensure that all Spark versions are correct in the code base on the release branch (see 
+<a href="https://github.com/apache/spark/commit/01d233e4aede65ffa39b9d2322196d4b64186526">this example commit</a>).
+You should grep through the codebase to find all instances of the version string. Some known 
+places to change are:
+
+- **SparkContext**. Search for VERSION (only for branch 1.x)
+- **Maven build**. Ensure that the version in all the `pom.xml` files is `<SPARK-VERSION>-SNAPSHOT` 
+(e.g. `1.1.1-SNAPSHOT`). This will be changed to `<SPARK-VERSION>` (e.g. 1.1.1) automatically by 
+Maven when cutting the release. Note that there are a few exceptions that should just use 
+`<SPARK-VERSION>`, like `extras/java8-tests/pom.xml`. These modules are not published as artifacts.
+- **Spark REPLs**. Look for the Spark ASCII art in `SparkILoopInit.scala` for the Scala shell 
+and in `shell.py` for the Python REPL.
+- **Docs**. Search for VERSION in `docs/_config.yml`
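The grep sweep described above can be sketched like this (the version string shown is a placeholder; substitute the version being released):

```shell
# Find every remaining occurrence of the old version string on the
# release branch, skipping Git metadata
grep -r "1.1.1-SNAPSHOT" --exclude-dir=.git .
```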
+
+Finally, update `CHANGES.txt` with this script in the Spark repository. `CHANGES.txt` captures 
+all the patches that have made it into this release candidate since the last release.
+
+```
+$ export SPARK_HOME=<your Spark home>
+$ cd spark
+# Update release versions
+$ vim dev/create-release/generate-changelist.py
+$ dev/create-release/generate-changelist.py
+```
+
+This produces a `CHANGES.txt.new` that should be a superset of the existing `CHANGES.txt`. 
+Replace the old `CHANGES.txt` with the new one (see 
+<a href="https://github.com/apache/spark/commit/131c62672a39a6f71f6834e9aad54b587237f13c">this example commit</a>).
+
+<h3>Cutting a Release Candidate</h3>
+
+If this is not the first RC, then make sure that the JIRA issues that have been solved since the 
+last RC are marked as `FIXED` in this release version.
+
+- A possible protocol for this is to mark such issues as `FIXED` in the next maintenance release. 
+For example, if you are cutting an RC for 1.0.2, mark such issues as `FIXED` in 1.0.3.
+- When cutting a new RC, find all the issues that are marked as `FIXED` for the next maintenance 
+release, and change them to the current release.
+- Verify from `git log` whether they actually made it into the new RC or not.
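The `git log` check in the last step can be sketched as follows (the tag name and JIRA id below are placeholders):

```shell
# From the checked-out RC branch, list commits since the last release
# tag and check whether a particular JIRA's fix is among them
git log v1.0.1..HEAD --oneline | grep "SPARK-1234"
```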
+
+The process of cutting a release candidate has been automated via AMPLab's Jenkins. There are 
+Jenkins jobs that can tag a release candidate and create various packages based on that candidate. 
+The recommended process is to ask the previous release manager to walk you through the Jenkins jobs.
+
+<h3>Call a Vote on the Release Candidate</h3>
+
+The release voting takes place on the Apache Spark developers list (the PMC is voting). 
+Look at past voting threads to see how this proceeds. The email should follow 
+<a href="http://mail-archives.apache.org/mod_mbox/spark-dev/201407.mbox/%3cCABPQxss7Cf+YaUuxCk0jnusH4207hCP4dkWn3BWFSvdnD86HHQ@mail.gmail.com%3e">this format</a>.
+
+- Make a shortened link to the full list of JIRAs using <a href="http://s.apache.org/">http://s.apache.org/</a>
+- If possible, attach a draft of the release notes with the email
+- Make sure the voting closing time is in UTC format. Use this script to generate it
+- Make sure the email is in text format and the links are correct
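+For instance, a hedged sketch of generating such a closing time (assuming GNU coreutils 
+`date`; the 72-hour voting window here is illustrative):

```shell
# Print a vote closing time 72 hours from now, expressed in UTC,
# for inclusion in the [VOTE] email. (Assumes GNU date.)
date -u -d '+72 hours' '+%A %B %d, %Y at %H:%M UTC'
```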
+
+Once the vote is done, you should also send out a summary email with the totals, with a subject 
+that looks something like `[RESULT] [VOTE]...`.
+
+<h3>Finalize the Release</h3>
+
+**Be Careful!**
+
+**THIS STEP IS IRREVERSIBLE so make sure you selected the correct staging repository. Once you 
+move the artifacts into the release folder, they cannot be removed.**
+
+After the vote passes, find the staging repository, click Release, and confirm. To upload the 
+binaries, you must first upload them to the dev directory in the Apache Distribution repo, 
+and then move them from the dev directory to the release directory. This "move" is the only 
+way to add artifacts to the actual release directory.
+
+```
+# Checkout the Spark directory in Apache distribution SVN "dev" repo
+$ svn co https://dist.apache.org/repos/dist/dev/spark/
+  
+# Make directory for this RC in the above directory
+$ mkdir spark-1.1.1-rc2
+ 
+# Download the voted binaries and add them to the directory
+$ scp andrewor14@people.apache.org:~/public_html/spark-1.1.1-rc2/* spark-1.1.1-rc2
+ 
+# NOTE: Remove any binaries you don't want to publish,
+# e.g. never push MapR and *without-hive artifacts to Apache
+$ rm spark-1.1.1-rc2/*mapr*
+$ rm spark-1.1.1-rc2/*without-hive*
+$ svn add spark-1.1.1-rc2
+$ svn commit -m "Add spark-1.1.1-rc2" --username "andrewor14"
+  
+# Move the sub-directory in "dev" to the
+# corresponding directory in "release"
+$ export SVN_EDITOR=vim
+$ svn mv https://dist.apache.org/repos/dist/dev/spark/spark-1.1.1-rc2 https://dist.apache.org/repos/dist/release/spark/spark-1.1.1
+```
+
+Verify that the resources are present in <a href="http://www.apache.org/dist/spark/">http://www.apache.org/dist/spark/</a>.
+It may take a while for them to be visible. This will be mirrored throughout the Apache network. 
+There are a few remaining steps.
+
+<h4>Remove Old Releases from Mirror Network</h4>
+
+Spark always keeps two releases in the mirror network: the most recent release on the current and 
+previous branches. To delete older versions, simply use `svn rm`. The `downloads.js` file in the 
+website `js/` directory must also be updated to reflect the changes. For instance, after releasing 
+1.1.1, the two hosted releases should be 1.1.1 and 1.0.2, not 1.1.1 and 1.1.0.
+
+```
+$ svn rm https://dist.apache.org/repos/dist/release/spark/spark-1.1.0
+```
+
+<h4>Update the Spark Apache Repository</h4>
+
+Check out the tagged commit for the release candidate that passed and apply the correct version tag.
+
+```
+$ git checkout v1.1.1-rc2 # the RC that passed
+$ git tag v1.1.1
+$ git push apache v1.1.1
+ 
+# Verify that the tag has been applied correctly
+# If so, remove the old tag
+$ git push apache :v1.1.1-rc2
+$ git tag -d v1.1.1-rc2
+```
+
+Next, update remaining version numbers in the release branch. If you are doing a patch release, 
+see the similar commit made after the previous release in that branch. For example, for branch 1.0, 
+see <a href="https://github.com/apache/spark/commit/2a5514f7dcb9765b60cb772b97038cbbd1b58983">this example commit</a>.
+
+In general, the rules are as follows:
+
+- `grep` through the repository to find such occurrences
+- References to the version just released: upgrade them to the next release version. If it is not a 
+documentation-related version (e.g. inside `spark/docs/` or inside `spark/python/epydoc.conf`), 
+add `-SNAPSHOT` to the end.
+- References to the next version: ensure these already have `-SNAPSHOT`.
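+As a hedged illustration of this grep-and-bump pass (the file name and versions below are 
+invented for the example, not Spark's actual layout; assumes GNU `grep` and `sed`):

```shell
# Demonstrate the version-bump rule on a throwaway file: references to the
# just-released version become the next version with a -SNAPSHOT suffix.
set -e
tmp=$(mktemp -d)
echo 'version=1.1.1' > "$tmp/build.properties"
# grep through the tree to find occurrences of the released version
grep -rl '1\.1\.1' "$tmp"
# bump non-documentation references to the next SNAPSHOT version
sed -i 's/1\.1\.1/1.1.2-SNAPSHOT/g' "$tmp/build.properties"
cat "$tmp/build.properties"   # version=1.1.2-SNAPSHOT
rm -r "$tmp"
```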
+
+<!--
+<h4>Update the EC2 Scripts</h4>
+
+Upload the binary packages to the S3 bucket s3n://spark-related-packages (ask pwendell to do this). Then, change the init scripts in mesos/spark-ec2 repository to pull new binaries (see this example commit).
+For Spark 1.1+, update branch v4+
+For Spark 1.1, update branch v3+
+For Spark 1.0, update branch v3+
+For Spark 0.9, update branch v2+
+You can audit the ec2 set-up by launching a cluster and running this audit script. Make sure you create cluster with default instance type (m1.xlarge).
+-->
+
+<h4>Update the Spark Website</h4>
+
+The website repository is located at 
+<a href="https://github.com/apache/spark-website">https://github.com/apache/spark-website</a>. 
+Ensure the docs were generated with the `PRODUCTION=1` environment variable and with Java 7.
+ 
+```
+# Build the latest docs
+$ git checkout v1.1.1
+$ cd docs
+$ JAVA_HOME=$JAVA_7_HOME PRODUCTION=1 jekyll build
+ 
+# Copy the new documentation to apache
+$ git clone https://github.com/apache/spark-website
+...
+$ cp -R _site spark-website/site/docs/1.1.1
+
+# Update the "latest" link
+$ cd spark-website/site/docs
+$ rm latest
+$ ln -s 1.1.1 latest
+```
+
+Next, update the rest of the Spark website. See how the previous releases are documented. 
+In particular, have a look at the changes to the `*.md` files in this commit (all the HTML 
+file changes are generated by `jekyll`).
+
+```
+$ git add 1.1.1
+$ git commit -m "Add docs for Spark 1.1.1" 
+```
+ 
+Then, create the release notes. The contributors list can be automatically generated through 
+<a href="https://github.com/apache/spark/blob/branch-1.1/dev/create-release/generate-contributors.py">this script</a>.
+It accepts the tag that corresponds to the current release and another tag that 
+corresponds to the previous release (not counting maintenance releases). For instance, if you are 
+releasing Spark 1.2.0, set the current tag to v1.2.0-rc2 and the previous tag to v1.1.0.
+Once you have generated the initial contributors list, it is highly likely that there will be 
+warnings about author names not being properly translated. To fix this, run 
+<a href="https://github.com/apache/spark/blob/branch-1.1/dev/create-release/translate-contributors.py">this other script</a>,
+which fetches potential replacements from GitHub and JIRA. For instance:
+
+```
+$ cd release-spark/dev/create-release
+# Set RELEASE_TAG and PREVIOUS_RELEASE_TAG
+$ vim generate-contributors.py
+# Generate initial contributors list, likely with warnings
+$ ./generate-contributors.py
+# Set JIRA_USERNAME, JIRA_PASSWORD, and GITHUB_API_TOKEN
+$ vim translate-contributors.py
+# Translate names generated in the previous step, reading from known_translations if necessary
+$ ./translate-contributors.py
+```
+
+Additionally, if you wish to give more specific credit for developers of larger patches, you may 
+use the following commands to identify large patches. Extra care must be taken to make sure 
+commits from previous releases are not counted, since git cannot easily associate commits that 
+were backported into different branches.
+
+```
+# Determine PR numbers closed only in the new release
+$ git log v1.1.1 | grep "Closes #" | cut -d " " -f 5,6 | grep Closes | sort > closed_1.1.1
+$ git log v1.1.0 | grep "Closes #" | cut -d " " -f 5,6 | grep Closes | sort > closed_1.1.0
+$ diff --new-line-format="" --unchanged-line-format="" closed_1.1.1 closed_1.1.0 > diff.txt
+ 
+# Grep expression with all new patches
+$ EXPR=$(cat diff.txt | awk '{ print "\\("$1" "$2" \\)"; }' | tr "\n" "|" | sed -e "s/|/\\\|/g" | sed "s/\\\|$//")
+ 
+# Contributor list
+$ git shortlog v1.1.1 --grep "$EXPR" > contrib.txt
+ 
+# Large patch list (300+ lines)
+$ git log v1.1.1 --grep "$EXPR" --shortstat --oneline | grep -B 1 -e "[3-9][0-9][0-9] insert" -e "[1-9][1-9][1-9][1-9] insert" | grep SPARK > large-patches.txt
+```
+
+Then, update the downloads page, and then the main page with a news item.
+
+<h4>Create an Announcement</h4>
+
+Once everything is working (website docs, website changes) create an announcement on the website 
+and then send an e-mail to the mailing list. Enjoy an adult beverage of your choice, and 
+congratulations on making a Spark release.

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/committers.html
----------------------------------------------------------------------
diff --git a/site/committers.html b/site/committers.html
index bad4414..5069db0 100644
--- a/site/committers.html
+++ b/site/committers.html
@@ -108,24 +108,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/community.html
----------------------------------------------------------------------
diff --git a/site/community.html b/site/community.html
index 1a5e80f..a83ae1b 100644
--- a/site/community.html
+++ b/site/community.html
@@ -108,24 +108,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/contributing.html
----------------------------------------------------------------------
diff --git a/site/contributing.html b/site/contributing.html
index 72b5292..be01615 100644
--- a/site/contributing.html
+++ b/site/contributing.html
@@ -108,24 +108,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/developer-tools.html
----------------------------------------------------------------------
diff --git a/site/developer-tools.html b/site/developer-tools.html
new file mode 100644
index 0000000..0de49c6
--- /dev/null
+++ b/site/developer-tools.html
@@ -0,0 +1,494 @@
+<!DOCTYPE html>
+<html lang="en">
+<head>
+  <meta charset="utf-8">
+  <meta http-equiv="X-UA-Compatible" content="IE=edge">
+  <meta name="viewport" content="width=device-width, initial-scale=1.0">
+
+  <title>
+     Useful Developer Tools | Apache Spark
+    
+  </title>
+
+  
+
+  
+
+  <!-- Bootstrap core CSS -->
+  <link href="/css/cerulean.min.css" rel="stylesheet">
+  <link href="/css/custom.css" rel="stylesheet">
+
+  <!-- Code highlighter CSS -->
+  <link href="/css/pygments-default.css" rel="stylesheet">
+
+  <script type="text/javascript">
+  <!-- Google Analytics initialization -->
+  var _gaq = _gaq || [];
+  _gaq.push(['_setAccount', 'UA-32518208-2']);
+  _gaq.push(['_trackPageview']);
+  (function() {
+    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
+    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
+    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
+  })();
+
+  <!-- Adds slight delay to links to allow async reporting -->
+  function trackOutboundLink(link, category, action) {
+    try {
+      _gaq.push(['_trackEvent', category , action]);
+    } catch(err){}
+
+    setTimeout(function() {
+      document.location.href = link.href;
+    }, 100);
+  }
+  </script>
+
+  <!-- HTML5 shim and Respond.js IE8 support of HTML5 elements and media queries -->
+  <!--[if lt IE 9]>
+  <script src="https://oss.maxcdn.com/libs/html5shiv/3.7.0/html5shiv.js"></script>
+  <script src="https://oss.maxcdn.com/libs/respond.js/1.3.0/respond.min.js"></script>
+  <![endif]-->
+</head>
+
+<body>
+
+<script src="https://code.jquery.com/jquery.js"></script>
+<script src="https://netdna.bootstrapcdn.com/bootstrap/3.0.3/js/bootstrap.min.js"></script>
+<script src="/js/lang-tabs.js"></script>
+<script src="/js/downloads.js"></script>
+
+<div class="container" style="max-width: 1200px;">
+
+<div class="masthead">
+  
+    <p class="lead">
+      <a href="/">
+      <img src="/images/spark-logo-trademark.png"
+        style="height:100px; width:auto; vertical-align: bottom; margin-top: 20px;"></a><span class="tagline">
+          Lightning-fast cluster computing
+      </span>
+    </p>
+  
+</div>
+
+<nav class="navbar navbar-default" role="navigation">
+  <!-- Brand and toggle get grouped for better mobile display -->
+  <div class="navbar-header">
+    <button type="button" class="navbar-toggle" data-toggle="collapse"
+            data-target="#navbar-collapse-1">
+      <span class="sr-only">Toggle navigation</span>
+      <span class="icon-bar"></span>
+      <span class="icon-bar"></span>
+      <span class="icon-bar"></span>
+    </button>
+  </div>
+
+  <!-- Collect the nav links, forms, and other content for toggling -->
+  <div class="collapse navbar-collapse" id="navbar-collapse-1">
+    <ul class="nav navbar-nav">
+      <li><a href="/downloads.html">Download</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+          Libraries <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/sql/">SQL and DataFrames</a></li>
+          <li><a href="/streaming/">Spark Streaming</a></li>
+          <li><a href="/mllib/">MLlib (machine learning)</a></li>
+          <li><a href="/graphx/">GraphX (graph)</a></li>
+          <li class="divider"></li>
+          <li><a href="/third-party-projects.html">Third-Party Projects</a></li>
+        </ul>
+      </li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+          Documentation <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
+          <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
+        </ul>
+      </li>
+      <li><a href="/examples.html">Examples</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+          Community <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
+          <li><a href="/contributing.html">Contributing to Spark</a></li>
+          <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
+          <li><a href="/powered-by.html">Powered By</a></li>
+          <li><a href="/committers.html">Project Committers</a></li>
+        </ul>
+      </li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
+    </ul>
+    <ul class="nav navbar-nav navbar-right">
+      <li class="dropdown">
+        <a href="http://www.apache.org/" class="dropdown-toggle" data-toggle="dropdown">
+          Apache Software Foundation <b class="caret"></b></a>
+        <ul class="dropdown-menu">
+          <li><a href="http://www.apache.org/">Apache Homepage</a></li>
+          <li><a href="http://www.apache.org/licenses/">License</a></li>
+          <li><a href="http://www.apache.org/foundation/sponsorship.html">Sponsorship</a></li>
+          <li><a href="http://www.apache.org/foundation/thanks.html">Thanks</a></li>
+          <li><a href="http://www.apache.org/security/">Security</a></li>
+        </ul>
+      </li>
+    </ul>
+  </div>
+  <!-- /.navbar-collapse -->
+</nav>
+
+
+<div class="row">
+  <div class="col-md-3 col-md-push-9">
+    <div class="news" style="margin-bottom: 20px;">
+      <h5>Latest News</h5>
+      <ul class="list-unstyled">
+        
+          <li><a href="/news/spark-wins-cloudsort-100tb-benchmark.html">Spark wins CloudSort Benchmark as the most efficient engine</a>
+          <span class="small">(Nov 15, 2016)</span></li>
+        
+          <li><a href="/news/spark-2-0-2-released.html">Spark 2.0.2 released</a>
+          <span class="small">(Nov 14, 2016)</span></li>
+        
+          <li><a href="/news/spark-1-6-3-released.html">Spark 1.6.3 released</a>
+          <span class="small">(Nov 07, 2016)</span></li>
+        
+          <li><a href="/news/spark-2-0-1-released.html">Spark 2.0.1 released</a>
+          <span class="small">(Oct 03, 2016)</span></li>
+        
+      </ul>
+      <p class="small" style="text-align: right;"><a href="/news/index.html">Archive</a></p>
+    </div>
+    <div class="hidden-xs hidden-sm">
+      <a href="/downloads.html" class="btn btn-success btn-lg btn-block" style="margin-bottom: 30px;">
+        Download Spark
+      </a>
+      <p style="font-size: 16px; font-weight: 500; color: #555;">
+        Built-in Libraries:
+      </p>
+      <ul class="list-none">
+        <li><a href="/sql/">SQL and DataFrames</a></li>
+        <li><a href="/streaming/">Spark Streaming</a></li>
+        <li><a href="/mllib/">MLlib (machine learning)</a></li>
+        <li><a href="/graphx/">GraphX (graph)</a></li>
+      </ul>
+      <a href="/third-party-projects.html">Third-Party Projects</a>
+    </div>
+  </div>
+
+  <div class="col-md-9 col-md-pull-3">
+    <h2>Useful Developer Tools</h2>
+
+<h3>Reducing Build Times</h3>
+
+<p>Spark&#8217;s default build strategy is to assemble a jar including all of its dependencies. This can 
+be cumbersome when doing iterative development. When developing locally, it is possible to create 
+an assembly jar including all of Spark&#8217;s dependencies and then re-package only Spark itself 
+when making changes.</p>
+
+<pre><code>$ build/sbt clean package
+$ ./bin/spark-shell
+$ export SPARK_PREPEND_CLASSES=true
+$ ./bin/spark-shell # Now it's using compiled classes
+# ... do some local development ... #
+$ build/sbt compile
+# ... do some local development ... #
+$ build/sbt compile
+$ unset SPARK_PREPEND_CLASSES
+$ ./bin/spark-shell
+ 
+# You can also use ~ to let sbt do incremental builds on file changes without running a new sbt session every time
+$ build/sbt ~compile
+</code></pre>
+
+<h3>Checking Out Pull Requests</h3>
+
+<p>Git provides a mechanism for fetching remote pull requests into your own local repository. 
+This is useful when reviewing code or testing patches locally. If you haven&#8217;t yet cloned the 
+Spark Git repository, use the following command:</p>
+
+<pre><code>$ git clone https://github.com/apache/spark.git
+$ cd spark
+</code></pre>
+
+<p>To enable this feature you&#8217;ll need to configure the git remote repository to fetch pull request 
+data. Do this by modifying the <code>.git/config</code> file inside of your Spark directory. The remote may 
+not be named &#8220;origin&#8221; if you&#8217;ve named it something else:</p>
+
+<pre><code>[remote "origin"]
+  url = git@github.com:apache/spark.git
+  fetch = +refs/heads/*:refs/remotes/origin/*
+  fetch = +refs/pull/*/head:refs/remotes/origin/pr/*   # Add this line
+</code></pre>
+
+<p>Once you&#8217;ve done this you can fetch remote pull requests</p>
+
+<pre><code># Fetch remote pull requests
+$ git fetch origin
+# Checkout a remote pull request
+$ git checkout origin/pr/112
+# Create a local branch from a remote pull request
+$ git checkout origin/pr/112 -b new-branch
+</code></pre>
+
+<h3>Generating Dependency Graphs</h3>
+
+<pre><code>$ # sbt
+$ build/sbt dependency-tree
+ 
+$ # Maven
+$ build/mvn -DskipTests install
+$ build/mvn dependency:tree
+</code></pre>
+
+<p><a name="individual-tests"></a></p>
+<h3>Running Build Targets For Individual Projects</h3>
+
+<pre><code>$ # sbt
+$ build/sbt package
+$ # Maven
+$ build/mvn package -DskipTests -pl assembly
+</code></pre>
+
+<h3>ScalaTest Issues</h3>
+
+<p>If the following error occurs when running ScalaTest</p>
+
+<pre><code>An internal error occurred during: "Launching XYZSuite.scala".
+java.lang.NullPointerException
+</code></pre>
+<p>It is due to an incorrect Scala library in the classpath. To fix it:</p>
+
+<ul>
+  <li>Right click on project</li>
+  <li>Select <code>Build Path | Configure Build Path</code></li>
+  <li><code>Add Library | Scala Library</code></li>
+  <li>Remove <code>scala-library-2.10.4.jar - lib_managed\jars</code></li>
+</ul>
+
+<p>In the event of &#8220;Could not find resource path for Web UI: org/apache/spark/ui/static&#8221;, 
+it&#8217;s due to a classpath issue (some classes were probably not compiled). To fix this, it is 
+sufficient to run a test from the command line:</p>
+
+<pre><code>build/sbt "test-only org.apache.spark.rdd.SortingSuite"
+</code></pre>
+
+<h3>Running Different Test Permutations on Jenkins</h3>
+
+<p>When running tests for a pull request on Jenkins, you can add special phrases to the title of 
+your pull request to change testing behavior. This includes:</p>
+
+<ul>
+  <li><code>[test-maven]</code> - signals to test the pull request using Maven</li>
+  <li><code>[test-hadoop1.0]</code> - signals to test using Spark&#8217;s Hadoop 1.0 profile (other options include 
+Hadoop 2.0, 2.2, and 2.3)</li>
+</ul>
+
+<h3>Organizing Imports</h3>
+
+<p>You can use the <a href="https://plugins.jetbrains.com/plugin/7350">IntelliJ Imports Organizer</a> 
+plugin from Aaron Davidson to help you organize the imports in 
+your code. It can be configured to match the import ordering from the style guide.</p>
+
+<h3>IDE Setup</h3>
+
+<h4>IntelliJ</h4>
+
+<p>While many of the Spark developers use SBT or Maven on the command line, the most common IDE we 
+use is IntelliJ IDEA. You can get the community edition for free (Apache committers can get 
+free IntelliJ Ultimate Edition licenses) and install the JetBrains Scala plugin from <code>Preferences &gt; Plugins</code>.</p>
+
+<p>To create a Spark project for IntelliJ:</p>
+
+<ul>
+  <li>Download IntelliJ and install the 
+<a href="https://confluence.jetbrains.com/display/SCA/Scala+Plugin+for+IntelliJ+IDEA">Scala plug-in for IntelliJ</a>.</li>
+  <li>Go to <code>File -&gt; Import Project</code>, locate the spark source directory, and select &#8220;Maven Project&#8221;.</li>
+  <li>In the Import wizard, it&#8217;s fine to leave settings at their default. However it is usually useful 
+to enable &#8220;Import Maven projects automatically&#8221;, since changes to the project structure will 
+automatically update the IntelliJ project.</li>
+  <li>As documented in <a href="http://spark.apache.org/docs/latest/building-spark.html">Building Spark</a>, 
+some build configurations require specific profiles to be 
+enabled. The same profiles that are enabled with <code>-P[profile name]</code> above may be enabled on the 
+Profiles screen in the Import wizard. For example, if developing for Hadoop 2.4 with YARN support, 
+enable profiles yarn and hadoop-2.4. These selections can be changed later by accessing the 
+&#8220;Maven Projects&#8221; tool window from the View menu, and expanding the Profiles section.</li>
+</ul>
+
+<p>Other tips:</p>
+
+<ul>
+  <li>&#8220;Rebuild Project&#8221; can fail the first time the project is compiled, because generated source files 
+are not automatically created. Try clicking the &#8220;Generate Sources and Update Folders For All 
+Projects&#8221; button in the &#8220;Maven Projects&#8221; tool window to manually generate these sources.</li>
+  <li>Some of the modules have pluggable source directories based on Maven profiles (i.e. to support 
+both Scala 2.11 and 2.10 or to allow cross building against different versions of Hive). In some 
+cases IntelliJ does not correctly detect use of the maven-build-plugin to add source directories. 
+In these cases, you may need to add source locations explicitly to compile the entire project. If 
+so, open the &#8220;Project Settings&#8221; and select &#8220;Modules&#8221;. Based on your selected Maven profiles, you 
+may need to add source folders to the following modules:
+    <ul>
+      <li>spark-hive: add v0.13.1/src/main/scala</li>
+      <li>spark-streaming-flume-sink: add target\scala-2.10\src_managed\main\compiled_avro</li>
+    </ul>
+  </li>
+  <li>Compilation may fail with an error like &#8220;scalac: bad option: 
+-P:/home/jakub/.m2/repository/org/scalamacros/paradise_2.10.4/2.0.1/paradise_2.10.4-2.0.1.jar&#8221;. 
+If so, go to Preferences &gt; Build, Execution, Deployment &gt; Scala Compiler and clear the &#8220;Additional 
+compiler options&#8221; field. Compilation will then work, although the option will come back when the 
+project reimports. If you try to build any of the projects using quasiquotes (e.g. sql), then you will 
+need to make that jar a compiler plugin (just below &#8220;Additional compiler options&#8221;). 
+Otherwise you will see errors like:
+    <pre><code>/Users/irashid/github/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/codegen/CodeGenerator.scala
+Error:(147, 9) value q is not a member of StringContext
+ Note: implicit class Evaluate2 is not applicable here because it comes after the application point and it lacks an explicit result type
+      q"""
+      ^ 
+</code></pre>
+  </li>
+</ul>
+
+<h4>Eclipse</h4>
+
+<p>Eclipse can be used to develop and test Spark. The following configuration is known to work:</p>
+
+<ul>
+  <li>Eclipse Juno</li>
+  <li><a href="http://scala-ide.org/">Scala IDE 4.0</a></li>
+  <li>Scala Test</li>
+</ul>
+
+<p>The easiest way is to download the Scala IDE bundle from the Scala IDE download page. It comes 
+pre-installed with ScalaTest. Alternatively, use the Scala IDE update site or Eclipse Marketplace.</p>
+
+<p>SBT can create Eclipse <code>.project</code> and <code>.classpath</code> files. To create these files for each Spark sub 
+project, use this command:</p>
+
+<pre><code>sbt/sbt eclipse
+</code></pre>
+
+<p>To import a specific project, e.g. spark-core, select <code>File | Import | Existing Projects</code> into 
+Workspace. Do not select &#8220;Copy projects into workspace&#8221;.</p>
+
+<p>If you want to develop on Scala 2.10 you need to configure a Scala installation for the 
+exact Scala version that&#8217;s used to compile Spark. 
+Since Scala IDE bundles the latest versions (2.10.5 and 2.11.8 at this point), you need to add one 
+in <code>Eclipse Preferences -&gt; Scala -&gt; Installations</code> by pointing to the <code>lib/</code> directory of your 
+Scala 2.10.5 distribution. Once this is done, select all Spark projects and right-click, 
+choose <code>Scala -&gt; Set Scala Installation</code> and point to the 2.10.5 installation. 
+This should clear all errors about invalid cross-compiled libraries. A clean build should succeed now.</p>
+
+<p>ScalaTest can execute unit tests by right clicking a source file and selecting <code>Run As | Scala Test</code>.</p>
+
+<p>If Java memory errors occur, it might be necessary to increase the settings in <code>eclipse.ini</code> 
+in the Eclipse install directory. Increase the following setting as needed:</p>
+
+<pre><code>--launcher.XXMaxPermSize
+256M
+</code></pre>
+
+<p><a name="nightly-builds"></a></p>
+<h3>Nightly Builds</h3>
+
+<p>Packages are built regularly off of Spark&#8217;s master branch and release branches. These give 
+Spark developers access to the bleeding edge of the master branch, or the most recent fixes not yet 
+incorporated into a maintenance release. They should only be used by Spark developers, as they 
+may have bugs and have not undergone the same level of testing as releases. Spark nightly packages 
+are available at:</p>
+
+<ul>
+  <li>Latest master build: <a href="http://people.apache.org/~pwendell/spark-nightly/spark-master-bin/latest">http://people.apache.org/~pwendell/spark-nightly/spark-master-bin/latest</a></li>
+  <li>All nightly builds: <a href="http://people.apache.org/~pwendell/spark-nightly/">http://people.apache.org/~pwendell/spark-nightly/</a></li>
+</ul>
+
+<p>Spark also publishes SNAPSHOT releases of its Maven artifacts for both master and maintenance 
+branches on a nightly basis. To link to a SNAPSHOT you need to add the ASF snapshot 
+repository, <a href="http://repository.apache.org/snapshots/">http://repository.apache.org/snapshots/</a>, 
+to your build. Note that SNAPSHOT artifacts are ephemeral and may change or be removed.</p>
+
+<pre><code>groupId: org.apache.spark
+artifactId: spark-core_2.10
+version: 1.5.0-SNAPSHOT
+</code></pre>
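+<p>As a sketch, wiring the snapshot repository and dependency into a Maven <code>pom.xml</code> might look 
+like this (the version shown is illustrative; use the SNAPSHOT version of the branch you want to track):</p>
+
+<pre><code>&lt;repositories&gt;
+  &lt;repository&gt;
+    &lt;id&gt;apache-snapshots&lt;/id&gt;
+    &lt;url&gt;http://repository.apache.org/snapshots/&lt;/url&gt;
+    &lt;snapshots&gt;&lt;enabled&gt;true&lt;/enabled&gt;&lt;/snapshots&gt;
+  &lt;/repository&gt;
+&lt;/repositories&gt;
+
+&lt;dependency&gt;
+  &lt;groupId&gt;org.apache.spark&lt;/groupId&gt;
+  &lt;artifactId&gt;spark-core_2.10&lt;/artifactId&gt;
+  &lt;version&gt;1.5.0-SNAPSHOT&lt;/version&gt;
+&lt;/dependency&gt;
+</code></pre>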
+
+<p><a name="profiling"></a></p>
+<h3>Profiling Spark Applications Using YourKit</h3>
+
+<p>Here are instructions on profiling Spark applications using YourKit Java Profiler.</p>
+
+<h4>On Spark EC2 images</h4>
+
+<ul>
+  <li>After logging into the master node, download the YourKit Java Profiler for Linux from the 
+<a href="https://www.yourkit.com/download/index.jsp">YourKit downloads page</a>. 
+This file is fairly large (~100 MB) and the YourKit download site is somewhat slow, so you may 
+consider mirroring this file or including it in a custom AMI.</li>
+  <li>Untar this file somewhere (in <code>/root</code> in our case): <code>tar xvjf yjp-12.0.5-linux.tar.bz2</code></li>
+  <li>Copy the expanded YourKit files to each node using copy-dir: <code>~/spark-ec2/copy-dir /root/yjp-12.0.5</code></li>
+  <li>Configure the Spark JVMs to use the YourKit profiling agent by editing <code>~/spark/conf/spark-env.sh</code> 
+and adding the lines
+    <pre><code>SPARK_DAEMON_JAVA_OPTS+=" -agentpath:/root/yjp-12.0.5/bin/linux-x86-64/libyjpagent.so=sampling"
+export SPARK_DAEMON_JAVA_OPTS
+SPARK_JAVA_OPTS+=" -agentpath:/root/yjp-12.0.5/bin/linux-x86-64/libyjpagent.so=sampling"
+export SPARK_JAVA_OPTS
+</code></pre>
+  </li>
+  <li>Copy the updated configuration to each node: <code>~/spark-ec2/copy-dir ~/spark/conf/spark-env.sh</code></li>
+  <li>Restart your Spark cluster: <code>~/spark/bin/stop-all.sh</code> and <code>~/spark/bin/start-all.sh</code></li>
+  <li>By default, the YourKit profiler agents use ports 10001-10010. To connect the YourKit desktop 
+application to the remote profiler agents, you&#8217;ll have to open these ports in the cluster&#8217;s EC2 
+security groups. To do this, sign into the AWS Management Console. Go to the EC2 section and 
+select <code>Security Groups</code> from the <code>Network &amp; Security</code> section on the left side of the page. 
+Find the security groups corresponding to your cluster; if you launched a cluster named <code>test_cluster</code>, 
+then you will want to modify the settings for the <code>test_cluster-slaves</code> and <code>test_cluster-master</code> 
+security groups. For each group, select it from the list, click the <code>Inbound</code> tab, and create a 
+new <code>Custom TCP Rule</code> opening the port range <code>10001-10010</code>. Finally, click <code>Apply Rule Changes</code>. 
+Make sure to do this for both security groups.
+Note: by default, <code>spark-ec2</code> re-uses security groups: if you stop this cluster and launch another 
+cluster with the same name, your security group settings will be re-used.</li>
+  <li>Launch the YourKit profiler on your desktop.</li>
+  <li>Select &#8220;Connect to remote application&#8230;&#8221; from the welcome screen and enter the address of your Spark master or worker machine, e.g. <code>ec2--.compute-1.amazonaws.com</code></li>
+  <li>YourKit should now be connected to the remote profiling agent. It may take a few moments for profiling information to appear.</li>
+</ul>
+
+<p>See the YourKit documentation for the full list of profiler agent
+<a href="http://www.yourkit.com/docs/80/help/startup_options.jsp">startup options</a>.</p>
+
+<h4>In Spark unit tests</h4>
+
+<p>When running Spark tests through SBT, add <code>javaOptions in Test += "-agentpath:/path/to/yjp"</code>
+to <code>SparkBuild.scala</code> to launch the tests with the YourKit profiler agent enabled.
+The platform-specific paths to the profiler agents are listed in the 
+<a href="http://www.yourkit.com/docs/80/help/agent.jsp">YourKit documentation</a>.</p>
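+<p>As a sketch, the added setting could look like the following; the agent path below is a 
+placeholder for the platform-specific library on your machine:</p>
+
+<pre><code>// in SparkBuild.scala; the agent path and options are illustrative
+javaOptions in Test += "-agentpath:/root/yjp-12.0.5/bin/linux-x86-64/libyjpagent.so=sampling"
+</code></pre>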
+
+  </div>
+</div>
+
+
+
+<footer class="small">
+  <hr>
+  Apache Spark, Spark, Apache, and the Spark logo are <a href="/trademarks.html">trademarks</a> of
+  <a href="http://www.apache.org">The Apache Software Foundation</a>.
+</footer>
+
+</div>
+
+</body>
+</html>

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/documentation.html
----------------------------------------------------------------------
diff --git a/site/documentation.html b/site/documentation.html
index b8a8b40..8d911da 100644
--- a/site/documentation.html
+++ b/site/documentation.html
@@ -108,24 +108,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">
@@ -350,13 +358,6 @@ Slides, videos and EC2-based exercises from each of these are available online:
   <li>The <a href="/examples.html">Spark examples page</a> shows the basic API in Scala, Java and Python.</li>
 </ul>
 
-<h3>Wiki</h3>
-
-<ul><li>
-The <a href="https://cwiki.apache.org/confluence/display/SPARK/Wiki+Homepage">Spark wiki</a> contains
-information for developers, such as architecture documents and how to <a href="/contributing.html">"&gt;contribute</a> to Spark.
-</li></ul>
-
 <h3>Research Papers</h3>
 
 <p>

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/downloads.html
----------------------------------------------------------------------
diff --git a/site/downloads.html b/site/downloads.html
index 6bb636d..aa88bba 100644
--- a/site/downloads.html
+++ b/site/downloads.html
@@ -108,24 +108,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">
@@ -256,7 +264,7 @@ git clone git://github.com/apache/spark.git -b branch-2.0
 <ul id="sparkReleaseNotes"></ul>
 
 <h3 id="nightly-packages-and-artifacts">Nightly Packages and Artifacts</h3>
-<p>For developers, Spark maintains nightly builds and SNAPSHOT artifacts. More information is available on the <a href="https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-NightlyBuilds">Spark developer Wiki</a>.</p>
+<p>For developers, Spark maintains nightly builds and SNAPSHOT artifacts. More information is available on the <a href="/developer-tools.html#nightly-builds">the Developer Tools page</a>.</p>
 
 
   </div>

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/examples.html
----------------------------------------------------------------------
diff --git a/site/examples.html b/site/examples.html
index 877e5fa..18ece51 100644
--- a/site/examples.html
+++ b/site/examples.html
@@ -108,24 +108,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/faq.html
----------------------------------------------------------------------
diff --git a/site/faq.html b/site/faq.html
index 49ec5e3..cc8e0b9 100644
--- a/site/faq.html
+++ b/site/faq.html
@@ -108,24 +108,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/graphx/index.html
----------------------------------------------------------------------
diff --git a/site/graphx/index.html b/site/graphx/index.html
index 3a1d235..d3251bf 100644
--- a/site/graphx/index.html
+++ b/site/graphx/index.html
@@ -111,24 +111,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/index.html
----------------------------------------------------------------------
diff --git a/site/index.html b/site/index.html
index b35ec09..0ddfb81 100644
--- a/site/index.html
+++ b/site/index.html
@@ -110,24 +110,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/mailing-lists.html
----------------------------------------------------------------------
diff --git a/site/mailing-lists.html b/site/mailing-lists.html
index b0383fe..c113cdd 100644
--- a/site/mailing-lists.html
+++ b/site/mailing-lists.html
@@ -111,24 +111,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/mllib/index.html
----------------------------------------------------------------------
diff --git a/site/mllib/index.html b/site/mllib/index.html
index 3eb63eb..11808fb 100644
--- a/site/mllib/index.html
+++ b/site/mllib/index.html
@@ -111,24 +111,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

http://git-wip-us.apache.org/repos/asf/spark-website/blob/cf21826b/site/news/amp-camp-2013-registration-ope.html
----------------------------------------------------------------------
diff --git a/site/news/amp-camp-2013-registration-ope.html b/site/news/amp-camp-2013-registration-ope.html
index 067d74a..b9d1aba 100644
--- a/site/news/amp-camp-2013-registration-ope.html
+++ b/site/news/amp-camp-2013-registration-ope.html
@@ -108,24 +108,32 @@
         <ul class="dropdown-menu">
           <li><a href="/docs/latest/">Latest Release (Spark 2.0.2)</a></li>
           <li><a href="/documentation.html">Older Versions and Other Resources</a></li>
+          <li><a href="/faq.html">Frequently Asked Questions</a></li>
         </ul>
       </li>
       <li><a href="/examples.html">Examples</a></li>
       <li class="dropdown">
-        <a href="/community.html" class="dropdown-toggle" data-toggle="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
           Community <b class="caret"></b>
         </a>
         <ul class="dropdown-menu">
-          <li><a href="/community.html#mailing-lists">Mailing Lists</a></li>
+          <li><a href="/community.html">Mailing Lists &amp; Resources</a></li>
           <li><a href="/contributing.html">Contributing to Spark</a></li>
           <li><a href="https://issues.apache.org/jira/browse/SPARK">Issue Tracker</a></li>
-          <li><a href="/community.html#events">Events and Meetups</a></li>
-          <li><a href="/community.html#history">Project History</a></li>
           <li><a href="/powered-by.html">Powered By</a></li>
           <li><a href="/committers.html">Project Committers</a></li>
         </ul>
       </li>
-      <li><a href="/faq.html">FAQ</a></li>
+      <li class="dropdown">
+        <a href="#" class="dropdown-toggle" data-toggle="dropdown">
+           Developers <b class="caret"></b>
+        </a>
+        <ul class="dropdown-menu">
+          <li><a href="/developer-tools.html">Useful Developer Tools</a></li>
+          <li><a href="/versioning-policy.html">Versioning Policy</a></li>
+          <li><a href="/release-process.html">Release Process</a></li>
+        </ul>
+      </li>
     </ul>
     <ul class="nav navbar-nav navbar-right">
       <li class="dropdown">

