Posted to commits@spark.apache.org by sr...@apache.org on 2017/04/25 08:11:03 UTC

[2/5] spark-website git commit: adjust the content structure to make it more reasonable

adjust the content structure to make it more reasonable


Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/4e458563
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/4e458563
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/4e458563

Branch: refs/heads/asf-site
Commit: 4e458563361e07e4cfb8286fd0c64a948185271a
Parents: 05c9946
Author: Stan Zhai <zh...@haizhi.com>
Authored: Fri Mar 10 00:45:48 2017 +0800
Committer: Stan Zhai <zh...@haizhi.com>
Committed: Fri Mar 10 00:45:48 2017 +0800

----------------------------------------------------------------------
 developer-tools.md        | 97 +++++++++++++++++++++--------------------
 site/developer-tools.html | 98 +++++++++++++++++++++---------------------
 2 files changed, 97 insertions(+), 98 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark-website/blob/4e458563/developer-tools.md
----------------------------------------------------------------------
diff --git a/developer-tools.md b/developer-tools.md
index e012c8e..e712e7d 100644
--- a/developer-tools.md
+++ b/developer-tools.md
@@ -48,6 +48,23 @@ builds. This process will auto-start after the first time `build/mvn` is called
 shut down at any time by running `build/zinc-<version>/bin/zinc -shutdown` and will automatically
 restart whenever `build/mvn` is called.
 
+<h3>Building submodules individually</h3>
+
+For instance, you can build the Spark Core module on its own using either sbt or Maven:
+
+```
+$ # sbt
+$ build/sbt
+> project core
+> package
+
+$ # or you can build the spark-core module with sbt directly using:
+$ build/sbt core/package
+
+$ # Maven
+$ build/mvn package -DskipTests -pl :spark-core_2.11
+```
+
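+One possible variant, relying on Maven's standard `-am` (`--also-make`) option rather than 
+anything Spark-specific: if the submodule depends on sibling modules that are not yet 
+installed in your local repository, Maven can build those dependencies in the same run:
+
+```
+$ # build spark-core together with the modules it depends on
+$ build/mvn package -DskipTests -pl :spark-core_2.11 -am
+```
+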
 <a name="individual-tests"></a>
 <h3 id="running-individual-tests">Running Individual Tests</h3>
 
@@ -95,7 +112,6 @@ $ build/sbt "core/testOnly *DAGSchedulerSuite -- -z SPARK-12345"
 
 For more about how to run individual tests with sbt, see the [sbt documentation](http://www.scala-sbt.org/0.13/docs/Testing.html).
 
-
 <h4>Testing with Maven</h4>
 
 With Maven, you can use the `-DwildcardSuites` flag to run individual Scala tests:
@@ -112,6 +128,37 @@ To run individual Java tests, you can use the `-Dtest` flag:
 build/mvn test -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite test
 ```
 
+<h3>ScalaTest Issues</h3>
+
+If the following error occurs when running ScalaTest:
+
+```
+An internal error occurred during: "Launching XYZSuite.scala".
+java.lang.NullPointerException
+```
+This is caused by an incorrect Scala library on the classpath. To fix it:
+
+- Right-click on the project
+- Select `Build Path | Configure Build Path`
+- `Add Library | Scala Library`
+- Remove `scala-library-2.10.4.jar - lib_managed\jars`
+
+If you see the error "Could not find resource path for Web UI: org/apache/spark/ui/static", 
+it is due to a classpath issue (some classes were probably not compiled). To fix this, it is 
+sufficient to run a test from the command line:
+
+```
+build/sbt "test-only org.apache.spark.rdd.SortingSuite"
+```
+
+<h3>Running Different Test Permutations on Jenkins</h3>
+
+When running tests for a pull request on Jenkins, you can add special phrases to the title of 
+your pull request to change the testing behavior. These include:
+
+- `[test-maven]` - signals to test the pull request using Maven
+- `[test-hadoop2.7]` - signals to test using Spark's Hadoop 2.7 profile
+
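+For example, a pull request title using these phrases might look like the following 
+(the JIRA number and component tag here are purely illustrative):
+
+```
+[SPARK-12345][CORE][test-maven] Title of your change
+```
+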
 <h3>Checking Out Pull Requests</h3>
 
 Git provides a mechanism for fetching remote pull requests into your own local repository. 
@@ -156,54 +203,6 @@ $ build/mvn -DskipTests install
 $ build/mvn dependency:tree
 ```
 
-<h3>Building submodules individually</h3>
-
-For instance, you can build the Spark Core module using:
-
-```
-$ # sbt
-$ build/sbt
-> project core
-> package
-
-$ # or you can build the spark-core module with sbt directly using:
-$ build/sbt core/package
-
-$ # Maven
-$ build/mvn package -DskipTests -pl :spark-core_2.11
-```
-
-<h3>ScalaTest Issues</h3>
-
-If the following error occurs when running ScalaTest
-
-```
-An internal error occurred during: "Launching XYZSuite.scala".
-java.lang.NullPointerException
-```
-It is due to an incorrect Scala library in the classpath. To fix it:
-
-- Right click on project
-- Select `Build Path | Configure Build Path`
-- `Add Library | Scala Library`
-- Remove `scala-library-2.10.4.jar - lib_managed\jars`
-
-In the event of "Could not find resource path for Web UI: org/apache/spark/ui/static", 
-it's due to a classpath issue (some classes were probably not compiled). To fix this, it 
-sufficient to run a test from the command line:
-
-```
-build/sbt "test-only org.apache.spark.rdd.SortingSuite"
-```
-
-<h3>Running Different Test Permutations on Jenkins</h3>
-
-When running tests for a pull request on Jenkins, you can add special phrases to the title of 
-your pull request to change testing behavior. This includes:
-
-- `[test-maven]` - signals to test the pull request using maven
-- `[test-hadoop2.7]` - signals to test using Spark's Hadoop 2.7 profile
-
 <h3>Organizing Imports</h3>
 
 You can use a <a href="https://plugins.jetbrains.com/plugin/7350">IntelliJ Imports Organizer</a> 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/4e458563/site/developer-tools.html
----------------------------------------------------------------------
diff --git a/site/developer-tools.html b/site/developer-tools.html
index 1cbe7bb..b46d664 100644
--- a/site/developer-tools.html
+++ b/site/developer-tools.html
@@ -232,6 +232,22 @@ builds. This process will auto-start after the first time <code>build/mvn</code>
 shut down at any time by running <code>build/zinc-&lt;version&gt;/bin/zinc -shutdown</code> and will automatically
 restart whenever <code>build/mvn</code> is called.</p>
 
+<h3>Building submodules individually</h3>
+
+<p>For instance, you can build the Spark Core module on its own using either sbt or Maven:</p>
+
+<pre><code>$ # sbt
+$ build/sbt
+&gt; project core
+&gt; package
+
+$ # or you can build the spark-core module with sbt directly using:
+$ build/sbt core/package
+
+$ # Maven
+$ build/mvn package -DskipTests -pl :spark-core_2.11
+</code></pre>
+
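+<p>One possible variant, relying on Maven&#8217;s standard <code>-am</code> (<code>--also-make</code>) option rather than 
+anything Spark-specific: if the submodule depends on sibling modules that are not yet 
+installed in your local repository, Maven can build those dependencies in the same run:</p>
+
+<pre><code>$ # build spark-core together with the modules it depends on
+$ build/mvn package -DskipTests -pl :spark-core_2.11 -am
+</code></pre>
+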
 <p><a name="individual-tests"></a></p>
 <h3 id="running-individual-tests">Running Individual Tests</h3>
 
@@ -287,6 +303,39 @@ restart whenever <code>build/mvn</code> is called.</p>
 <pre><code>build/mvn test -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite test
 </code></pre>
 
+<h3>ScalaTest Issues</h3>
+
+<p>If the following error occurs when running ScalaTest:</p>
+
+<pre><code>An internal error occurred during: "Launching XYZSuite.scala".
+java.lang.NullPointerException
+</code></pre>
+<p>This is caused by an incorrect Scala library on the classpath. To fix it:</p>
+
+<ul>
+  <li>Right-click on the project</li>
+  <li>Select <code>Build Path | Configure Build Path</code></li>
+  <li><code>Add Library | Scala Library</code></li>
+  <li>Remove <code>scala-library-2.10.4.jar - lib_managed\jars</code></li>
+</ul>
+
+<p>If you see the error &#8220;Could not find resource path for Web UI: org/apache/spark/ui/static&#8221;, 
+it is due to a classpath issue (some classes were probably not compiled). To fix this, it is 
+sufficient to run a test from the command line:</p>
+
+<pre><code>build/sbt "test-only org.apache.spark.rdd.SortingSuite"
+</code></pre>
+
+<h3>Running Different Test Permutations on Jenkins</h3>
+
+<p>When running tests for a pull request on Jenkins, you can add special phrases to the title of 
+your pull request to change the testing behavior. These include:</p>
+
+<ul>
+  <li><code>[test-maven]</code> - signals to test the pull request using Maven</li>
+  <li><code>[test-hadoop2.7]</code> - signals to test using Spark&#8217;s Hadoop 2.7 profile</li>
+</ul>
+
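+<p>For example, a pull request title using these phrases might look like the following 
+(the JIRA number and component tag here are purely illustrative):</p>
+
+<pre><code>[SPARK-12345][CORE][test-maven] Title of your change
+</code></pre>
+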
 <h3>Checking Out Pull Requests</h3>
 
 <p>Git provides a mechanism for fetching remote pull requests into your own local repository. 
@@ -327,55 +376,6 @@ $ build/mvn -DskipTests install
 $ build/mvn dependency:tree
 </code></pre>
 
-<h3>Building submodules individually</h3>
-
-<p>For instance, you can build the Spark Core module using:</p>
-
-<pre><code>$ # sbt
-$ build/sbt
-&gt; project core
-&gt; package
-
-$ # or you can build the spark-core module with sbt directly using:
-$ build/sbt core/package
-
-$ # Maven
-$ build/mvn package -DskipTests -pl :spark-core_2.11
-</code></pre>
-
-<h3>ScalaTest Issues</h3>
-
-<p>If the following error occurs when running ScalaTest</p>
-
-<pre><code>An internal error occurred during: "Launching XYZSuite.scala".
-java.lang.NullPointerException
-</code></pre>
-<p>It is due to an incorrect Scala library in the classpath. To fix it:</p>
-
-<ul>
-  <li>Right click on project</li>
-  <li>Select <code>Build Path | Configure Build Path</code></li>
-  <li><code>Add Library | Scala Library</code></li>
-  <li>Remove <code>scala-library-2.10.4.jar - lib_managed\jars</code></li>
-</ul>
-
-<p>In the event of &#8220;Could not find resource path for Web UI: org/apache/spark/ui/static&#8221;, 
-it&#8217;s due to a classpath issue (some classes were probably not compiled). To fix this, it 
-sufficient to run a test from the command line:</p>
-
-<pre><code>build/sbt "test-only org.apache.spark.rdd.SortingSuite"
-</code></pre>
-
-<h3>Running Different Test Permutations on Jenkins</h3>
-
-<p>When running tests for a pull request on Jenkins, you can add special phrases to the title of 
-your pull request to change testing behavior. This includes:</p>
-
-<ul>
-  <li><code>[test-maven]</code> - signals to test the pull request using maven</li>
-  <li><code>[test-hadoop2.7]</code> - signals to test using Spark&#8217;s Hadoop 2.7 profile</li>
-</ul>
-
 <h3>Organizing Imports</h3>
 
 <p>You can use a <a href="https://plugins.jetbrains.com/plugin/7350">IntelliJ Imports Organizer</a> 

