Posted to commits@spark.apache.org by gu...@apache.org on 2018/01/30 22:37:30 UTC

spark git commit: [MINOR] Fix typos in dev/* scripts.

Repository: spark
Updated Branches:
  refs/heads/master 58fcb5a95 -> 9623a9824


[MINOR] Fix typos in dev/* scripts.

## What changes were proposed in this pull request?

Consistency in style, grammar and removal of extraneous characters.

## How was this patch tested?

Manually as this is a doc change.

Author: Shashwat Anand <me...@shashwat.me>

Closes #20436 from ashashwat/SPARK-23174.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/9623a982
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/9623a982
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/9623a982

Branch: refs/heads/master
Commit: 9623a98248837da302ba4ec240335d1c4268ee21
Parents: 58fcb5a
Author: Shashwat Anand <me...@shashwat.me>
Authored: Wed Jan 31 07:37:25 2018 +0900
Committer: hyukjinkwon <gu...@gmail.com>
Committed: Wed Jan 31 07:37:25 2018 +0900

----------------------------------------------------------------------
 dev/appveyor-guide.md            |  6 +++---
 dev/lint-python                  | 12 ++++++------
 dev/run-pip-tests                |  4 ++--
 dev/run-tests-jenkins            |  2 +-
 dev/sparktestsupport/modules.py  |  8 ++++----
 dev/sparktestsupport/toposort.py |  6 +++---
 dev/tests/pr_merge_ability.sh    |  4 ++--
 dev/tests/pr_public_classes.sh   |  4 ++--
 8 files changed, 23 insertions(+), 23 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/9623a982/dev/appveyor-guide.md
----------------------------------------------------------------------
diff --git a/dev/appveyor-guide.md b/dev/appveyor-guide.md
index d2e00b4..a842f39 100644
--- a/dev/appveyor-guide.md
+++ b/dev/appveyor-guide.md
@@ -1,6 +1,6 @@
 # AppVeyor Guides
 
-Currently, SparkR on Windows is being tested with [AppVeyor](https://ci.appveyor.com). This page describes how to set up AppVeyor with Spark, how to run the build, check the status and stop the build via this tool. There is the documenation for AppVeyor [here](https://www.appveyor.com/docs). Please refer this for full details.
+Currently, SparkR on Windows is being tested with [AppVeyor](https://ci.appveyor.com). This page describes how to set up AppVeyor with Spark, how to run the build, check the status and stop the build via this tool. There is the documentation for AppVeyor [here](https://www.appveyor.com/docs). Please refer this for full details.
 
 
 ### Setting up AppVeyor
@@ -45,7 +45,7 @@ Currently, SparkR on Windows is being tested with [AppVeyor](https://ci.appveyor
   
   <img width="144" alt="2016-08-30 12 16 35" src="https://cloud.githubusercontent.com/assets/6477701/18075026/3ee57bc6-6eac-11e6-826e-5dd09aeb0e7c.png">
 
-- Since we will use Github here, click the "GITHUB" button and then click "Authorize Github" so that AppVeyor can access to the Github logs (e.g. commits).
+- Since we will use Github here, click the "GITHUB" button and then click "Authorize Github" so that AppVeyor can access the Github logs (e.g. commits).
     
   <img width="517" alt="2016-09-04 11 10 22" src="https://cloud.githubusercontent.com/assets/6477701/18228819/9a4d5722-7299-11e6-900c-c5ff6b0450b1.png">
 
@@ -87,7 +87,7 @@ Currently, SparkR on Windows is being tested with [AppVeyor](https://ci.appveyor
 
   <img width="176" alt="2016-08-30 12 29 41" src="https://cloud.githubusercontent.com/assets/6477701/18075336/de618b52-6eae-11e6-8f01-e4ce48963087.png">
 
-- If the build is running, "CANCEL BUILD" buttom appears. Click this button top cancel the current build.
+- If the build is running, "CANCEL BUILD" button appears. Click this button to cancel the current build.
 
   <img width="158" alt="2016-08-30 1 11 13" src="https://cloud.githubusercontent.com/assets/6477701/18075806/4de68564-6eb3-11e6-855b-ee22918767f9.png">
 

http://git-wip-us.apache.org/repos/asf/spark/blob/9623a982/dev/lint-python
----------------------------------------------------------------------
diff --git a/dev/lint-python b/dev/lint-python
index e069caf..f738af9 100755
--- a/dev/lint-python
+++ b/dev/lint-python
@@ -34,8 +34,8 @@ python -B -m compileall -q -l $PATHS_TO_CHECK > "$PYCODESTYLE_REPORT_PATH"
 compile_status="${PIPESTATUS[0]}"
 
 # Get pycodestyle at runtime so that we don't rely on it being installed on the build server.
-#+ See: https://github.com/apache/spark/pull/1744#issuecomment-50982162
-# Updated to latest official version for pep8. pep8 is formally renamed to pycodestyle.
+# See: https://github.com/apache/spark/pull/1744#issuecomment-50982162
+# Updated to the latest official version of pep8. pep8 is formally renamed to pycodestyle.
 PYCODESTYLE_VERSION="2.3.1"
 PYCODESTYLE_SCRIPT_PATH="$SPARK_ROOT_DIR/dev/pycodestyle-$PYCODESTYLE_VERSION.py"
 PYCODESTYLE_SCRIPT_REMOTE_PATH="https://raw.githubusercontent.com/PyCQA/pycodestyle/$PYCODESTYLE_VERSION/pycodestyle.py"
@@ -60,9 +60,9 @@ export "PYLINT_HOME=$PYTHONPATH"
 export "PATH=$PYTHONPATH:$PATH"
 
 # There is no need to write this output to a file
-#+ first, but we do so so that the check status can
-#+ be output before the report, like with the
-#+ scalastyle and RAT checks.
+# first, but we do so so that the check status can
+# be output before the report, like with the
+# scalastyle and RAT checks.
 python "$PYCODESTYLE_SCRIPT_PATH" --config=dev/tox.ini $PATHS_TO_CHECK >> "$PYCODESTYLE_REPORT_PATH"
 pycodestyle_status="${PIPESTATUS[0]}"
 
@@ -73,7 +73,7 @@ else
 fi
 
 if [ "$lint_status" -ne 0 ]; then
-    echo "PYCODESTYLE checks failed."
+    echo "pycodestyle checks failed."
     cat "$PYCODESTYLE_REPORT_PATH"
     rm "$PYCODESTYLE_REPORT_PATH"
     exit "$lint_status"

http://git-wip-us.apache.org/repos/asf/spark/blob/9623a982/dev/run-pip-tests
----------------------------------------------------------------------
diff --git a/dev/run-pip-tests b/dev/run-pip-tests
index d51dde1..1321c2b 100755
--- a/dev/run-pip-tests
+++ b/dev/run-pip-tests
@@ -25,10 +25,10 @@ shopt -s nullglob
 FWDIR="$(cd "$(dirname "$0")"/..; pwd)"
 cd "$FWDIR"
 
-echo "Constucting virtual env for testing"
+echo "Constructing virtual env for testing"
 VIRTUALENV_BASE=$(mktemp -d)
 
-# Clean up the virtual env enviroment used if we created one.
+# Clean up the virtual env environment used if we created one.
 function delete_virtualenv() {
   echo "Cleaning up temporary directory - $VIRTUALENV_BASE"
   rm -rf "$VIRTUALENV_BASE"

http://git-wip-us.apache.org/repos/asf/spark/blob/9623a982/dev/run-tests-jenkins
----------------------------------------------------------------------
diff --git a/dev/run-tests-jenkins b/dev/run-tests-jenkins
index 03fd6ff..5bc03e4 100755
--- a/dev/run-tests-jenkins
+++ b/dev/run-tests-jenkins
@@ -20,7 +20,7 @@
 # Wrapper script that runs the Spark tests then reports QA results
 # to github via its API.
 # Environment variables are populated by the code here:
-#+ https://github.com/jenkinsci/ghprb-plugin/blob/master/src/main/java/org/jenkinsci/plugins/ghprb/GhprbTrigger.java#L139
+# https://github.com/jenkinsci/ghprb-plugin/blob/master/src/main/java/org/jenkinsci/plugins/ghprb/GhprbTrigger.java#L139
 
 FWDIR="$( cd "$( dirname "$0" )/.." && pwd )"
 cd "$FWDIR"

http://git-wip-us.apache.org/repos/asf/spark/blob/9623a982/dev/sparktestsupport/modules.py
----------------------------------------------------------------------
diff --git a/dev/sparktestsupport/modules.py b/dev/sparktestsupport/modules.py
index b900f0b..dfea762 100644
--- a/dev/sparktestsupport/modules.py
+++ b/dev/sparktestsupport/modules.py
@@ -25,10 +25,10 @@ all_modules = []
 @total_ordering
 class Module(object):
     """
-    A module is the basic abstraction in our test runner script. Each module consists of a set of
-    source files, a set of test commands, and a set of dependencies on other modules. We use modules
-    to define a dependency graph that lets determine which tests to run based on which files have
-    changed.
+    A module is the basic abstraction in our test runner script. Each module consists of a set
+    of source files, a set of test commands, and a set of dependencies on other modules. We use
+    modules to define a dependency graph that let us determine which tests to run based on which
+    files have changed.
     """
 
     def __init__(self, name, dependencies, source_file_regexes, build_profile_flags=(), environ={},

http://git-wip-us.apache.org/repos/asf/spark/blob/9623a982/dev/sparktestsupport/toposort.py
----------------------------------------------------------------------
diff --git a/dev/sparktestsupport/toposort.py b/dev/sparktestsupport/toposort.py
index 6c67b45..8b2688d 100644
--- a/dev/sparktestsupport/toposort.py
+++ b/dev/sparktestsupport/toposort.py
@@ -43,8 +43,8 @@ def toposort(data):
     """Dependencies are expressed as a dictionary whose keys are items
 and whose values are a set of dependent items. Output is a list of
 sets in topological order. The first set consists of items with no
-dependences, each subsequent set consists of items that depend upon
-items in the preceeding sets.
+dependencies, each subsequent set consists of items that depend upon
+items in the preceding sets.
 """
 
     # Special case empty input.
@@ -59,7 +59,7 @@ items in the preceeding sets.
         v.discard(k)
     # Find all items that don't depend on anything.
     extra_items_in_deps = _reduce(set.union, data.values()) - set(data.keys())
-    # Add empty dependences where needed.
+    # Add empty dependencies where needed.
     data.update({item: set() for item in extra_items_in_deps})
     while True:
         ordered = set(item for item, dep in data.items() if len(dep) == 0)

http://git-wip-us.apache.org/repos/asf/spark/blob/9623a982/dev/tests/pr_merge_ability.sh
----------------------------------------------------------------------
diff --git a/dev/tests/pr_merge_ability.sh b/dev/tests/pr_merge_ability.sh
index d9a347f..25fdbcc 100755
--- a/dev/tests/pr_merge_ability.sh
+++ b/dev/tests/pr_merge_ability.sh
@@ -23,9 +23,9 @@
 # found at dev/run-tests-jenkins.
 #
 # Arg1: The Github Pull Request Actual Commit
-#+ known as `ghprbActualCommit` in `run-tests-jenkins`
+# known as `ghprbActualCommit` in `run-tests-jenkins`
 # Arg2: The SHA1 hash
-#+ known as `sha1` in `run-tests-jenkins`
+# known as `sha1` in `run-tests-jenkins`
 #
 
 ghprbActualCommit="$1"

http://git-wip-us.apache.org/repos/asf/spark/blob/9623a982/dev/tests/pr_public_classes.sh
----------------------------------------------------------------------
diff --git a/dev/tests/pr_public_classes.sh b/dev/tests/pr_public_classes.sh
index 41c5d3e..479d185 100755
--- a/dev/tests/pr_public_classes.sh
+++ b/dev/tests/pr_public_classes.sh
@@ -23,7 +23,7 @@
 # found at dev/run-tests-jenkins.
 #
 # Arg1: The Github Pull Request Actual Commit
-#+ known as `ghprbActualCommit` in `run-tests-jenkins`
+# known as `ghprbActualCommit` in `run-tests-jenkins`
 
 ghprbActualCommit="$1"
 
@@ -31,7 +31,7 @@ ghprbActualCommit="$1"
 # master commit and the tip of the pull request branch.
 
 # By diffing$ghprbActualCommit^...$ghprbActualCommit and filtering to examine the diffs of only
-# non-test files, we can gets us changes introduced in the PR and not anything else added to master
+# non-test files, we can get changes introduced in the PR and not anything else added to master
 # since the PR was branched.
 
 # Handle differences between GNU and BSD sed


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org