Posted to commits@spark.apache.org by sr...@apache.org on 2016/11/16 10:35:01 UTC
spark git commit: [MINOR][DOC] Fix typos in the 'configuration', 'monitoring' and 'sql-programming-guide' documentation
Repository: spark
Updated Branches:
refs/heads/master e6145772e -> 241e04bc0
[MINOR][DOC] Fix typos in the 'configuration', 'monitoring' and 'sql-programming-guide' documentation
## What changes were proposed in this pull request?
Fix typos in the 'configuration', 'monitoring' and 'sql-programming-guide' documentation.
## How was this patch tested?
Manually.
Author: Weiqing Yang <ya...@gmail.com>
Closes #15886 from weiqingy/fixTypo.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/241e04bc
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/241e04bc
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/241e04bc
Branch: refs/heads/master
Commit: 241e04bc03efb1379622c0c84299e617512973ac
Parents: e614577
Author: Weiqing Yang <ya...@gmail.com>
Authored: Wed Nov 16 10:34:56 2016 +0000
Committer: Sean Owen <so...@cloudera.com>
Committed: Wed Nov 16 10:34:56 2016 +0000
----------------------------------------------------------------------
docs/configuration.md | 2 +-
docs/monitoring.md | 2 +-
docs/sql-programming-guide.md | 6 +++---
3 files changed, 5 insertions(+), 5 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/spark/blob/241e04bc/docs/configuration.md
----------------------------------------------------------------------
diff --git a/docs/configuration.md b/docs/configuration.md
index ea99592..c021a37 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -1951,7 +1951,7 @@ showDF(properties, numRows = 200, truncate = FALSE)
<td><code>spark.r.heartBeatInterval</code></td>
<td>100</td>
<td>
- Interval for heartbeats sents from SparkR backend to R process to prevent connection timeout.
+ Interval for heartbeats sent from SparkR backend to R process to prevent connection timeout.
</td>
</tr>
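For context, the property whose description is corrected above is set like any other Spark configuration entry. A minimal sketch of a `spark-defaults.conf` line (the value 100 is the default shown in the table above; any other value here would be illustrative):

```
# interval for heartbeats sent from the SparkR backend to the R process,
# to prevent the connection from timing out
spark.r.heartBeatInterval  100
```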
http://git-wip-us.apache.org/repos/asf/spark/blob/241e04bc/docs/monitoring.md
----------------------------------------------------------------------
diff --git a/docs/monitoring.md b/docs/monitoring.md
index 5bc5e18..2eef456 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -41,7 +41,7 @@ directory must be supplied in the `spark.history.fs.logDirectory` configuration
and should contain sub-directories that each represents an application's event logs.
The spark jobs themselves must be configured to log events, and to log them to the same shared,
-writeable directory. For example, if the server was configured with a log directory of
+writable directory. For example, if the server was configured with a log directory of
`hdfs://namenode/shared/spark-logs`, then the client-side options would be:
```
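The fenced block above is cut off by the diff context; as a hedged sketch, the client-side options it refers to are the standard event-log properties, pointing at the same shared, writable directory the history server reads (the HDFS path is the one from the doc's own example):

```
spark.eventLog.enabled  true
spark.eventLog.dir      hdfs://namenode/shared/spark-logs
```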
http://git-wip-us.apache.org/repos/asf/spark/blob/241e04bc/docs/sql-programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/sql-programming-guide.md b/docs/sql-programming-guide.md
index b9be7a7..ba3e55f 100644
--- a/docs/sql-programming-guide.md
+++ b/docs/sql-programming-guide.md
@@ -222,9 +222,9 @@ The `sql` function enables applications to run SQL queries programmatically and
## Global Temporary View
-Temporay views in Spark SQL are session-scoped and will disappear if the session that creates it
+Temporary views in Spark SQL are session-scoped and will disappear if the session that creates it
terminates. If you want to have a temporary view that is shared among all sessions and keep alive
-until the Spark application terminiates, you can create a global temporary view. Global temporary
+until the Spark application terminates, you can create a global temporary view. Global temporary
view is tied to a system preserved database `global_temp`, and we must use the qualified name to
refer it, e.g. `SELECT * FROM global_temp.view1`.
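To illustrate the corrected passage, a short Spark SQL sketch of a global temporary view (the view name `view1` and the qualified `SELECT` match the doc's example; the `CREATE` statement is illustrative):

```sql
-- global temporary views are tied to the system-preserved database `global_temp`
CREATE GLOBAL TEMPORARY VIEW view1 AS SELECT 1 AS id;

-- they must be referenced by their qualified name, even from another session
SELECT * FROM global_temp.view1;
```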
@@ -1029,7 +1029,7 @@ following command:
bin/spark-shell --driver-class-path postgresql-9.4.1207.jar --jars postgresql-9.4.1207.jar
{% endhighlight %}
-Tables from the remote database can be loaded as a DataFrame or Spark SQL Temporary table using
+Tables from the remote database can be loaded as a DataFrame or Spark SQL temporary view using
the Data Sources API. Users can specify the JDBC connection properties in the data source options.
<code>user</code> and <code>password</code> are normally provided as connection properties for
logging into the data sources. In addition to the connection properties, Spark also supports
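For context, the corrected sentence describes loading a remote JDBC table through the Data Sources API. A hedged sketch of registering such a table as a Spark SQL temporary view, with `user` and `password` passed as connection properties as described above (all connection details are placeholders):

```sql
CREATE TEMPORARY VIEW jdbcTable
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:postgresql:dbserver",
  dbtable "schema.tablename",
  user 'username',
  password 'password'
)
```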