Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/02/27 15:03:01 UTC

[GitHub] [spark] srowen commented on a change in pull request #31598: [SPARK-34478][SQL] When build SparkSession, we should check config keys

srowen commented on a change in pull request #31598:
URL: https://github.com/apache/spark/pull/31598#discussion_r584136599



##########
File path: docs/configuration.md
##########
@@ -114,12 +114,22 @@ in the `spark-defaults.conf` file. A few configuration keys have been renamed si
 versions of Spark; in such cases, the older key names are still accepted, but take lower
 precedence than any instance of the newer key.
 
-Spark properties mainly can be divided into two kinds: one is related to deploy, like
-"spark.driver.memory", "spark.executor.instances", this kind of properties may not be affected when
-setting programmatically through `SparkConf` in runtime, or the behavior is depending on which
-cluster manager and deploy mode you choose, so it would be suggested to set through configuration
-file or `spark-submit` command line options; another is mainly related to Spark runtime control,
-like "spark.task.maxFailures", this kind of properties can be set in either way.
+Spark properties mainly can be divided into three kinds: 
+
+    1. configuration used to submit application, such as "spark.driver.memory", "spark.driver.extraclassPath",

Review comment:
       This text needs a fair bit of cleanup - capitalization, punctuation, syntax. Break lists of configs into a list and code quote, and so on
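
For readers skimming the thread, here is a minimal sketch of the distinction the quoted paragraph is describing (the master, app name, and values below are placeholders, not taken from the PR): deploy-time properties such as "spark.driver.memory" are read when the driver is launched, so they are best set in spark-defaults.conf or on the spark-submit command line, while runtime-control properties such as "spark.task.maxFailures" can also be set programmatically.

    import org.apache.spark.sql.SparkSession

    // Runtime-control property: takes effect when set programmatically.
    // Deploy-time properties (e.g. spark.driver.memory) should instead be set via
    // spark-defaults.conf or `spark-submit --driver-memory 4g`, because in client
    // mode the driver JVM is already running by the time this code executes.
    val spark = SparkSession.builder()
      .master("local[*]")                      // placeholder master
      .appName("config-kind-demo")             // placeholder app name
      .config("spark.task.maxFailures", "4")   // runtime-control property
      .getOrCreate()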




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


