Posted to commits@spark.apache.org by gu...@apache.org on 2021/02/16 02:56:40 UTC
[spark] branch branch-3.1 updated:
[SPARK-33210][SQL][DOCS][FOLLOWUP] Fix descriptions of the SQL configs for
the parquet INT96 rebase modes
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.1
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.1 by this push:
new 4136edb [SPARK-33210][SQL][DOCS][FOLLOWUP] Fix descriptions of the SQL configs for the parquet INT96 rebase modes
4136edb is described below
commit 4136edb44bfd5d71f7b10b3c68d9a8f999bebf3d
Author: Max Gekk <ma...@gmail.com>
AuthorDate: Tue Feb 16 11:55:53 2021 +0900
[SPARK-33210][SQL][DOCS][FOLLOWUP] Fix descriptions of the SQL configs for the parquet INT96 rebase modes
### What changes were proposed in this pull request?
Fix descriptions of the SQL configs `spark.sql.legacy.parquet.int96RebaseModeInRead` and `spark.sql.legacy.parquet.int96RebaseModeInWrite`, and mention `EXCEPTION` as the default value.
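A minimal sketch of how these modes might be toggled in a session (assuming a standard `SparkSession` named `spark`; the config keys are taken from this patch, the surrounding setup is illustrative):

```scala
// EXCEPTION (the default): fail on ancient timestamps that are
// ambiguous between the two calendars.
spark.conf.set("spark.sql.legacy.parquet.int96RebaseModeInWrite", "EXCEPTION")

// LEGACY: rebase from the Proleptic Gregorian calendar to the hybrid
// Julian + Gregorian calendar when writing Parquet files.
spark.conf.set("spark.sql.legacy.parquet.int96RebaseModeInWrite", "LEGACY")

// CORRECTED: read the timestamps as-is, with no rebase (the read-side
// config only takes effect when the Parquet writer info is unknown).
spark.conf.set("spark.sql.legacy.parquet.int96RebaseModeInRead", "CORRECTED")
```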
### Why are the changes needed?
The current descriptions incorrectly state that `LEGACY` is the default mode; the actual default is `EXCEPTION`, so the text can mislead users.
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
By running `./dev/scalastyle`.
Closes #31557 from MaxGekk/int96-exception-by-default-followup.
Authored-by: Max Gekk <ma...@gmail.com>
Signed-off-by: HyukjinKwon <gu...@apache.org>
(cherry picked from commit 1a11fe55017a79016dd138dd2afb4edd0a6cef2f)
Signed-off-by: HyukjinKwon <gu...@apache.org>
---
.../org/apache/spark/sql/internal/SQLConf.scala | 22 +++++++++++-----------
1 file changed, 11 insertions(+), 11 deletions(-)
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
index ecd8f3a..fcdf910 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
@@ -2811,11 +2811,11 @@ object SQLConf {
val LEGACY_PARQUET_INT96_REBASE_MODE_IN_WRITE =
buildConf("spark.sql.legacy.parquet.int96RebaseModeInWrite")
.internal()
- .doc("When LEGACY, which is the default, Spark will rebase INT96 timestamps from " +
- "Proleptic Gregorian calendar to the legacy hybrid (Julian + Gregorian) calendar when " +
- "writing Parquet files. When CORRECTED, Spark will not do rebase and write the timestamps" +
- " as it is. When EXCEPTION, Spark will fail the writing if it sees ancient timestamps " +
- "that are ambiguous between the two calendars.")
+ .doc("When LEGACY, Spark will rebase INT96 timestamps from Proleptic Gregorian calendar to " +
+ "the legacy hybrid (Julian + Gregorian) calendar when writing Parquet files. " +
+ "When CORRECTED, Spark will not do rebase and write the timestamps as it is. " +
+ "When EXCEPTION, which is the default, Spark will fail the writing if it sees ancient " +
+ "timestamps that are ambiguous between the two calendars.")
.version("3.1.0")
.stringConf
.transform(_.toUpperCase(Locale.ROOT))
@@ -2842,12 +2842,12 @@ object SQLConf {
val LEGACY_PARQUET_INT96_REBASE_MODE_IN_READ =
buildConf("spark.sql.legacy.parquet.int96RebaseModeInRead")
.internal()
- .doc("When LEGACY, which is the default, Spark will rebase INT96 timestamps from " +
- "the legacy hybrid (Julian + Gregorian) calendar to Proleptic Gregorian calendar when " +
- "reading Parquet files. When CORRECTED, Spark will not do rebase and read the timestamps " +
- "as it is. When EXCEPTION, Spark will fail the reading if it sees ancient timestamps " +
- "that are ambiguous between the two calendars. This config is only effective if the " +
- "writer info (like Spark, Hive) of the Parquet files is unknown.")
+ .doc("When LEGACY, Spark will rebase INT96 timestamps from the legacy hybrid (Julian + " +
+ "Gregorian) calendar to Proleptic Gregorian calendar when reading Parquet files. " +
+ "When CORRECTED, Spark will not do rebase and read the timestamps as it is. " +
+ "When EXCEPTION, which is the default, Spark will fail the reading if it sees ancient " +
+ "timestamps that are ambiguous between the two calendars. This config is only effective " +
+ "if the writer info (like Spark, Hive) of the Parquet files is unknown.")
.version("3.1.0")
.stringConf
.transform(_.toUpperCase(Locale.ROOT))
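The ambiguity these modes guard against can be seen with plain JVM calendar APIs: the hybrid Julian + Gregorian calendar (`java.util.GregorianCalendar`) and the Proleptic Gregorian calendar (`java.time`) map the same ancient wall-clock date to different instants. A self-contained sketch (no Spark required; the 5-day offset for year 1000 is a property of the two calendars, not of this patch):

```scala
import java.time.{LocalDate, ZoneOffset}
import java.util.{GregorianCalendar, TimeZone}

// Hybrid Julian + Gregorian calendar: dates before 1582-10-15 are
// interpreted under Julian calendar rules.
val hybrid = new GregorianCalendar(TimeZone.getTimeZone("UTC"))
hybrid.clear()
hybrid.set(1000, 0, 1) // 1000-01-01 (month is 0-based)
val hybridMillis = hybrid.getTimeInMillis

// Proleptic Gregorian calendar: Gregorian rules extended backwards.
val prolepticMillis =
  LocalDate.of(1000, 1, 1).atStartOfDay(ZoneOffset.UTC).toInstant.toEpochMilli

// The same wall-clock date denotes instants 5 days apart; without a
// declared rebase mode, an INT96 value for such a date is ambiguous.
val diffDays = (hybridMillis - prolepticMillis) / 86400000L
println(diffDays) // 5
```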