Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/02/16 04:31:15 UTC

[GitHub] [spark] HyukjinKwon commented on a change in pull request #31564: [SPARK-34437][SQL][DOCS] Update Spark SQL guide about the rebasing DS options and SQL configs

HyukjinKwon commented on a change in pull request #31564:
URL: https://github.com/apache/spark/pull/31564#discussion_r576542456



##########
File path: docs/sql-data-sources-parquet.md
##########
@@ -329,4 +365,54 @@ Configuration of Parquet can be done using the `setConf` method on `SparkSession
   </td>
   <td>1.6.0</td>
 </tr>
+<tr>
+  <td>spark.sql.legacy.parquet.datetimeRebaseModeInRead</td>
+  <td><code>EXCEPTION</code></td>
+  <td>The rebasing mode for the values of the <code>DATE</code>, <code>TIMESTAMP_MILLIS</code>, <code>TIMESTAMP_MICROS</code> logical types from the Julian to Proleptic Gregorian calendar:<br>
+    <ul>
+      <li><code>EXCEPTION</code>: Spark will fail the reading if it sees ancient dates/timestamps that are ambiguous between the two calendars.</li>
+      <li><code>CORRECTED</code>: Spark will not rebase and will read the dates/timestamps as they are.</li>
+      <li><code>LEGACY</code>: Spark will rebase dates/timestamps from the legacy hybrid (Julian + Gregorian) calendar to Proleptic Gregorian calendar when reading Parquet files.</li>
+    </ul>
+    This config is only effective if the writer info (like Spark, Hive) of the Parquet files is unknown.
+  </td>
+  <td>3.0.0</td>
+</tr>
+<tr>
+  <td>spark.sql.legacy.parquet.datetimeRebaseModeInWrite</td>

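As a usage illustration of the read-side config in the quoted rows above (not part of the PR diff), here is a minimal sketch assuming a local SparkSession and a placeholder Parquet path; only the config name and its values come from the documented table:

```scala
import org.apache.spark.sql.SparkSession

object RebaseReadSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical local session; only the config name below comes from the docs above.
    val spark = SparkSession.builder()
      .appName("rebase-read-sketch")
      .master("local[*]")
      .getOrCreate()

    // Default is EXCEPTION: reads of ambiguous ancient dates/timestamps fail.
    // CORRECTED reads them as-is; LEGACY rebases from the hybrid Julian calendar.
    spark.conf.set("spark.sql.legacy.parquet.datetimeRebaseModeInRead", "CORRECTED")

    // "/tmp/ancient_dates.parquet" is a placeholder path.
    val df = spark.read.parquet("/tmp/ancient_dates.parquet")
    df.show()

    spark.stop()
  }
}
```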
Review comment:
       We'll probably have to make these configurations properly exposed by removing `.internal()`.
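
As a rough illustration of what exposing the configs would mean for users (an assumption, not something stated in the comment): non-internal configurations show up in user-facing listings such as `SET -v`, so once `.internal()` is dropped the rebase configs should appear there. A minimal check:

```scala
import org.apache.spark.sql.SparkSession

object ListRebaseConfsSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").getOrCreate()

    // `SET -v` lists defined, non-internal configurations with their docs.
    // While the rebase configs are still marked .internal(), this filter
    // returns no rows; after exposing them it should show both configs.
    spark.sql("SET -v")
      .where("key LIKE 'spark.sql.legacy.parquet.datetimeRebaseMode%'")
      .show(truncate = false)

    spark.stop()
  }
}
```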






