Posted to commits@spark.apache.org by we...@apache.org on 2021/07/28 06:39:35 UTC
[spark] branch branch-3.2 updated: [SPARK-36312][SQL][FOLLOWUP] Add back ParquetSchemaConverter.checkFieldNames
This is an automated email from the ASF dual-hosted git repository.
wenchen pushed a commit to branch branch-3.2
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.2 by this push:
new b58170b [SPARK-36312][SQL][FOLLOWUP] Add back ParquetSchemaConverter.checkFieldNames
b58170b is described below
commit b58170b192c51390944c988896c5dcf89d6dfbac
Author: Angerszhuuuu <an...@gmail.com>
AuthorDate: Wed Jul 28 14:38:23 2021 +0800
[SPARK-36312][SQL][FOLLOWUP] Add back ParquetSchemaConverter.checkFieldNames
### What changes were proposed in this pull request?
Add back the ParquetSchemaConverter.checkFieldNames() method.
### Why are the changes needed?
Restore the ParquetSchemaConverter.checkFieldNames() method that was removed by the original SPARK-36312 change, so that Parquet field-name validation keeps working.
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Closes #33552 from AngersZhuuuu/SPARK-36312-FOLLOWUP.
Authored-by: Angerszhuuuu <an...@gmail.com>
Signed-off-by: Wenchen Fan <we...@databricks.com>
(cherry picked from commit f086c17b8e1b74a3493b7381b36323afe0be3df5)
Signed-off-by: Wenchen Fan <we...@databricks.com>
---
.../execution/datasources/parquet/ParquetSchemaConverter.scala | 10 ++++++++++
1 file changed, 10 insertions(+)
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetSchemaConverter.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetSchemaConverter.scala
index 217c020..f3bfd99 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetSchemaConverter.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetSchemaConverter.scala
@@ -591,6 +591,16 @@ private[sql] object ParquetSchemaConverter {
}
}
+ def checkFieldNames(schema: StructType): Unit = {
+ schema.foreach { field =>
+ checkFieldName(field.name)
+ field.dataType match {
+ case s: StructType => checkFieldNames(s)
+ case _ =>
+ }
+ }
+ }
+
def checkConversionRequirement(f: => Boolean, message: String): Unit = {
if (!f) {
throw new AnalysisException(message)
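The restored method walks a schema and validates every field name, recursing into nested structs. As a minimal self-contained sketch (the tiny schema model and the set of rejected characters below are assumptions for illustration, not the actual Spark classes or the exact rule in `checkFieldName`), the recursion looks like this:

```scala
import scala.util.Try

// Hypothetical stand-ins for Spark's StructType/StructField, just enough
// to demonstrate the recursive traversal pattern from the patch.
sealed trait DataType
case object IntegerType extends DataType
final case class StructField(name: String, dataType: DataType)
final case class StructType(fields: Seq[StructField]) extends DataType

object FieldNameChecker {
  // Assumed set of characters to reject; the real checkFieldName in
  // ParquetSchemaConverter may use a different rule.
  private val invalidChars: Set[Char] = " ,;{}()\n\t=".toSet

  def checkFieldName(name: String): Unit =
    require(!name.exists(invalidChars.contains),
      s"""Attribute name "$name" contains invalid character(s)""")

  // Check every top-level field, then descend into any nested struct,
  // mirroring the checkFieldNames added in this commit.
  def checkFieldNames(schema: StructType): Unit =
    schema.fields.foreach { field =>
      checkFieldName(field.name)
      field.dataType match {
        case s: StructType => checkFieldNames(s)
        case _ =>
      }
    }
}
```

A nested schema such as `StructType(Seq(StructField("outer", StructType(Seq(StructField("bad name", IntegerType))))))` fails the check even though the offending name is one level down, which is the point of recursing rather than only scanning top-level fields.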