Posted to commits@spark.apache.org by ma...@apache.org on 2021/11/03 05:44:28 UTC
[spark] branch master updated: [SPARK-24774][SQL][FOLLOWUP] Remove unused code in SchemaConverters.scala
This is an automated email from the ASF dual-hosted git repository.
maxgekk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 59c55dd [SPARK-24774][SQL][FOLLOWUP] Remove unused code in SchemaConverters.scala
59c55dd is described below
commit 59c55dd4c6f7772ef7949653679a2b76211788e8
Author: Gengliang Wang <ge...@apache.org>
AuthorDate: Wed Nov 3 08:43:25 2021 +0300
[SPARK-24774][SQL][FOLLOWUP] Remove unused code in SchemaConverters.scala
### What changes were proposed in this pull request?
As MaxGekk pointed out in https://github.com/apache/spark/pull/22037/files#r741373793, there is some unused code in SchemaConverters.scala. The UUID generator was for generating `fix` avro field names, but we figured out a better solution during the PR review.
This PR is to remove the dead code.
### Why are the changes needed?
Code clean up
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Existing unit tests.
Closes #34472 from gengliangwang/SPARK-24774-followup.
Authored-by: Gengliang Wang <ge...@apache.org>
Signed-off-by: Max Gekk <ma...@gmail.com>
---
.../src/main/scala/org/apache/spark/sql/avro/SchemaConverters.scala | 4 ----
1 file changed, 4 deletions(-)
diff --git a/external/avro/src/main/scala/org/apache/spark/sql/avro/SchemaConverters.scala b/external/avro/src/main/scala/org/apache/spark/sql/avro/SchemaConverters.scala
index 1c9b06b..347364c 100644
--- a/external/avro/src/main/scala/org/apache/spark/sql/avro/SchemaConverters.scala
+++ b/external/avro/src/main/scala/org/apache/spark/sql/avro/SchemaConverters.scala
@@ -18,14 +18,12 @@
package org.apache.spark.sql.avro
import scala.collection.JavaConverters._
-import scala.util.Random
import org.apache.avro.{LogicalTypes, Schema, SchemaBuilder}
import org.apache.avro.LogicalTypes.{Date, Decimal, LocalTimestampMicros, LocalTimestampMillis, TimestampMicros, TimestampMillis}
import org.apache.avro.Schema.Type._
import org.apache.spark.annotation.DeveloperApi
-import org.apache.spark.sql.catalyst.util.RandomUUIDGenerator
import org.apache.spark.sql.types._
import org.apache.spark.sql.types.Decimal.minBytesForPrecision
@@ -35,8 +33,6 @@ import org.apache.spark.sql.types.Decimal.minBytesForPrecision
*/
@DeveloperApi
object SchemaConverters {
- private lazy val uuidGenerator = RandomUUIDGenerator(new Random().nextLong())
-
private lazy val nullSchema = Schema.create(Schema.Type.NULL)
/**
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org