Posted to issues@spark.apache.org by "Erik Krogen (Jira)" <ji...@apache.org> on 2021/02/04 18:59:00 UTC

[jira] [Created] (SPARK-34365) Support configurable Avro schema field matching for positional or by-name

Erik Krogen created SPARK-34365:
-----------------------------------

             Summary: Support configurable Avro schema field matching for positional or by-name
                 Key: SPARK-34365
                 URL: https://issues.apache.org/jira/browse/SPARK-34365
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.0.1
            Reporter: Erik Krogen


When reading an Avro dataset (using the dataset's own schema or overriding it with 'avroSchema') or writing an Avro dataset with a schema provided via 'avroSchema', the matching of Catalyst-to-Avro fields is currently done by field name.

This behavior is somewhat recent; prior to SPARK-27762 (fixed in 3.0.0), at least on the write path, we would match the schemas positionally (a "structural" comparison). While I agree that by-name matching is much more sensible as the default behavior, I propose that we make this behavior configurable via an {{option}} on the Avro datasource.

There is precedent for making this behavior configurable, as seen in SPARK-32864, which added such support for ORC. Beyond that precedent, Hive's behavior is to match positionally ([ref|https://cwiki.apache.org/confluence/display/Hive/AvroSerDe#AvroSerDe-WritingtablestoAvrofiles]), so this is behavior that Hadoop/Hive ecosystem users are familiar with:
{quote}
Hive is very forgiving about types: it will attempt to store whatever value matches the provided column in the equivalent column position in the new table. No matching is done on column names, for instance.
{quote}
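To make the semantic difference concrete, here is a minimal, self-contained Scala sketch of the two matching strategies (illustrative only, not Spark internals; the {{Field}}, {{matchByName}}, and {{matchByPosition}} names are hypothetical):

```scala
// A simplified stand-in for a schema field paired with its value.
case class Field(name: String, value: Any)

object FieldMatching {
  // By-name matching (current Spark behavior since SPARK-27762):
  // resolve each target field by looking up its name in the source.
  def matchByName(source: Seq[Field], targetNames: Seq[String]): Seq[Option[Any]] =
    targetNames.map(n => source.find(_.name == n).map(_.value))

  // Positional ("structural") matching (pre-3.0.0 write path, Hive's behavior):
  // pair fields by index, ignoring names entirely.
  def matchByPosition(source: Seq[Field], targetNames: Seq[String]): Seq[Option[Any]] =
    targetNames.indices.map(i => source.lift(i).map(_.value))
}
```

With a source record {{(id=1, name="a")}} and a target schema ordered {{(name, id)}}, by-name matching reorders the values while positional matching pairs them index-by-index, which is exactly the divergence the proposed option would let users choose between.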



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org