Posted to issues@spark.apache.org by "Carlos del Prado Mota (Jira)" <ji...@apache.org> on 2019/11/14 13:58:00 UTC

[jira] [Updated] (SPARK-29898) Support Avro Custom Logical Types

     [ https://issues.apache.org/jira/browse/SPARK-29898?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Carlos del Prado Mota updated SPARK-29898:
------------------------------------------
    Description: 
Extend the options of the Spark Avro data source to allow the use of custom Avro logical types.

At the moment only the timestamp and decimal logical types are supported by Spark, but Avro supports any conversion you might need. This change keeps the default mappings and allows new ones to be added.

spark
  .read
  .format("avro")
  .option("logicalTypeMapper", "org.example.CustomAvroLogicalCatalystMapper")
  .load()

All you need to do is register your custom Avro logical type and then implement `AvroLogicalTypeCatalystMapper`.
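
A minimal sketch of what such a mapper could look like is below. The registration call uses the existing Avro `LogicalTypes.register` API; the `AvroLogicalTypeCatalystMapper` trait, its `toCatalystType` method, and the `duration-string` logical type are assumptions made for illustration of this proposal, not part of the current Spark Avro API.

import org.apache.avro.{LogicalType, LogicalTypes, Schema}
import org.apache.spark.sql.types.{DataType, StringType}

object DurationLogicalType {
  val Name = "duration-string"

  // Register the custom logical type with Avro itself (existing Avro 1.8+ API).
  def register(): Unit =
    LogicalTypes.register(Name, new LogicalTypes.LogicalTypeFactory {
      override def fromSchema(schema: Schema): LogicalType = new LogicalType(Name)
    })
}

// Hypothetical mapper: the trait name comes from this proposal; its exact
// signature is an assumption for illustration only.
class CustomAvroLogicalCatalystMapper extends AvroLogicalTypeCatalystMapper {
  override def toCatalystType(logicalType: LogicalType, schema: Schema): Option[DataType] =
    logicalType.getName match {
      case DurationLogicalType.Name => Some(StringType) // read the custom type as a Catalyst string
      case _                        => None             // defer to Spark's default mappings
    }
}

Registration would happen before the read, and the mapper class name would then be passed through the `logicalTypeMapper` option shown above.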

> Support Avro Custom Logical Types
> ---------------------------------
>
>                 Key: SPARK-29898
>                 URL: https://issues.apache.org/jira/browse/SPARK-29898
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.4
>            Reporter: Carlos del Prado Mota
>            Priority: Major
>
> Extend the options of the Spark Avro data source to allow the use of custom Avro logical types.
> At the moment only the timestamp and decimal logical types are supported by Spark, but Avro supports any conversion you might need. This change keeps the default mappings and allows new ones to be added.
> spark
>   .read
>   .format("avro")
>   .option("logicalTypeMapper", "org.example.CustomAvroLogicalCatalystMapper")
>   .load()
> All you need to do is register your custom Avro logical type and then implement `AvroLogicalTypeCatalystMapper`.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org