Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/08/28 08:06:00 UTC

[jira] [Assigned] (SPARK-25260) Fix namespace handling in SchemaConverters.toAvroType

     [ https://issues.apache.org/jira/browse/SPARK-25260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-25260:
------------------------------------

    Assignee: Apache Spark

> Fix namespace handling in SchemaConverters.toAvroType
> -----------------------------------------------------
>
>                 Key: SPARK-25260
>                 URL: https://issues.apache.org/jira/browse/SPARK-25260
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Arun Mahadevan
>            Assignee: Apache Spark
>            Priority: Major
>
> `toAvroType` converts a Spark data type to an Avro schema. It always appends the record name to the namespace, so it's impossible to have an Avro namespace independent of the record name.
>  
> When invoked with a Spark data type like:
>
> {code:scala}
> val sparkSchema = StructType(Seq(
>     StructField("name", StringType, nullable = false),
>     StructField("address", StructType(Seq(
>         StructField("city", StringType, nullable = false),
>         StructField("state", StringType, nullable = false))),
>     nullable = false)))
>
> // map it to an Avro schema with top-level namespace "foo.bar"
> val avroSchema = SchemaConverters.toAvroType(sparkSchema, false, "employee", "foo.bar")
> // result is
> // avroSchema.getName = employee
> // avroSchema.getNamespace = foo.bar.employee
> // avroSchema.getFullName = foo.bar.employee.employee
> {code}
>  
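
The defect can be illustrated without Spark or Avro: if the converter unconditionally appends the record name to the namespace at every level, the record's full name ends up repeating the record name. A minimal toy sketch of that buggy behavior (`to_avro_fullname` is a hypothetical illustration, not Spark's actual implementation):

```python
def to_avro_fullname(record_name: str, namespace: str) -> str:
    # Buggy pattern reported in SPARK-25260: the record name is always
    # appended to the namespace before the full name is built, ...
    child_namespace = f"{namespace}.{record_name}" if namespace else record_name
    # ... so the full name (namespace + "." + name) repeats the record name.
    return f"{child_namespace}.{record_name}"

# Mirrors the JIRA example: namespace "foo.bar" and record "employee"
# yield "foo.bar.employee.employee" instead of the expected "foo.bar.employee".
print(to_avro_fullname("employee", "foo.bar"))
```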



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org