Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2017/03/01 22:19:45 UTC

[jira] [Commented] (SPARK-15848) Spark unable to read partitioned table in avro format and column name in upper case

    [ https://issues.apache.org/jira/browse/SPARK-15848?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15891180#comment-15891180 ] 

Dongjoon Hyun commented on SPARK-15848:
---------------------------------------

Hi, [~pratik.shah2462].
It doesn't happen in Spark 2.1.
For Spark 1.6.3, the following workaround, rewriting the Avro schema with lowercase field names plus uppercase aliases, works for me.
{code}
spark-sql> ALTER TABLE avro_table_uppercase SET TBLPROPERTIES ('avro.schema.literal'='{"namespace": "com.rishav.avro", "name": "student_marks", "type": "record", "fields": [ { "name":"student_id","aliases":["STUDENT_ID"],"type":"int"}, { "name":"subject_id","aliases":["SUBJECT_ID"],"type":"int"}, { "name":"marks","type":"int"}]}');
spark-sql> select * from avro_table_uppercase;
5	300	100	2000
7	650	20	2000
8	780	160	2000
1	340	963	2000
9	780	142	2000
2	110	430	2000
0	38	91	2002
0	65	28	2002
0	78	16	2002
1	34	96	2002
1	78	14	2002
1	11	43	2002
Time taken: 0.241 seconds, Fetched 12 row(s)
{code}
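The original DDL isn't attached to this issue, so here is a sketch of a table definition that should reproduce the problem. The SerDe and input/output format classes, the {{year}} partition column, and the exact field casing are assumptions on my part; the field names follow the aliases above.
{code}
-- Hypothetical DDL (not from the report): the uppercase field names in
-- 'avro.schema.literal' are what make Spark 1.6.x return nulls.
CREATE EXTERNAL TABLE avro_table_uppercase
PARTITIONED BY (year INT)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS
  INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
TBLPROPERTIES ('avro.schema.literal'='{"namespace": "com.rishav.avro", "name": "student_marks", "type": "record", "fields": [ {"name":"STUDENT_ID","type":"int"}, {"name":"SUBJECT_ID","type":"int"}, {"name":"marks","type":"int"}]}');
{code}
With a schema like this, {{select * from avro_table_uppercase}} returns null for the uppercase columns in Spark 1.6.x, because Spark looks the columns up by their lowercased names. The ALTER TABLE above appears to fix that by giving each field a lowercase name while the alias still matches the uppercase name written in the data files.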

> Spark unable to read partitioned table in avro format and column name in upper case
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-15848
>                 URL: https://issues.apache.org/jira/browse/SPARK-15848
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1
>            Reporter: Zhan Zhang
>
> For external partitioned Hive tables created in Avro format, Spark returns "null" values when column names are uppercase in the Avro schema.
> The same tables return proper data when queried from the Hive client.


