Posted to issues@spark.apache.org by "tuming (JIRA)" <ji...@apache.org> on 2016/08/23 03:32:21 UTC

[jira] [Created] (SPARK-17198) ORC fixed char literal filter does not work

tuming created SPARK-17198:
------------------------------

             Summary: ORC fixed char literal filter does not work
                 Key: SPARK-17198
                 URL: https://issues.apache.org/jira/browse/SPARK-17198
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.5.1
            Reporter: tuming


I get a wrong result when I run the following query in Spark SQL:
select * from orc_table where char_col = '5LZS';

Table orc_table is an ORC-format table.
Column char_col is defined as char(6).

The Hive record reader returns the char(6) value to Spark padded to six characters. Spark has no fixed-length char type, so all char attributes are converted to String by default, while the constant in the query is parsed as a plain string Literal. The equality comparison therefore never returns true, for instance '5LZS' == '5LZS  '.
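A minimal sketch of the mismatch from the 1.5.x spark-shell. The DataFrame below is my stand-in for what the ORC/Hive record reader hands to Spark for a char(6) column (the stored value comes back space-padded); only the column name and width come from the report.

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StructType, StructField, StringType}

// char(6) value '5LZS' as the reader returns it: padded to six characters
val schema = StructType(Seq(StructField("char_col", StringType)))
val rdd    = sc.parallelize(Seq(Row("5LZS  ")))
val df     = sqlContext.createDataFrame(rdd, schema)

// The unpadded literal from the WHERE clause never matches the padded value,
// so the filter drops every row.
df.filter(df("char_col") === "5LZS").count()   // returns 0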

But I get the correct result in Hive with the same data and SQL string, because Hive pads the constant literal with spaces. Please refer to:
https://issues.apache.org/jira/browse/HIVE-11312
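Until a similar fix lands in Spark, a possible workaround (my own sketch, not part of this ticket) is to strip the padding on the column side or pad the literal by hand, e.g. from the 1.5.x shell with a HiveContext:

// rtrim removes the trailing pad characters before the comparison; note that
// wrapping the column in a function prevents any ORC filter pushdown on it.
sqlContext.sql("SELECT * FROM orc_table WHERE rtrim(char_col) = '5LZS'").show()

// Alternatively, pad the literal to the declared char(6) width by hand,
// mirroring what the HIVE-11312 fix does at parse time.
sqlContext.sql("SELECT * FROM orc_table WHERE char_col = '5LZS  '").show()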

I found that there is no corresponding patch for Spark.
 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org