Posted to issues@spark.apache.org by "Liang-Chi Hsieh (JIRA)" <ji...@apache.org> on 2019/04/30 10:31:00 UTC

[jira] [Commented] (SPARK-27595) Spark couldn't read a partitioned (string type) ORC column correctly if the value contains a Float/Double value

    [ https://issues.apache.org/jira/browse/SPARK-27595?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16830165#comment-16830165 ] 

Liang-Chi Hsieh commented on SPARK-27595:
-----------------------------------------

Is turning off {{spark.sql.sources.partitionColumnTypeInference.enabled}} helpful?  
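
For example (a rough, untested sketch; the path and the {{hiveContext}} / {{masterDF}} names just follow the description below):

{code}
// Disable partition column type inference; with this off, partition directory
// values such as "7F" and "8D" should stay strings rather than being read back
// as 7.0 / 8.0.
hiveContext.setConf("spark.sql.sources.partitionColumnTypeInference.enabled", "false")

val masterDF = hiveContext.read.orc("/user/hive/warehouse/reservation.db/unique_keys")
masterDF.select("source").distinct().show()
{code}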

> Spark couldn't read a partitioned (string type) ORC column correctly if the value contains a Float/Double value
> ---------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-27595
>                 URL: https://issues.apache.org/jira/browse/SPARK-27595
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Ameer Basha Pattan
>            Priority: Critical
>
> create external table unique_keys (
>   key string,
>   locator_id string,
>   create_date string,
>   sequence int
> )
> partitioned by (source string)
> stored as orc
> location '/user/hive/warehouse/reservation.db/unique_keys';
> /user/hive/warehouse/reservation.db/unique_keys contains partition directories like the following:
> /user/hive/warehouse/reservation.db/unique_keys/source=6S
> /user/hive/warehouse/reservation.db/unique_keys/source=7F
> /user/hive/warehouse/reservation.db/unique_keys/source=7H
> /user/hive/warehouse/reservation.db/unique_keys/source=8D
>  
> If I try to read the ORC files through Spark,
> val masterDF = hiveContext.read.orc("/user/hive/warehouse/reservation.db/unique_keys")
> the source value gets changed to *7.0* and *8.0* for 7F and 8D respectively.
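> For example, selecting the distinct partition values (a rough sketch reusing masterDF from above):
> masterDF.select("source").distinct().show()
> shows *7.0* and *8.0* where 7F and 8D are expected.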
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org