Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2017/01/07 20:33:58 UTC

[jira] [Commented] (SPARK-19120) Returned an Empty Result after Loading a Hive Partitioned Table

    [ https://issues.apache.org/jira/browse/SPARK-19120?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15808100#comment-15808100 ] 

Xiao Li commented on SPARK-19120:
---------------------------------

Will submit a fix soon.

> Returned an Empty Result after Loading a Hive Partitioned Table
> ---------------------------------------------------------------
>
>                 Key: SPARK-19120
>                 URL: https://issues.apache.org/jira/browse/SPARK-19120
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Xiao Li
>            Assignee: Xiao Li
>            Priority: Critical
>
> {noformat}
>         sql(
>           """
>             |CREATE TABLE test_added_partitions (a STRING)
>             |PARTITIONED BY (b INT)
>             |STORED AS PARQUET
>           """.stripMargin)
>         // Create partition without data files and check whether it can be read
>         sql(s"ALTER TABLE test_added_partitions ADD PARTITION (b='1')")
>         spark.table("test_added_partitions").show()
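>         // newPartitionDir (defined elsewhere in the test) is assumed to be a
>         // local directory containing the data files for the partition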
>         sql(
>           s"""
>              |LOAD DATA LOCAL INPATH '$newPartitionDir' OVERWRITE
>              |INTO TABLE test_added_partitions PARTITION(b='1')
>            """.stripMargin)
>         spark.table("test_added_partitions").show()
> {noformat}
> The second {{show()}} returns an empty result even though the data was just loaded into the partition. We should refresh the table's cached file metadata after loading data into it.
> This only happens if users first add the partition manually via {{ALTER TABLE ADD PARTITION}}.
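> A minimal sketch of a possible workaround until a fix lands, reusing the repro's {{newPartitionDir}}: explicitly refreshing the table after the load should make the newly written files visible ({{spark.catalog.refreshTable}}, or the SQL {{REFRESH TABLE}} command, is the existing public API for invalidating cached table metadata):
> {noformat}
>         sql(
>           s"""
>              |LOAD DATA LOCAL INPATH '$newPartitionDir' OVERWRITE
>              |INTO TABLE test_added_partitions PARTITION(b='1')
>            """.stripMargin)
>         // Invalidate the cached file listing so the loaded files are picked up
>         spark.catalog.refreshTable("test_added_partitions")
>         spark.table("test_added_partitions").show()  // should now return the loaded rows
> {noformat}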


