Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2017/01/07 23:57:58 UTC

[jira] [Updated] (SPARK-19120) Returned an Empty Result after Loading a Hive Table

     [ https://issues.apache.org/jira/browse/SPARK-19120?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li updated SPARK-19120:
----------------------------
    Summary: Returned an Empty Result after Loading a Hive Table  (was: Returned an Empty Result after Loading a Hive Partitioned Table)

> Returned an Empty Result after Loading a Hive Table
> ---------------------------------------------------
>
>                 Key: SPARK-19120
>                 URL: https://issues.apache.org/jira/browse/SPARK-19120
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Xiao Li
>            Assignee: Xiao Li
>            Priority: Critical
>              Labels: correctness
>
> {noformat}
>         sql(
>           """
>             |CREATE TABLE test_added_partitions (a STRING)
>             |PARTITIONED BY (b INT)
>             |STORED AS PARQUET
>           """.stripMargin)
>         // Create partition without data files and check whether it can be read
>         sql(s"ALTER TABLE test_added_partitions ADD PARTITION (b='1')")
>         spark.table("test_added_partitions").show()
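>         // Load data into the manually added partition; the read that
>         // follows this load is the one that comes back empty.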
>         sql(
>           s"""
>              |LOAD DATA LOCAL INPATH '$newPartitionDir' OVERWRITE
>              |INTO TABLE test_added_partitions PARTITION(b='1')
>            """.stripMargin)
>         spark.table("test_added_partitions").show()
> {noformat}
> The returned result is empty after the table load. We should refresh the metadata cache after loading data into the table.
> This only happens when users manually add the partition via {{ALTER TABLE ADD PARTITION}}.
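>
> A minimal workaround sketch until the fix lands, assuming the Spark 2.x catalog API ({{spark.catalog.refreshTable}}) and the {{REFRESH TABLE}} SQL command: refreshing the table metadata by hand after the load makes the newly loaded files visible.
> {noformat}
>         // Workaround sketch: explicitly invalidate the cached table
>         // metadata after the load so the next read sees the new files.
>         spark.catalog.refreshTable("test_added_partitions")
>         spark.table("test_added_partitions").show()
>
>         // Equivalent SQL form:
>         sql("REFRESH TABLE test_added_partitions")
> {noformat}
> The proper fix is for {{LOAD DATA}} itself to invalidate the cached relation, so users do not have to refresh manually.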



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org