Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2016/06/17 19:51:05 UTC

[jira] [Updated] (SPARK-14459) SQL partitioning must match existing tables, but is not checked.

     [ https://issues.apache.org/jira/browse/SPARK-14459?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai updated SPARK-14459:
-----------------------------
    Labels: release_notes releasenotes  (was: )

> SQL partitioning must match existing tables, but is not checked.
> ----------------------------------------------------------------
>
>                 Key: SPARK-14459
>                 URL: https://issues.apache.org/jira/browse/SPARK-14459
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Ryan Blue
>            Assignee: Ryan Blue
>              Labels: release_notes, releasenotes
>             Fix For: 2.0.0
>
>
> Writing into partitioned Hive tables has unexpected results because the table's partitioning is not detected and applied during the analysis phase. 
> For example, if I have two tables, {{source}} and {{partitioned}}, with the same column types:
> {code}
> CREATE TABLE source (id bigint, data string, part string);
> CREATE TABLE partitioned (id bigint, data string) PARTITIONED BY (part string);
> // copy from source to partitioned
> sqlContext.table("source").write.insertInto("partitioned")
> {code}
> Copying from {{source}} to {{partitioned}} appears to succeed, but {{partitioned}} ends up with 0 rows. The copy works if I partition explicitly by adding {{...write.partitionBy("part").insertInto(...)}}, as sketched below. This work-around isn't obvious and is error-prone because the columns passed to {{partitionBy}} must match the table's partitioning, yet that match is never checked.
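> For reference, a minimal sketch of that work-around against the tables from the example above (pre-fix behavior; once insertInto picks up the table's partitioning automatically, the explicit {{partitionBy}} should no longer be needed):
> {code}
> // Work-around: declare the partition column explicitly when inserting.
> // The columns passed to partitionBy must match the target table's
> // PARTITIONED BY (part string) clause, in the same order.
> sqlContext.table("source")
>   .write
>   .partitionBy("part")
>   .insertInto("partitioned")
> {code}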
> I think that when relations are resolved, the write's partitioning should be checked against the table's partitioning and filled in automatically when it isn't specified.
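> As a rough illustration of the kind of check being proposed, the target table's partition columns can be read from the catalog and compared against the columns of the DataFrame being inserted. This is only a user-side sketch (it assumes a Spark 2.x {{SparkSession}} named {{spark}}); the actual fix would perform this check during analysis:
> {code}
> // Partition columns of the target table, as recorded in the catalog.
> val partCols = spark.catalog.listColumns("partitioned")
>   .filter(_.isPartition)
>   .collect()
>   .map(_.name)
>
> // For insertInto to route rows correctly, the DataFrame being written is
> // expected to end with those columns, in the same order.
> val df = spark.table("source")
> require(df.columns.takeRight(partCols.length).sameElements(partCols),
>   s"partition columns [${partCols.mkString(", ")}] must be the last columns of the insert")
> {code}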



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org