Posted to issues@spark.apache.org by "Vu Tan (Jira)" <ji...@apache.org> on 2022/01/30 17:55:00 UTC

[jira] [Commented] (SPARK-35531) Can not insert into hive bucket table if create table with upper case schema

    [ https://issues.apache.org/jira/browse/SPARK-35531?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17484401#comment-17484401 ] 

Vu Tan commented on SPARK-35531:
--------------------------------

Hi, FYI: I ran the commands above on my local PC with the *spark-3.1.2-bin-hadoop3.2* distribution and hit the same error.

So the issue affects 3.1.2 as well.
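The error below ("Bucket columns V1 is not part of the table columns ([FieldSchema(name:v1, ...)])") points at a case mismatch: the Hive metastore stores column names lowercased, while the bucket spec keeps the case used in CREATE TABLE. As a minimal sketch (plain Python, not actual Spark/Hive code; the function names are hypothetical), this is the shape of the failing check and of a case-insensitive comparison that would accept the schema:

```python
# Columns as the Hive metastore stores them (lowercased on table creation).
table_columns = ["v1", "s1"]
# Bucket column as written in CREATE TABLE ... CLUSTERED BY (V1).
bucket_columns = ["V1"]

def valid_case_sensitive(bucket_cols, table_cols):
    # A case-sensitive membership test: "V1" is not in ["v1", "s1"],
    # which mirrors the HiveException reported above.
    return all(c in table_cols for c in bucket_cols)

def valid_case_insensitive(bucket_cols, table_cols):
    # Lowercase both sides before comparing, so "V1" matches "v1".
    lowered = {c.lower() for c in table_cols}
    return all(c.lower() in lowered for c in bucket_cols)

print(valid_case_sensitive(bucket_columns, table_columns))    # False
print(valid_case_insensitive(bucket_columns, table_columns))  # True
```

This is only an illustration of the mismatch, not the actual fix that landed for 3.3.0.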

> Can not insert into hive bucket table if create table with upper case schema
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-35531
>                 URL: https://issues.apache.org/jira/browse/SPARK-35531
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0, 3.1.1, 3.2.0
>            Reporter: Hongyi Zhang
>            Assignee: angerszhu
>            Priority: Major
>             Fix For: 3.3.0
>
>
>  
>  
> create table TEST1(
>  V1 BIGINT,
>  S1 INT)
>  partitioned by (PK BIGINT)
>  clustered by (V1)
>  sorted by (S1)
>  into 200 buckets
>  STORED AS PARQUET;
>  
> insert into test1
>  select
>  * from values(1,1,1);
>  
>  
> org.apache.hadoop.hive.ql.metadata.HiveException: Bucket columns V1 is not part of the table columns ([FieldSchema(name:v1, type:bigint, comment:null), FieldSchema(name:s1, type:int, comment:null)]
> org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Bucket columns V1 is not part of the table columns ([FieldSchema(name:v1, type:bigint, comment:null), FieldSchema(name:s1, type:int, comment:null)]



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org