Posted to issues@spark.apache.org by "jacky (Jira)" <ji...@apache.org> on 2022/03/30 08:16:00 UTC

[jira] [Created] (SPARK-38695) ORC cannot support data types such as char or varchar

jacky created SPARK-38695:
-----------------------------

             Summary: ORC cannot support data types such as char or varchar
                 Key: SPARK-38695
                 URL: https://issues.apache.org/jira/browse/SPARK-38695
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.2.1, 3.1.2
            Reporter: jacky


When testing Spark performance with TPC-DS and running some of the SQL queries, such as q1, I hit the following error:

!image-2022-03-30-15-39-02-557.png!
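For context, the failure class named in the summary can be illustrated with a much smaller table than the TPC-DS ones. The table and column names below are hypothetical, chosen only to show a char/varchar column persisted through an ORC table; it is a sketch of the shape of the problem, not the exact reproduction from the screenshot.

-- hypothetical minimal table with char/varchar columns stored as ORC
create table char_orc_demo
(
      id    bigint
,     code  char(16)      -- fixed-length char column
,     name  varchar(30)   -- variable-length varchar column
)
stored as orc;

insert into char_orc_demo values (1, 'A001', 'demo');

-- reading the char/varchar columns back is where an error of this kind would surface
select code, name from char_orc_demo;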

I used the following SQL to create the tables:

create table customer
stored as orc
as select * from tpdc_text.customer
CLUSTER BY c_customer_sk;

 

create table store
stored as orc
as select * from tpdc_text.store
CLUSTER BY s_store_sk;

 

create table date_dim
stored as orc
as select * from tpdc_text.date_dim;

 

create table store_returns
(
      sr_return_time_sk bigint
,     sr_item_sk bigint
,     sr_customer_sk bigint
,     sr_cdemo_sk bigint
,     sr_hdemo_sk bigint
,     sr_addr_sk bigint
,     sr_store_sk bigint
,     sr_reason_sk bigint
,     sr_ticket_number bigint
,     sr_return_quantity int
,     sr_return_amt decimal(7,2)
,     sr_return_tax decimal(7,2)
,     sr_return_amt_inc_tax decimal(7,2)
,     sr_fee decimal(7,2) 
,     sr_return_ship_cost decimal(7,2)
,     sr_refunded_cash decimal(7,2)
,     sr_reversed_charge decimal(7,2)
,     sr_store_credit decimal(7,2)
,     sr_net_loss decimal(7,2)
)
partitioned by (sr_returned_date_sk bigint)
stored as orc;
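The CTAS statements above copy the source schema as-is, so any char/varchar columns in the TPC-DS text tables (for example c_customer_id char(16) or c_birth_country varchar(20) on customer) carry over into the ORC tables. If the error really is about char/varchar in ORC, one possible workaround is to cast the affected columns to string when the ORC tables are created. The variant of the customer CTAS below is an assumption, not the change applied later; only a few customer columns are shown.

-- sketch: cast char/varchar columns to string so the ORC table contains only string columns
create table customer
stored as orc
as select
      c_customer_sk
,     cast(c_customer_id as string) as c_customer_id        -- char(16) in the TPC-DS schema
,     cast(c_salutation as string) as c_salutation          -- char(10) in the TPC-DS schema
,     cast(c_first_name as string) as c_first_name          -- char(20) in the TPC-DS schema
,     cast(c_birth_country as string) as c_birth_country    -- varchar(20) in the TPC-DS schema
,     c_birth_year
-- ... remaining columns omitted for brevity
from tpdc_text.customer
CLUSTER BY c_customer_sk;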

 

When I changed this code, it ran successfully:

 

!image-2022-03-30-14-58-32-823.png!
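The exact code change is only visible in the attached screenshot. For completeness, a configuration-level mitigation that exists in the affected versions (3.1.x and 3.2.x) is the legacy flag spark.sql.legacy.charVarcharAsString, which makes Spark treat char/varchar as plain string. Whether it avoids this particular error is an assumption; it is mentioned here as a possible workaround, not as the fix shown above.

-- possible mitigation (assumption): treat char/varchar as string before creating the ORC tables
set spark.sql.legacy.charVarcharAsString=true;

create table customer
stored as orc
as select * from tpdc_text.customer
CLUSTER BY c_customer_sk;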

 

 



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org