Posted to issues@carbondata.apache.org by "Chetan Bhat (JIRA)" <ji...@apache.org> on 2018/08/06 07:32:00 UTC

[jira] [Updated] (CARBONDATA-2823) Alter table set local dictionary include after bloom creation and merge index on old V3 store fails throwing incorrect error

     [ https://issues.apache.org/jira/browse/CARBONDATA-2823?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chetan Bhat updated CARBONDATA-2823:
------------------------------------
    Description: 
Steps :
 # create table
 # create bloom/lucene datamap
 # load data
 # alter table set tblProperties

0: jdbc:hive2://10.18.98.101:22550/default> CREATE TABLE uniqdata_load (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,36),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format';
+---------+--+
| Result |
+---------+--+
+---------+--+
No rows selected (1.43 seconds)
0: jdbc:hive2://10.18.98.101:22550/default> CREATE DATAMAP dm_uniqdata1_tmstmp6 ON TABLE uniqdata_load USING 'bloomfilter' DMPROPERTIES ('INDEX_COLUMNS' = 'DOJ', 'BLOOM_SIZE'='640000', 'BLOOM_FPP'='0.00001');
+---------+--+
| Result |
+---------+--+
+---------+--+
No rows selected (0.828 seconds)
0: jdbc:hive2://10.18.98.101:22550/default> LOAD DATA INPATH 'hdfs://hacluster/chetan/2000_UniqData.csv' into table uniqdata_load OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_ACTION'='FORCE','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1');
+---------+--+
| Result |
+---------+--+
+---------+--+
No rows selected (4.903 seconds)
0: jdbc:hive2://10.18.98.101:22550/default> alter table uniqdata_load set tblproperties('local_dictionary_include'='CUST_NAME');
Error: org.apache.carbondata.common.exceptions.sql.MalformedCarbonCommandException: streaming is not supported for index datamap (state=,code=0)

 

Issue : Alter table set local dictionary include fails with an incorrect error message.

0: jdbc:hive2://10.18.98.101:22550/default> alter table uniqdata_load set tblproperties('local_dictionary_include'='CUST_NAME');

*Error: org.apache.carbondata.common.exceptions.sql.MalformedCarbonCommandException: streaming is not supported for index datamap (state=,code=0)*

 

Expected : The operation should succeed. If the operation is unsupported, it should throw the correct error message.

 

  was:
Steps :

In an old-version V3 store, create a table and load data.

CREATE TABLE uniqdata_load (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,36),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format';
LOAD DATA INPATH 'hdfs://hacluster/chetan/2000_UniqData.csv' into table uniqdata_load OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_ACTION'='FORCE','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1');

In version 1.4.1, refresh the table from the old V3 store.

refresh table uniqdata_load;

Create the bloom filter datamap and merge the index.

CREATE DATAMAP dm_uniqdata1_tmstmp ON TABLE uniqdata_load USING 'bloomfilter' DMPROPERTIES ('INDEX_COLUMNS' = 'DOJ', 'BLOOM_SIZE'='640000', 'BLOOM_FPP'='0.00001');
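The merge index step mentioned above is not shown in the transcript. Assuming the standard CarbonData segment-index compaction syntax (an assumption, since the reporter did not include the command), the step would look something like:

```sql
-- Hypothetical reproduction of the "merge index" step, not shown in the
-- original transcript; CarbonData merges per-blocklet index files into a
-- single segment index file via SEGMENT_INDEX compaction:
ALTER TABLE uniqdata_load COMPACT 'SEGMENT_INDEX';
```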

Alter table set local dictionary include.

 alter table uniqdata_load set tblproperties('local_dictionary_include'='CUST_NAME');

 

Issue : Alter table set local dictionary include fails with an incorrect error message.

0: jdbc:hive2://10.18.98.101:22550/default> alter table uniqdata_load set tblproperties('local_dictionary_include'='CUST_NAME');

*Error: org.apache.carbondata.common.exceptions.sql.MalformedCarbonCommandException: streaming is not supported for index datamap (state=,code=0)*

 

Expected : The operation should succeed. If the operation is unsupported, it should throw the correct error message.

 


> Alter table set local dictionary include after bloom creation and merge index on old V3 store fails throwing incorrect error
> ----------------------------------------------------------------------------------------------------------------------------
>
>                 Key: CARBONDATA-2823
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2823
>             Project: CarbonData
>          Issue Type: Bug
>          Components: data-query
>    Affects Versions: 1.4.1
>         Environment: Spark 2.1
>            Reporter: Chetan Bhat
>            Assignee: xuchuanyin
>            Priority: Minor



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)