Posted to issues@carbondata.apache.org by "xuchuanyin (JIRA)" <ji...@apache.org> on 2018/08/06 07:22:00 UTC

[jira] [Commented] (CARBONDATA-2823) Alter table set local dictionary include after bloom creation and merge index on old V3 store fails throwing incorrect error

    [ https://issues.apache.org/jira/browse/CARBONDATA-2823?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16569823#comment-16569823 ] 

xuchuanyin commented on CARBONDATA-2823:
----------------------------------------

As for CARBONDATA-2823, it can be reproduced simply by the following steps (a minimal SQL sketch follows the list):
1. create table
2. create bloom/lucene datamap
3. load data
4. alter table set tblProperties
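
A minimal sketch of those steps in SQL, assuming a hypothetical two-column table and a bloom datamap on a string column (the table, column, datamap names and the HDFS path below are illustrative only, not taken from the report):

CREATE TABLE repro_tbl (id INT, name STRING) STORED BY 'org.apache.carbondata.format';
-- create a bloom datamap on the string column
CREATE DATAMAP dm_repro ON TABLE repro_tbl USING 'bloomfilter' DMPROPERTIES ('INDEX_COLUMNS'='name');
-- load some data (path is a placeholder)
LOAD DATA INPATH 'hdfs://hacluster/path/to/data.csv' INTO TABLE repro_tbl OPTIONS('FILEHEADER'='id,name');
-- this alter is the step that reproduces the misleading "streaming is not supported for index datamap" error
ALTER TABLE repro_tbl SET TBLPROPERTIES('local_dictionary_include'='name');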

> Alter table set local dictionary include after bloom creation and merge index on old V3 store fails throwing incorrect error
> ----------------------------------------------------------------------------------------------------------------------------
>
>                 Key: CARBONDATA-2823
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2823
>             Project: CarbonData
>          Issue Type: Bug
>          Components: data-query
>    Affects Versions: 1.4.1
>         Environment: Spark 2.1
>            Reporter: Chetan Bhat
>            Priority: Minor
>
> Steps :
> In an old-version V3 store, create a table and load data.
> CREATE TABLE uniqdata_load (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,36),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format';
> LOAD DATA INPATH 'hdfs://hacluster/chetan/2000_UniqData.csv' into table uniqdata_load OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_ACTION'='FORCE','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1');
> In version 1.4.1, refresh the table from the old V3 store.
> refresh table uniqdata_load;
> Create the bloom filter datamap and merge the index.
> CREATE DATAMAP dm_uniqdata1_tmstmp ON TABLE uniqdata_load USING 'bloomfilter' DMPROPERTIES ('INDEX_COLUMNS' = 'DOJ', 'BLOOM_SIZE'='640000', 'BLOOM_FPP'='0.00001');
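> The merge index command itself is not shown in the report; a hedged sketch, assuming CarbonData's segment index compaction syntax (the exact syntax may differ in 1.4.1), would be:
> ALTER TABLE uniqdata_load COMPACT 'SEGMENT_INDEX';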
> Alter table set local dictionary include.
>  alter table uniqdata_load set tblproperties('local_dictionary_include'='CUST_NAME');
>  
> Issue: alter table set local dictionary include fails with an incorrect error message.
> 0: jdbc:hive2://10.18.98.101:22550/default> alter table uniqdata_load set tblproperties('local_dictionary_include'='CUST_NAME');
> Error: org.apache.carbondata.common.exceptions.sql.MalformedCarbonCommandException: streaming is not supported for index datamap (state=,code=0)
>  
> Expected: The operation should succeed. If the operation is unsupported, it should throw the correct error message.
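> A usage sketch of how a successful run could be verified, assuming the property is reflected in the table description (output layout may vary by version):
> ALTER TABLE uniqdata_load SET TBLPROPERTIES('local_dictionary_include'='CUST_NAME');
> -- on success, the local dictionary include column should appear in the table details
> DESCRIBE FORMATTED uniqdata_load;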
>  


