Posted to issues@carbondata.apache.org by "Liang Chen (JIRA)" <ji...@apache.org> on 2017/07/07 00:08:00 UTC
[jira] [Updated] (CARBONDATA-652) Cannot update a table with 1000 columns
[ https://issues.apache.org/jira/browse/CARBONDATA-652?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Liang Chen updated CARBONDATA-652:
----------------------------------
Fix Version/s: (was: 1.1.1)
NONE
> Cannot update a table with 1000 columns
> ---------------------------------------
>
> Key: CARBONDATA-652
> URL: https://issues.apache.org/jira/browse/CARBONDATA-652
> Project: CarbonData
> Issue Type: Bug
> Components: data-query
> Affects Versions: 1.0.0-incubating
> Environment: Spark 1.6
> Reporter: Deepti Bhardwaj
> Assignee: sounak chakraborty
> Priority: Minor
> Fix For: NONE
>
> Attachments: create-table-1000-columns, create-table-1000-columns-hive, data.csv, error-while-update.png, thrift-log
>
>
> I created a Hive table and loaded it with data (data.csv).
> The commands for the Hive table are in the attached file (create-table-1000-columns-hive).
> Then I created a Carbon table and inserted data into it from the above Hive table (see create-table-1000-columns),
> after which I fired the below query:
> update tablewith1000columns set (a1)=('testing!~~~!!!!') where a1='A1';
> and it threw java.lang.ArrayIndexOutOfBoundsException.
> !https://issues.apache.org/jira/secure/attachment/12847806/error-while-update.png!
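The reproduction steps above can be sketched roughly as follows. This is a hypothetical reconstruction, not the reporter's actual DDL (which is in the attached files): the column names (a1 .. a1000), the Hive row format, and the CarbonData `STORED BY 'carbondata'` clause are all assumptions based on typical CarbonData 1.x usage.

```sql
-- Hypothetical sketch; the real tables have 1000 columns (a1 .. a1000),
-- per the attached create-table-1000-columns scripts.
CREATE TABLE hivetable1000columns (a1 STRING, a2 STRING /* ... up to a1000 */)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

LOAD DATA LOCAL INPATH 'data.csv' INTO TABLE hivetable1000columns;

-- CarbonData table with the same schema (STORED BY clause assumed).
CREATE TABLE tablewith1000columns (a1 STRING, a2 STRING /* ... up to a1000 */)
STORED BY 'carbondata';

INSERT INTO TABLE tablewith1000columns SELECT * FROM hivetable1000columns;

-- The failing update, verbatim from the report:
update tablewith1000columns set (a1)=('testing!~~~!!!!') where a1='A1';
```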
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)