Posted to issues@kudu.apache.org by "YifanZhang (Jira)" <ji...@apache.org> on 2020/09/28 09:55:00 UTC

[jira] [Created] (KUDU-3198) Unable to delete a full row from a table with 64 columns when using java client

YifanZhang created KUDU-3198:
--------------------------------

             Summary: Unable to delete a full row from a table with 64 columns when using java client
                 Key: KUDU-3198
                 URL: https://issues.apache.org/jira/browse/KUDU-3198
             Project: Kudu
          Issue Type: Bug
          Components: java
    Affects Versions: 1.13.0, 1.12.0
            Reporter: YifanZhang


We recently got an error when deleting full rows from a table with 64 columns using Spark SQL; however, if we delete a single column instead, the error does not appear. The error is:
{code:java}
Failed to write at least 1000 rows to Kudu; Sample errors: Not implemented: Unknown row operation type (error 0){code}
I reproduced this by deleting a full row from a table with 64 columns using Java client 1.12.0/1.13.0. If some columns in the row are set to NULL, I got an error:
{code:java}
Row error for primary key=[-128, 0, 0, 1], tablet=null, server=d584b3407ea444519e91b32f2744b162, status=Invalid argument: DELETE should not have a value for column: c63 STRING NULLABLE (error 0)
{code}
If values are set for all columns in the row, I got an error like:
{code:java}
Row error for primary key=[-128, 0, 0, 1], tablet=null, server=null, status=Corruption: Not enough data for column: c63 STRING NULLABLE (error 0)
{code}
I also tested this with tables with different numbers of columns. The weird thing is that I could delete full rows from a table with 8/16/32/63/65 columns, but couldn't do this if the table has 64/128 columns.
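The failure only at 64 and 128 columns (multiples of 64, the bit width of a {{long}}) suggests a word-boundary edge case in how the client serializes its per-column bitmaps. As a self-contained illustration of one plausible pitfall (this is a hypothetical sketch, not the confirmed root cause), Java's {{BitSet.toByteArray()}} sizes its output from the highest *set* bit rather than from the logical column count, so code that writes a fixed-width column bitmap must compute the byte length from the schema explicitly:
{code:java}
import java.util.BitSet;

public class BitmapSizing {
    // Hypothetical helper: bytes needed on the wire for an n-column bitmap.
    static int bitmapByteLength(int numColumns) {
        return (numColumns + 7) / 8;
    }

    public static void main(String[] args) {
        int numColumns = 64;

        // Full-row DELETE: every column marked as set.
        BitSet allSet = new BitSet(numColumns);
        allSet.set(0, numColumns);
        // toByteArray() length follows the highest set bit: 8 bytes here.
        System.out.println(allSet.toByteArray().length);   // 8

        // Same schema, but the trailing 8 columns are unset.
        BitSet partial = new BitSet(numColumns);
        partial.set(0, 56);
        // Now only 7 bytes come back -- one byte short of the fixed
        // width the server would expect for a 64-column bitmap.
        System.out.println(partial.toByteArray().length);  // 7

        // The wire size must always be derived from the schema instead.
        System.out.println(bitmapByteLength(numColumns));  // 8
    }
}
{code}
If a serializer relies on the bitset's own length, any trailing unset columns shorten the encoded bitmap and shift every subsequent byte, which could make the server misparse later fields (consistent with errors such as "Not enough data for column" or an unknown operation type).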



--
This message was sent by Atlassian Jira
(v8.3.4#803005)