Posted to dev@hive.apache.org by "Kevin Wilfong (JIRA)" <ji...@apache.org> on 2014/03/02 01:04:25 UTC

[jira] [Updated] (HIVE-4975) Reading orc file throws exception after adding new column

     [ https://issues.apache.org/jira/browse/HIVE-4975?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kevin Wilfong updated HIVE-4975:
--------------------------------

    Attachment: HIVE-4975.1.patch.txt

> Reading orc file throws exception after adding new column
> ---------------------------------------------------------
>
>                 Key: HIVE-4975
>                 URL: https://issues.apache.org/jira/browse/HIVE-4975
>             Project: Hive
>          Issue Type: Bug
>          Components: File Formats
>    Affects Versions: 0.11.0
>         Environment: hive 0.11.0 hadoop 1.0.0
>            Reporter: cyril liao
>            Assignee: Kevin Wilfong
>            Priority: Critical
>              Labels: orcfile
>         Attachments: HIVE-4975.1.patch.txt
>
>
> ORC file reads fail after a column is added to the table:
> Create an ORC table with three columns (a string, b string, c string).
> Add a new column after c by executing "ALTER TABLE table ADD COLUMNS (d string)".
> Execute the HiveQL query "select d from table"; it fails with the following exception (a reproduction sketch in HiveQL follows the stack trace):
> java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row [Error getting row data with exception java.lang.ArrayIndexOutOfBoundsException: 4
> 	at org.apache.hadoop.hive.ql.io.orc.OrcStruct$OrcStructInspector.getStructFieldData(OrcStruct.java:206)
> 	at org.apache.hadoop.hive.serde2.objectinspector.UnionStructObjectInspector.getStructFieldData(UnionStructObjectInspector.java:128)
> 	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:371)
> 	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:236)
> 	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:222)
> 	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:665)
> 	at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:144)
> 	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
> 	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
> 	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
> 	at org.apache.hadoop.mapred.Child.main(Child.java:249)
>  ]
> 	at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:162)
> 	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
> 	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
> 	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
> 	at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row [Error getting row data with exception java.lang.ArrayIndexOutOfBoundsException: 4
> 	at org.apache.hadoop.hive.ql.io.orc.OrcStruct$OrcStructInspector.getStructFieldData(OrcStruct.java:206)
> 	at org.apache.hadoop.hive.serde2.objectinspector.UnionStructObjectInspector.getStructFieldData(UnionStructObjectInspector.java:128)
> 	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:371)
> 	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:236)
> 	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:222)
> 	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:665)
> 	at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:144)
> 	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
> 	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
> 	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
> 	at org.apache.hadoop.mapred.Child.main(Child.java:249)
>  ]
> 	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:671)
> 	at org.apache.hadoop.hive.ql.exec.ExecMapper.map(ExecMapper.java:144)
> 	... 8 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error evaluating d
> 	at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:80)
> 	at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:502)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:832)
> 	at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:90)
> 	at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:502)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:832)
> 	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:654)
> 	... 9 more
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 4
> 	at org.apache.hadoop.hive.ql.io.orc.OrcStruct$OrcStructInspector.getStructFieldData(OrcStruct.java:206)
> 	at org.apache.hadoop.hive.serde2.objectinspector.UnionStructObjectInspector.getStructFieldData(UnionStructObjectInspector.java:128)
> 	at org.apache.hadoop.hive.ql.exec.ExprNodeColumnEvaluator.evaluate(ExprNodeColumnEvaluator.java:98)
> 	at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:76)
> 	... 15 more
> 2013-08-01 23:34:22,883 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
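>
> A minimal HiveQL reproduction sketch (the table name "t" and the helper table "src" are illustrative, not from the report; Hive 0.11 has no INSERT ... VALUES, so the row is copied out of any existing non-empty table):
>
> 	-- Create an ORC table with three columns and write at least one row,
> 	-- so an ORC file holding only the original three columns exists on disk.
> 	CREATE TABLE t (a string, b string, c string) STORED AS ORC;
> 	INSERT INTO TABLE t SELECT 'x', 'y', 'z' FROM src LIMIT 1;
>
> 	-- Widen the table schema; the existing ORC file is not rewritten.
> 	ALTER TABLE t ADD COLUMNS (d string);
>
> 	-- Reading the new column from the old file should yield NULL but
> 	-- instead throws java.lang.ArrayIndexOutOfBoundsException.
> 	SELECT d FROM t;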
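>
> For context: OrcStruct$OrcStructInspector.getStructFieldData (OrcStruct.java:206, top of the trace) indexes the row's field array by the column's position in the current table schema, but rows read from ORC files written before the ALTER carry only the original fields, hence the ArrayIndexOutOfBoundsException. Below is a minimal Java sketch of the kind of bounds check that avoids it; it illustrates the idea only, it is not the attached HIVE-4975.1.patch.txt, and the names Field.offset, getNumFields and getFieldValue are modeled on Hive's OrcStruct but should be read as assumptions:
>
> 	// Illustrative sketch (not the attached patch): treat a field offset
> 	// beyond what this ORC row actually contains as NULL, so a column
> 	// added by ALTER TABLE reads as NULL from pre-ALTER files instead
> 	// of throwing ArrayIndexOutOfBoundsException.
> 	public Object getStructFieldData(Object data, StructField fieldRef) {
> 	  if (data == null) {
> 	    return null;
> 	  }
> 	  OrcStruct struct = (OrcStruct) data;
> 	  int offset = ((Field) fieldRef).offset; // position in the table schema
> 	  if (offset < 0 || offset >= struct.getNumFields()) {
> 	    return null;                          // file predates the new column
> 	  }
> 	  return struct.getFieldValue(offset);
> 	}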



--
This message was sent by Atlassian JIRA
(v6.1.5#6160)