Posted to dev@hive.apache.org by "Szehon Ho (JIRA)" <ji...@apache.org> on 2013/11/12 03:16:17 UTC

[jira] [Created] (HIVE-5797) select * throws exception for orcfile table after adding new columns

Szehon Ho created HIVE-5797:
-------------------------------

             Summary: select * throws exception for orcfile table after adding new columns
                 Key: HIVE-5797
                 URL: https://issues.apache.org/jira/browse/HIVE-5797
             Project: Hive
          Issue Type: Bug
          Components: Serializers/Deserializers
            Reporter: Szehon Ho


Given the following tables:
select * from two;
+-----+----+
| id  | a  |
+-----+----+
| 1   | a  |
| 2   | b  |
+-----+----+
select * from three;
+-----+----+----+
| id  | a  | b  |
+-----+----+----+
| 1   | a  | z  |
| 2   | b  | y  |
| 3   | c  | x  |
+-----+----+----+
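
For reference, the two source tables can be set up roughly as follows. The column types and the use of insert ... values (only available in later Hive releases) are assumptions here; only the table names and row contents come from the listings above.

-- hypothetical setup for the source tables; types and VALUES syntax are assumed
create table two (id bigint, a string);
insert into table two values (1, 'a'), (2, 'b');
create table three (id bigint, a string, b string);
insert into table three values (1, 'a', 'z'), (2, 'b', 'y'), (3, 'c', 'x');
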
Execute the following steps:
create table testrc (id bigint, a string) stored as orc;
insert into table testrc select * from two;
select * from testrc;   -- returns correctly
+-----+----+
| id  | a  |
+-----+----+
| 1   | a  |
| 2   | b  |
+-----+----+
alter table testrc add columns (b string);
insert into table testrc select * from three;
select * from testrc; 

The final select throws the following exception:
Thread [pool-4-thread-1] (Suspended (exception ArrayIndexOutOfBoundsException))	
	OrcStruct$OrcStructInspector.getStructFieldData(Object, StructField) line: 210	
	DelegatedStructObjectInspector.getStructFieldData(Object, StructField) line: 85	
	ExprNodeColumnEvaluator._evaluate(Object, int) line: 94	
	ExprNodeColumnEvaluator(ExprNodeEvaluator<T>).evaluate(Object, int) line: 77	
	ExprNodeColumnEvaluator(ExprNodeEvaluator<T>).evaluate(Object) line: 65	
	SelectOperator.processOp(Object, int) line: 79	
	SelectOperator(Operator<T>).process(Object, int) line: 488	
	TableScanOperator(Operator<T>).forward(Object, ObjectInspector) line: 826	
	TableScanOperator.processOp(Object, int) line: 91	
	TableScanOperator(Operator<T>).process(Object, int) line: 488	
	FetchOperator.pushRow(InspectableObject) line: 518	
	FetchOperator.pushRow() line: 510	
	FetchTask.fetch(ArrayList<String>) line: 138	
	Driver.getResults(ArrayList<String>) line: 1499	
	SQLOperation.getNextRowSet(FetchOrientation, long) line: 219	
	OperationManager.getOperationNextRowSet(OperationHandle, FetchOrientation, long) line: 171	
	HiveSessionImpl.fetchResults(OperationHandle, FetchOrientation, long) line: 420	
	CLIService.fetchResults(OperationHandle, FetchOrientation, long) line: 318	
	ThriftBinaryCLIService(ThriftCLIService).FetchResults(TFetchResultsReq) line: 386	
	TCLIService$Processor$FetchResults<I>.getResult(I, FetchResults_args) line: 1373	
	TCLIService$Processor$FetchResults<I>.getResult(Object, TBase) line: 1358	
	TCLIService$Processor$FetchResults<I>(ProcessFunction<I,T>).process(int, TProtocol, TProtocol, I) line: 39	
	TCLIService$Processor<I>(TBaseProcessor<I>).process(TProtocol, TProtocol) line: 39	
	TUGIContainingProcessor$1.run() line: 58	
	TUGIContainingProcessor$1.run() line: 55	
	AccessController.doPrivileged(PrivilegedExceptionAction<T>, AccessControlContext) line: not available [native method]	
	Subject.doAs(Subject, PrivilegedExceptionAction<T>) line: 415	
	UserGroupInformation.doAs(PrivilegedExceptionAction<T>) line: 1485	
	Hadoop23Shims(HadoopShimsSecure).doAs(UserGroupInformation, PrivilegedExceptionAction<T>) line: 471	
	TUGIContainingProcessor.process(TProtocol, TProtocol) line: 55	
	TThreadPoolServer$WorkerProcess.run() line: 206	
	ThreadPoolExecutor.runWorker(ThreadPoolExecutor$Worker) line: 1145	
	ThreadPoolExecutor$Worker.run() line: 615	
	Thread.run() line: 724	
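
From the trace, the failure appears to happen when OrcStructInspector is asked for the newly added column b on rows read from ORC files written before the alter table; those files contain only the original two columns, so the field index runs past the end of the struct. The expected output of the final select would presumably be NULL for b on the old rows (row order may differ):

+-----+----+-------+
| id  | a  | b     |
+-----+----+-------+
| 1   | a  | NULL  |
| 2   | b  | NULL  |
| 1   | a  | z     |
| 2   | b  | y     |
| 3   | c  | x     |
+-----+----+-------+

A possible workaround (not verified) is to project only the pre-existing columns, which avoids asking the inspector for the missing field:

select id, a from testrc;   -- hypothetical workaround, not verified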



