Posted to dev@hive.apache.org by "Matt McCline (JIRA)" <ji...@apache.org> on 2014/07/18 07:50:04 UTC
[jira] [Commented] (HIVE-7442) ql.exec.vector.expressions.gen.DecimalColAddDecimalScalar.evaluate throws ClassCastException: ...LongColumnVector cannot be cast to ...DecimalColumnVector
[ https://issues.apache.org/jira/browse/HIVE-7442?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14066090#comment-14066090 ]
Matt McCline commented on HIVE-7442:
------------------------------------
Here is the explain output with a special annotation showing the VectorExpression(s):
{code}
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 depends on stages: Stage-1

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: b
            Statistics: Num rows: 500 Data size: 101000 Basic stats: COMPLETE Column stats: NONE
            Filter Operator
              predicate: (key + 450) is not null (type: boolean)
              Statistics: Num rows: 250 Data size: 50500 Basic stats: COMPLETE Column stats: NONE
              vector filter expressions: SelectColumnIsNotNull[-1](CastDecimalToBoolean[3](DecimalColAddDecimalScalar[2]))
              Reduce Output Operator
                key expressions: (key + 450) (type: decimal(4,0))
                sort order: +
                Map-reduce partition columns: (key + 450) (type: decimal(4,0))
                Statistics: Num rows: 250 Data size: 50500 Basic stats: COMPLETE Column stats: NONE
                value expressions: key (type: decimal(3,0)), value (type: string)
                vector value expressions: IdentityExpression[0], IdentityExpression[1]
          TableScan
            alias: a
            Statistics: Num rows: 500 Data size: 101000 Basic stats: COMPLETE Column stats: NONE
            Filter Operator
              predicate: key is not null (type: boolean)
              Statistics: Num rows: 250 Data size: 50500 Basic stats: COMPLETE Column stats: NONE
              vector filter expressions: SelectColumnIsNotNull[-1](CastDecimalToBoolean[2])
              Reduce Output Operator
                key expressions: key (type: decimal(3,0))
                sort order: +
                Map-reduce partition columns: key (type: decimal(3,0))
                Statistics: Num rows: 250 Data size: 50500 Basic stats: COMPLETE Column stats: NONE
                value expressions: value (type: string)
                vector value expressions: IdentityExpression[1]
      Execution mode: vectorized
      Reduce Operator Tree:
        Join Operator
          condition map:
               Inner Join 0 to 1
          condition expressions:
            0 {KEY.reducesinkkey0} {VALUE._col0}
            1 {VALUE._col0} {VALUE._col1}
          outputColumnNames: _col0, _col1, _col4, _col5
          Statistics: Num rows: 275 Data size: 55550 Basic stats: COMPLETE Column stats: NONE
          Select Operator
            expressions: _col0 (type: decimal(3,0)), _col1 (type: string), _col4 (type: decimal(3,0)), _col5 (type: string)
            outputColumnNames: _col0, _col1, _col2, _col3
            Statistics: Num rows: 275 Data size: 55550 Basic stats: COMPLETE Column stats: NONE
            File Output Operator
              compressed: false
              Statistics: Num rows: 275 Data size: 55550 Basic stats: COMPLETE Column stats: NONE
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

  Stage: Stage-0
    Fetch Operator
      limit: -1
      Processor Tree:
        ListSink
{code}
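To read the annotation for table b: the bracketed number after each VectorExpression is the batch column it writes its result to (-1 for a filter, which selects rows in place rather than producing a column), and each expression's children are evaluated before the parent reads their output columns. A minimal standalone sketch of that evaluation order (plain Java with a hypothetical stand-in class, not the real Hive VectorExpression API):

```java
import java.util.ArrayList;
import java.util.List;

public class EvalOrder {
    // Hypothetical stand-in for a vector expression node: a name, the batch
    // column it writes to, and its child expressions.
    static class Expr {
        final String name; final int outputColumn; final Expr[] children;
        Expr(String name, int outputColumn, Expr... children) {
            this.name = name; this.outputColumn = outputColumn; this.children = children;
        }
        // Children are evaluated depth-first before the parent, mirroring
        // VectorExpression.evaluateChildren() in the stack trace.
        void evaluate(List<String> log) {
            for (Expr c : children) c.evaluate(log);
            log.add(name + " -> col " + outputColumn);
        }
    }

    public static void main(String[] args) {
        // The annotated filter tree for table b:
        // SelectColumnIsNotNull[-1](CastDecimalToBoolean[3](DecimalColAddDecimalScalar[2]))
        Expr tree = new Expr("SelectColumnIsNotNull", -1,
                new Expr("CastDecimalToBoolean", 3,
                        new Expr("DecimalColAddDecimalScalar", 2)));
        List<String> log = new ArrayList<>();
        tree.evaluate(log);
        for (String s : log) System.out.println(s);
        // DecimalColAddDecimalScalar runs first, which is why it is the
        // deepest frame in the ClassCastException stack trace below.
    }
}
```

Note also that the plan types (key + 450) as decimal(4,0): adding 450 to a decimal(3,0) value can carry into a fourth digit of precision.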
> ql.exec.vector.expressions.gen.DecimalColAddDecimalScalar.evaluate throws ClassCastException: ...LongColumnVector cannot be cast to ...DecimalColumnVector
> ----------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: HIVE-7442
> URL: https://issues.apache.org/jira/browse/HIVE-7442
> Project: Hive
> Issue Type: Bug
> Reporter: Matt McCline
> Assignee: Matt McCline
>
> I took decimal_join.q, converted it to read from ORC, and turned on vectorization:
> vector_decimal_join.q
> {code}
> SET hive.vectorized.execution.enabled=true;
> -- HIVE-5292 Join on decimal columns fails
> create table src_dec_staging (key decimal(3,0), value string);
> load data local inpath '../../data/files/kv1.txt' into table src_dec_staging;
> create table src_dec (key decimal(3,0), value string) stored as orc;
> insert overwrite table src_dec select * from src_dec_staging;
> explain select * from src_dec a join src_dec b on a.key=b.key+450;
> select * from src_dec a join src_dec b on a.key=b.key+450;
> {code}
> Stack trace:
> {code}
> java.lang.Exception: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
>     at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
>     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
> Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
>     at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:195)
>     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>     at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>     at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>     at java.lang.Thread.run(Thread.java:695)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
>     at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.process(VectorMapOperator.java:45)
>     at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:177)
>     ... 10 more
> Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.ql.exec.vector.LongColumnVector cannot be cast to org.apache.hadoop.hive.ql.exec.vector.DecimalColumnVector
>     at org.apache.hadoop.hive.ql.exec.vector.expressions.gen.DecimalColAddDecimalScalar.evaluate(DecimalColAddDecimalScalar.java:60)
>     at org.apache.hadoop.hive.ql.exec.vector.expressions.VectorExpression.evaluateChildren(VectorExpression.java:112)
>     at org.apache.hadoop.hive.ql.exec.vector.expressions.FuncDecimalToLong.evaluate(FuncDecimalToLong.java:51)
>     at org.apache.hadoop.hive.ql.exec.vector.expressions.VectorExpression.evaluateChildren(VectorExpression.java:112)
>     at org.apache.hadoop.hive.ql.exec.vector.expressions.SelectColumnIsNotNull.evaluate(SelectColumnIsNotNull.java:45)
>     at org.apache.hadoop.hive.ql.exec.vector.VectorFilterOperator.processOp(VectorFilterOperator.java:91)
>     at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:800)
>     at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:95)
>     at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:800)
>     at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.process(VectorMapOperator.java:43)
>     ... 11 more
> {code}
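The innermost frame points at an unconditional downcast in DecimalColAddDecimalScalar.evaluate(): it assumes its output scratch column is a DecimalColumnVector, but the batch apparently has a LongColumnVector at that index. A standalone sketch of that failure mode (simplified stand-in classes, not the actual Hive ones; the real cast is at DecimalColAddDecimalScalar.java:60):

```java
import java.math.BigDecimal;

public class CastRepro {
    // Simplified stand-ins for Hive's column vector hierarchy.
    static class ColumnVector {}
    static class LongColumnVector extends ColumnVector { long[] vector = new long[1024]; }
    static class DecimalColumnVector extends ColumnVector { BigDecimal[] vector = new BigDecimal[1024]; }

    // Mirrors the generated expression's evaluate(): it downcasts its output
    // column to DecimalColumnVector without checking the actual type.
    static void writeDecimal(ColumnVector[] cols, int outputColumn) {
        DecimalColumnVector out = (DecimalColumnVector) cols[outputColumn]; // throws if mis-allocated
        out.vector[0] = BigDecimal.ONE;
    }

    public static void main(String[] args) {
        // The bug pattern: the scratch column at the expression's output index
        // was allocated as a LongColumnVector instead of a DecimalColumnVector.
        ColumnVector[] cols = { new LongColumnVector() };
        try {
            writeDecimal(cols, 0);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as in the stack trace");
        }
        // With a correctly typed scratch column the same code succeeds.
        cols[0] = new DecimalColumnVector();
        writeDecimal(cols, 0);
        System.out.println("ok with DecimalColumnVector");
    }
}
```

So the fix presumably lies in how the scratch columns for the vectorized expression tree get typed, not in the generated arithmetic itself.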
--
This message was sent by Atlassian JIRA
(v6.2#6252)