Posted to issues@hive.apache.org by "Ayush Saxena (Jira)" <ji...@apache.org> on 2022/07/13 14:15:00 UTC

[jira] [Commented] (HIVE-26388) ClassCastException when there is decimal type column in source table of CTAS query

    [ https://issues.apache.org/jira/browse/HIVE-26388?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17566333#comment-17566333 ] 

Ayush Saxena commented on HIVE-26388:
-------------------------------------

The OpenCSV serde supports only String column types; this is nothing specific to the decimal type. If you change the column in your query from decimal to char, you would still get a ClassCastException like:
{noformat}
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveCharObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector
{noformat}
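For reference, a char variant of the reproduction query from the description would look something like this (a sketch only; the _char table names are mine, not from the report):
{noformat}
create table T1_char(abc char(10));
insert into table T1_char values('1.25');
create table T2_char row format serde 'org.apache.hadoop.hive.serde2.OpenCSVSerde' with serdeproperties ("separatorChar" = ',' , "quoteChar" = '"') stored as textfile as select * from T1_char;{noformat}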
The serialize method also states that the data must be of String type, which is why it casts every field object inspector to a StringObjectInspector and fails for anything else:
{noformat}
// The data must be of type String
final StringObjectInspector fieldStringOI = (StringObjectInspector) fieldOI;{noformat}
Not a regression; this is the intended behaviour of the serde, which supports only the String type.

 

Somewhat related: https://stackoverflow.com/questions/50001124/why-does-all-columns-get-created-as-string-when-i-use-opencsvserde-in-hive
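If the goal is only to get the CTAS to succeed with the OpenCSV serde, a minimal workaround sketch (untested; it reuses the table and column names from the reproduction below) is to cast the non-string columns to string in the select:
{noformat}
create table T2 row format serde 'org.apache.hadoop.hive.serde2.OpenCSVSerde' with serdeproperties ("separatorChar" = ',' , "quoteChar" = '"') stored as textfile as select cast(abc as string) as abc from T1;{noformat}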

> ClassCastException when there is decimal type column in source table of CTAS query
> ----------------------------------------------------------------------------------
>
>                 Key: HIVE-26388
>                 URL: https://issues.apache.org/jira/browse/HIVE-26388
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Taraka Rama Rao Lethavadla
>            Priority: Major
>
> Steps to reproduce
> cat ql/src/test/queries/clientpositive/ctas_open_csv_serde.q
> {noformat}
> create table T1(abc decimal(10,0));
> insert into table T1 values(1.25);
> create table T2 row format serde 'org.apache.hadoop.hive.serde2.OpenCSVSerde' with serdeproperties ("separatorChar" = ',' , "quoteChar" = '"') stored as textfile as select * from T1;{noformat}
> Then execute the test case with the command below
> {noformat}
> mvn install -Pitests -pl itests/qtest -Dtest=TestMiniLlapLocalCliDriver -Dqfile=ctas_open_csv_serde.q -Dtest.output.overwrite{noformat}
> The exception trace looks like the following
>  
> {noformat}
> [ERROR]   TestMiniLlapLocalCliDriver.testCliDriver:62 Client execution failed with error code = 2
> running
> create table T2 row format serde 'org.apache.hadoop.hive.serde2.OpenCSVSerde' with serdeproperties ("separatorChar" = ',' , "quoteChar" = '"') stored as textfile as select * from T1
> fname=ctas_open_csv_serde.q
> See ./ql/target/tmp/log/hive.log or ./itests/qtest/target/tmp/log/hive.log, or check ./ql/target/surefire-reports or ./itests/qtest/target/surefire-reports/ for specific test cases logs.
>  org.apache.hadoop.hive.ql.metadata.HiveException: Vertex failed, vertexName=Map 1, vertexId=vertex_1657718574697_0001_2_00, diagnostics=[Task failed, taskId=task_1657718574697_0001_2_00_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : attempt_1657718574697_0001_2_00_000000_0:java.lang.RuntimeException: java.lang.RuntimeException: Hive Runtime Error while closing operators
>     at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:348)
>     at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:276)
>     at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:381)
>     at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:82)
>     at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:69)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
>     at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:69)
>     at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:39)
>     at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
>     at org.apache.hadoop.hive.llap.daemon.impl.StatsRecordingThreadPool$WrappedCallable.call(StatsRecordingThreadPool.java:118)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.RuntimeException: Hive Runtime Error while closing operators
>     at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:483)
>     at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:310)
>     ... 15 more
> Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector
>     at org.apache.hadoop.hive.serde2.OpenCSVSerde.serialize(OpenCSVSerde.java:119)
>     at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:1116)
>     at org.apache.hadoop.hive.ql.exec.vector.VectorFileSinkOperator.process(VectorFileSinkOperator.java:111)
>     at org.apache.hadoop.hive.ql.exec.Operator.vectorForward(Operator.java:931)
>     at org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.process(VectorSelectOperator.java:158)
>     at org.apache.hadoop.hive.ql.exec.Operator.vectorForward(Operator.java:919)
>     at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:171)
>     at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.closeOp(VectorMapOperator.java:1010)
>     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:686)
>     at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:459)
>     ... 16 more
> ], TaskAttempt 1 failed, info=[Error: Error while running task ( failure ) : attempt_1657718574697_0001_2_00_000000_1:java.lang.RuntimeException: java.lang.RuntimeException: Hive Runtime Error while closing operators
>     at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:348)
>     at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:276)
>     at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:381)
>     at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:82)
>     at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:69)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
>     at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:69)
>     at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:39)
>     at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
>     at org.apache.hadoop.hive.llap.daemon.impl.StatsRecordingThreadPool$WrappedCallable.call(StatsRecordingThreadPool.java:118)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.RuntimeException: Hive Runtime Error while closing operators
>     at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:483)
>     at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:310)
>     ... 15 more
> Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector
>     at org.apache.hadoop.hive.serde2.OpenCSVSerde.serialize(OpenCSVSerde.java:119)
>     at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:1116)
>     at org.apache.hadoop.hive.ql.exec.vector.VectorFileSinkOperator.process(VectorFileSinkOperator.java:111)
>     at org.apache.hadoop.hive.ql.exec.Operator.vectorForward(Operator.java:931)
>     at org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.process(VectorSelectOperator.java:158)
>     at org.apache.hadoop.hive.ql.exec.Operator.vectorForward(Operator.java:919)
>     at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:171)
>     at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.closeOp(VectorMapOperator.java:1010)
>     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:686)
>     at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:459)
>     ... 16 more
> {noformat}
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)