Posted to issues@hive.apache.org by "Hive QA (JIRA)" <ji...@apache.org> on 2017/08/21 20:08:00 UTC

[jira] [Commented] (HIVE-17365) Druid CTAS should support CHAR/VARCHAR type

    [ https://issues.apache.org/jira/browse/HIVE-17365?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16135734#comment-16135734 ] 

Hive QA commented on HIVE-17365:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12882937/HIVE-17365.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 7 failed/errored test(s), 10994 tests executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[subquery_notexists_having] (batchId=81)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_vectorized_dynamic_partition_pruning] (batchId=169)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query14] (batchId=235)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query23] (batchId=235)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionRegistrationWithCustomSchema (batchId=180)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionSpecRegistrationWithCustomSchema (batchId=180)
org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation (batchId=180)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/6473/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/6473/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-6473/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 7 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12882937 - PreCommit-HIVE-Build

> Druid CTAS should support CHAR/VARCHAR type
> -------------------------------------------
>
>                 Key: HIVE-17365
>                 URL: https://issues.apache.org/jira/browse/HIVE-17365
>             Project: Hive
>          Issue Type: Bug
>          Components: Druid integration
>    Affects Versions: 3.0.0
>            Reporter: Dileep Kumar Chiguruvada
>            Assignee: Jesus Camacho Rodriguez
>         Attachments: HIVE-17365.patch
>
>
> Currently these types are not recognized and we throw an exception when we try to create a table with them (a reproducing CTAS is sketched after the stack trace below).
> {noformat}
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.serde2.SerDeException: Unknown type: CHAR
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:788)
> 	at org.apache.hadoop.hive.ql.exec.vector.VectorFileSinkOperator.process(VectorFileSinkOperator.java:101)
> 	at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:955)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:903)
> 	at org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.process(VectorSelectOperator.java:145)
> 	at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource.processVectorGroup(ReduceRecordSource.java:478)
> 	... 19 more
> Caused by: org.apache.hadoop.hive.serde2.SerDeException: Unknown type: CHAR
> 	at org.apache.hadoop.hive.druid.serde.DruidSerDe.serialize(DruidSerDe.java:501)
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:715)
> 	... 24 more
> {noformat}
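> For illustration, a CTAS along the following lines (table and column names are placeholders, not taken from the reported case) is the kind of statement that hits this path, since the CHAR/VARCHAR columns reach DruidSerDe during serialization:
> {noformat}
> CREATE TABLE druid_chars
> STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
> TBLPROPERTIES ("druid.segment.granularity" = "DAY")
> AS
> SELECT
>   CAST(event_time AS TIMESTAMP)  AS `__time`,       -- required Druid timestamp column
>   CAST(name AS CHAR(10))         AS name_char,      -- currently fails: Unknown type: CHAR
>   CAST(city AS VARCHAR(20))      AS city_varchar,   -- currently fails: Unknown type: VARCHAR
>   metric
> FROM source_table;
> {noformat}
> Presumably the fix handles CHAR/VARCHAR columns as string dimensions, the same way STRING columns are handled today.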



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)