Posted to issues@hive.apache.org by "ASF GitHub Bot (Jira)" <ji...@apache.org> on 2023/01/08 00:22:00 UTC

[jira] [Work logged] (HIVE-26713) StringExpr ArrayIndexOutOfBoundsException with LIKE '%xxx%'

     [ https://issues.apache.org/jira/browse/HIVE-26713?focusedWorklogId=837709&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-837709 ]

ASF GitHub Bot logged work on HIVE-26713:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 08/Jan/23 00:21
            Start Date: 08/Jan/23 00:21
    Worklog Time Spent: 10m 
      Work Description: github-actions[bot] commented on PR #3739:
URL: https://github.com/apache/hive/pull/3739#issuecomment-1374667080

   This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
   Feel free to reach out on the dev@hive.apache.org list if the patch is in need of reviews.




Issue Time Tracking
-------------------

    Worklog Id:     (was: 837709)
    Time Spent: 0.5h  (was: 20m)

> StringExpr ArrayIndexOutOfBoundsException with LIKE '%xxx%'
> -----------------------------------------------------------
>
>                 Key: HIVE-26713
>                 URL: https://issues.apache.org/jira/browse/HIVE-26713
>             Project: Hive
>          Issue Type: Bug
>          Components: storage-api
>    Affects Versions: All Versions
>            Reporter: Ryu Kobayashi
>            Assignee: Ryu Kobayashi
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> When a LIKE '%xxx%' search is performed and the string contains control characters (bytes with the high bit set), an array index overflow occurs at the following line.
> https://github.com/apache/hive/blob/master/storage-api/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/StringExpr.java#L345
> {code:java}
> // input[next] == -1
> // shift[input[next] & MAX_BYTE] == 255
> next += shift[input[next] & MAX_BYTE]; {code}
>  
> Stack trace:
> {code:java}
> TaskAttempt 3 failed, info=[Error: Error while running task ( failure ) : attempt_1665986828766_64791_1_00_000000_3:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row 
> 	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:220)
> 	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:177)
> 	at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:479)
> 	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
> 	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1893)
> 	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
> 	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
> 	at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
> 	at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:108)
> 	at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:41)
> 	at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:77)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:750)
> Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row 
> 	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:95)
> 	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:70)
> 	at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:419)
> 	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:194)
> 	... 16 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row 
> 	at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.process(VectorMapOperator.java:883)
> 	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:86)
> 	... 19 more
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 255
> 	at org.apache.hadoop.hive.ql.exec.vector.expressions.StringExpr$BoyerMooreHorspool.find(StringExpr.java:409)
> 	at org.apache.hadoop.hive.ql.exec.vector.expressions.AbstractFilterStringColLikeStringScalar$MiddleChecker.index(AbstractFilterStringColLikeStringScalar.java:314)
> 	at org.apache.hadoop.hive.ql.exec.vector.expressions.AbstractFilterStringColLikeStringScalar$MiddleChecker.check(AbstractFilterStringColLikeStringScalar.java:307)
> 	at org.apache.hadoop.hive.ql.exec.vector.expressions.AbstractFilterStringColLikeStringScalar.evaluate(AbstractFilterStringColLikeStringScalar.java:115)
> 	at org.apache.hadoop.hive.ql.exec.vector.expressions.FilterExprOrExpr.evaluate(FilterExprOrExpr.java:183)
> 	at org.apache.hadoop.hive.ql.exec.vector.expressions.FilterExprAndExpr.evaluate(FilterExprAndExpr.java:42)
> 	at org.apache.hadoop.hive.ql.exec.vector.VectorFilterOperator.process(VectorFilterOperator.java:113)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
> 	at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
> 	at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.process(VectorMapOperator.java:783)
> 	... 20 more
> ]], Vertex did not suc {code}
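The failure mode described above can be reproduced outside Hive. This is a minimal sketch (all names here are hypothetical, not Hive's actual code): Java bytes are signed, so a byte with the high bit set reads back negative, and masking with 0xFF recovers the unsigned value 0..255. If the Horspool shift table is mistakenly allocated with 0xFF slots instead of 0x100, index 255 then throws the same ArrayIndexOutOfBoundsException: 255 seen in the stack trace above.

```java
// Hypothetical reproduction of the signed-byte / shift-table interaction.
public class SignedByteMask {
    static final int MAX_BYTE = 0xff;

    // Masking a signed byte with 0xFF yields its unsigned value 0..255.
    public static int maskedIndex(byte b) {
        return b & MAX_BYTE;
    }

    public static void main(String[] args) {
        byte high = (byte) 0xFF;                 // a non-ASCII / control byte
        System.out.println(high);                // signed: -1
        System.out.println(maskedIndex(high));   // unsigned: 255

        int[] goodShift = new int[MAX_BYTE + 1]; // 256 slots, indices 0..255
        int[] badShift  = new int[MAX_BYTE];     // 255 slots, indices 0..254

        System.out.println(goodShift[maskedIndex(high)]); // in bounds
        try {
            int unused = badShift[maskedIndex(high)];     // index 255 overflows
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("out of bounds at index 255");
        }
    }
}
```

Sizing any byte-indexed lookup table as MAX_BYTE + 1 (or, equivalently, 256) removes the failure, since `b & 0xFF` can produce every index from 0 through 255.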



--
This message was sent by Atlassian Jira
(v8.20.10#820010)