Posted to issues@hive.apache.org by "Teddy Choi (JIRA)" <ji...@apache.org> on 2017/02/03 12:31:51 UTC

[jira] [Assigned] (HIVE-15789) Vectorization: limit reduce vectorization to 32Mb chunks

     [ https://issues.apache.org/jira/browse/HIVE-15789?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Teddy Choi reassigned HIVE-15789:
---------------------------------

    Assignee: Teddy Choi

> Vectorization: limit reduce vectorization to 32Mb chunks
> --------------------------------------------------------
>
>                 Key: HIVE-15789
>                 URL: https://issues.apache.org/jira/browse/HIVE-15789
>             Project: Hive
>          Issue Type: Bug
>          Components: Vectorization
>            Reporter: Gopal V
>            Assignee: Teddy Choi
>
> Reduce vectorization accumulates 1024 rows before forwarding the batch to the reduce processor.
> Add a safety limit of 32Mb of writables, so that shorter sequences can be forwarded into the operator trees.
> {code}
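>         // current behavior: the batch is forwarded only once the row count reaches BATCH_SIZE (1024)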
>         rowIdx++;
>         if (rowIdx >= BATCH_SIZE) {
>           VectorizedBatchUtil.setBatchSize(batch, rowIdx);
>           reducer.process(batch, tag);
> {code}
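
For illustration, here is a minimal, self-contained sketch of the batching rule the description proposes: forward the buffered rows either when the row count reaches 1024 or when the accumulated writable bytes exceed 32Mb, whichever comes first. The class and member names (ChunkedBatcher, MAX_BATCH_BYTES, bufferedBytes) are hypothetical and not taken from the Hive source; the actual patch would apply the same dual condition inside the reduce-side loop shown above.

{code}
import java.util.ArrayList;
import java.util.List;

// Illustrative batcher: flush on 1024 rows OR 32Mb of buffered bytes,
// whichever limit is hit first. Names are hypothetical.
public class ChunkedBatcher {
  private static final int BATCH_SIZE = 1024;                     // existing row limit
  private static final long MAX_BATCH_BYTES = 32L * 1024 * 1024;  // proposed 32Mb safety limit

  private final List<byte[]> buffered = new ArrayList<>();
  private long bufferedBytes = 0;

  // Buffer one serialized row; flush when either limit is reached.
  public void add(byte[] row) {
    buffered.add(row);
    bufferedBytes += row.length;
    if (buffered.size() >= BATCH_SIZE || bufferedBytes >= MAX_BATCH_BYTES) {
      flush();
    }
  }

  // Forward the buffered rows downstream and reset the counters.
  public void flush() {
    if (buffered.isEmpty()) {
      return;
    }
    process(buffered);
    buffered.clear();
    bufferedBytes = 0;
  }

  private void process(List<byte[]> rows) {
    System.out.println("forwarding " + rows.size() + " rows");
  }

  public static void main(String[] args) {
    ChunkedBatcher b = new ChunkedBatcher();
    for (int i = 0; i < 3000; i++) {
      b.add(new byte[64 * 1024]);  // 64KB rows: the 32Mb limit triggers at 512 rows, before the 1024-row limit
    }
    b.flush();
  }
}
{code}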



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)