Posted to user@hive.apache.org by "Chang.Wu" <58...@qq.com> on 2017/06/06 04:25:26 UTC
How to set up the max memory for my big Hive SQL which is on MapReduce of Yarn
My Hive execution engine is MapReduce on YARN. What I urgently need is to limit the memory usage of my big SQL queries, so that a bigger query simply runs longer instead of using up all the resources of its queue, or even of the whole YARN cluster, at once.
But I cannot find any way to define an upper threshold on a query's resource usage.
Can anyone give me some suggestions?
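(For what it's worth, the closest per-query knobs in Hive on MapReduce are the per-container memory settings, which can be set at the session level. A minimal sketch, with illustrative values and a hypothetical queue name "etl" -- note these cap memory per container, not the job's total footprint:

```sql
-- Cap memory requested per map/reduce container (values are illustrative):
SET mapreduce.map.memory.mb=2048;
SET mapreduce.reduce.memory.mb=4096;
-- Keep the JVM heap below the container cap (common rule of thumb: ~80%):
SET mapreduce.map.java.opts=-Xmx1638m;
SET mapreduce.reduce.java.opts=-Xmx3276m;
-- Route the query to a capped queue ("etl" is a hypothetical queue name):
SET mapreduce.job.queuename=etl;
```

The total resource usage of the job is then bounded by whatever limits the queue itself enforces, which is a scheduler-side setting.)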
583424568@qq.com
Re: How to set up the max memory for my big Hive SQL which is on MapReduce of Yarn
Posted by Stephen Sprague <sp...@gmail.com>.
Have you researched the YARN schedulers, namely the Capacity Scheduler and the Fair Scheduler? Those are the places where resource limits can be easily defined.
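As a concrete sketch of that suggestion, assuming the Capacity Scheduler: a queue can be given a capacity cap in capacity-scheduler.xml, and any job submitted to it is then bounded by that share of the cluster. The queue name "etl" and the percentages below are hypothetical:

```xml
<!-- capacity-scheduler.xml: define a hypothetical "etl" queue under root -->
<property>
  <name>yarn.scheduler.capacity.root.queues</name>
  <value>default,etl</value>
</property>
<!-- guaranteed share: 30% of cluster resources -->
<property>
  <name>yarn.scheduler.capacity.root.etl.capacity</name>
  <value>30</value>
</property>
<!-- hard ceiling: the queue can never grow beyond 40%, even when idle capacity exists -->
<property>
  <name>yarn.scheduler.capacity.root.etl.maximum-capacity</name>
  <value>40</value>
</property>
<!-- keep a single user from exceeding the queue's configured capacity -->
<property>
  <name>yarn.scheduler.capacity.root.etl.user-limit-factor</name>
  <value>1</value>
</property>
```

With `maximum-capacity` set, a big Hive job submitted to that queue (e.g. via `SET mapreduce.job.queuename=etl;`) queues up its remaining containers and runs longer rather than consuming the whole cluster.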
On Mon, Jun 5, 2017 at 9:25 PM, Chang.Wu <58...@qq.com> wrote:
> My Hive engine is MapReduce and Yarn. What my urgent need is to limit the
> memory usage of my big sql so that my bigger sql will run a longer time
> instead of using up all the resource of queue or even all the resource of
> my whole yarn cluster at once.
> But I cannot find any solution to define my sql resource-usage upper
> threshold.
> Any one can give me some suggestions.
>
> 583424568@qq.com
>