Posted to hdfs-user@hadoop.apache.org by Charles Robertson <ch...@gmail.com> on 2014/09/05 15:33:08 UTC

Map job not finishing

Hi all,

I'm using Oozie to run a Hive script, but the map job is not completing.
The tracking page shows its progress as 100%, and there are no warnings or
errors in the logs; it's just sitting there with a state of 'RUNNING'.

As best I can make out from the logs, the last statement in the Hive script
has been parsed successfully and Hive tries to start the command, saying
"launching job 1 of 3". That job is sitting in the "ACCEPTED" state,
but doing nothing.

This is on a single-node cluster running Hortonworks Data Platform 2.1. Can
anyone suggest what might be the cause, or where else to look for
diagnostic information?

Thanks,
Charles
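
For reference, the YARN side of a job stuck in "ACCEPTED" can be inspected
with the standard Hadoop 2.4 command line; this is only a generic sketch,
not something specific to this cluster:

    yarn application -list            # state and queue of each application
    yarn node -list                   # live NodeManagers and their capacity
    yarn logs -applicationId <app-id> # aggregated logs, once the app finishes

The ResourceManager web UI (http://<resourcemanager-host>:8088 by default)
also shows each application's queue, resource usage and diagnostics.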

Re: Map job not finishing

Posted by Ulul <ha...@ulul.org>.
Oops, you're using HDP 2.1, which means Hadoop 2.4, so the property name is
mapreduce.tasktracker.map.tasks.maximum

and, more importantly, it should be irrelevant when using YARN, for which map
slots don't matter. Explanation, anyone?

Ulul

On 07/09/2014 22:35, Ulul wrote:
> Hi
>
> Adding another TT may not be the only way; increasing
> mapred.tasktracker.map.tasks.maximum could also do the trick.
>
> Explanation here:
> http://www.thecloudavenue.com/2014/01/oozie-hangs-on-single-node-for-work-flow-with-fork.html
>
> Cheers
> Ulul
>
> On 07/09/2014 01:01, Rich Haase wrote:
>>
>> You're welcome.  Glad I could help.
>>
>> On Sep 6, 2014 9:56 AM, "Charles Robertson" 
>> <charles.robertson@gmail.com> wrote:
>>
>>     Hi Rich,
>>
>>     Default setup, so presumably one. I opted to add a node rather
>>     than change the number of task trackers and it now runs successfully.
>>
>>     Thank you!
>>     Charles
>>
>>
>>     On 5 September 2014 16:44, Rich Haase <rdhaase@gmail.com> wrote:
>>
>>         How many tasktrackers do you have setup for your single node
>>         cluster?  Oozie runs each action as a java program on an
>>         arbitrary cluster node, so running a workflow requires a
>>         minimum of two tasktrackers.
>>
>>
>>         On Fri, Sep 5, 2014 at 7:33 AM, Charles Robertson
>>         <charles.robertson@gmail.com> wrote:
>>
>>             Hi all,
>>
>>             I'm using oozie to run a hive script, but the map job is
>>             not completing. The tracking page shows its progress as
>>             100%, and there's no warnings or errors in the logs, it's
>>             just sitting there with a state of 'RUNNING'.
>>
>>             As best I can make out from the logs, the last statement
>>             in the hive script has been successfully parsed and it
>>             tries to start the command, saying "launching job 1 of
>>             3". That job is sitting there in the "ACCEPTED" state,
>>             but doing nothing.
>>
>>             This is on a single-node cluster running Hortonworks Data
>>             Platform 2.1. Can anyone suggest what might be the cause,
>>             or where else to look for diagnostic information?
>>
>>             Thanks,
>>             Charles
>>
>>
>>
>>
>>         -- 
>>         *Kernighan's Law*
>>         "Debugging is twice as hard as writing the code in the first
>>         place.  Therefore, if you write the code as cleverly as
>>         possible, you are, by definition, not smart enough to debug it."
>>
>>
>
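
One possible answer to the question above, offered only as a hypothesis:
under YARN there are no map slots, but every job still needs the
ResourceManager to grant it containers, and on a small single-node cluster
the Oozie launcher's ApplicationMaster can leave too little room for the
Hive job's own ApplicationMaster, so the second job waits in ACCEPTED.
The settings that usually govern this are standard YARN and
CapacityScheduler properties; the values below are purely illustrative:

    <!-- yarn-site.xml: memory the NodeManager offers to containers -->
    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>8192</value>
    </property>

    <!-- capacity-scheduler.xml: share of a queue usable by ApplicationMasters -->
    <property>
      <name>yarn.scheduler.capacity.maximum-am-resource-percent</name>
      <value>0.5</value>
    </property>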


Re: Map job not finishing

Posted by Ulul <ha...@ulul.org>.
Hi

Adding another TT may not be the only way; increasing
mapred.tasktracker.map.tasks.maximum could also do the trick.

Explanation here:
http://www.thecloudavenue.com/2014/01/oozie-hangs-on-single-node-for-work-flow-with-fork.html

Cheers
Ulul

On 07/09/2014 01:01, Rich Haase wrote:
>
> You're welcome.  Glad I could help.
>
> On Sep 6, 2014 9:56 AM, "Charles Robertson" 
> <charles.robertson@gmail.com> wrote:
>
>     Hi Rich,
>
>     Default setup, so presumably one. I opted to add a node rather
>     than change the number of task trackers and it now runs successfully.
>
>     Thank you!
>     Charles
>
>
>     On 5 September 2014 16:44, Rich Haase <rdhaase@gmail.com> wrote:
>
>         How many tasktrackers do you have setup for your single node
>         cluster?  Oozie runs each action as a java program on an
>         arbitrary cluster node, so running a workflow requires a
>         minimum of two tasktrackers.
>
>
>         On Fri, Sep 5, 2014 at 7:33 AM, Charles Robertson
>         <charles.robertson@gmail.com> wrote:
>
>             Hi all,
>
>             I'm using oozie to run a hive script, but the map job is
>             not completing. The tracking page shows its progress as
>             100%, and there's no warnings or errors in the logs, it's
>             just sitting there with a state of 'RUNNING'.
>
>             As best I can make out from the logs, the last statement
>             in the hive script has been successfully parsed and it
>             tries to start the command, saying "launching job 1 of 3".
>             That job is sitting there in the "ACCEPTED" state, but
>             doing nothing.
>
>             This is on a single-node cluster running Hortonworks Data
>             Platform 2.1. Can anyone suggest what might be the cause,
>             or where else to look for diagnostic information?
>
>             Thanks,
>             Charles
>
>
>
>
>         -- 
>         *Kernighan's Law*
>         "Debugging is twice as hard as writing the code in the first
>         place.  Therefore, if you write the code as cleverly as
>         possible, you are, by definition, not smart enough to debug it."
>
>
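
On an MRv1 cluster (one that actually runs TaskTrackers) the suggestion above
would translate into a mapred-site.xml entry along these lines; the value is
illustrative and the TaskTracker has to be restarted for it to take effect:

    <property>
      <name>mapred.tasktracker.map.tasks.maximum</name>
      <value>4</value>
    </property>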


Re: Map job not finishing

Posted by Rich Haase <rd...@gmail.com>.
You're welcome.  Glad I could help.
On Sep 6, 2014 9:56 AM, "Charles Robertson" <ch...@gmail.com>
wrote:

> Hi Rich,
>
> Default setup, so presumably one. I opted to add a node rather than change
> the number of task trackers and it now runs successfully.
>
> Thank you!
> Charles
>
>
> On 5 September 2014 16:44, Rich Haase <rd...@gmail.com> wrote:
>
>> How many tasktrackers do you have setup for your single node cluster?
>>  Oozie runs each action as a java program on an arbitrary cluster node, so
>> running a workflow requires a minimum of two tasktrackers.
>>
>>
>> On Fri, Sep 5, 2014 at 7:33 AM, Charles Robertson <
>> charles.robertson@gmail.com> wrote:
>>
>>> Hi all,
>>>
>>> I'm using oozie to run a hive script, but the map job is not completing.
>>> The tracking page shows its progress as 100%, and there's no warnings or
>>> errors in the logs, it's just sitting there with a state of 'RUNNING'.
>>>
>>> As best I can make out from the logs, the last statement in the hive
>>> script has been successfully parsed and it tries to start the command,
>>> saying "launching job 1 of 3". That job is sitting there in the "ACCEPTED"
>>> state, but doing nothing.
>>>
>>> This is on a single-node cluster running Hortonworks Data Platform 2.1.
>>> Can anyone suggest what might be the cause, or where else to look for
>>> diagnostic information?
>>>
>>> Thanks,
>>> Charles
>>>
>>
>>
>>
>> --
>> *Kernighan's Law*
>> "Debugging is twice as hard as writing the code in the first place.
>> Therefore, if you write the code as cleverly as possible, you are, by
>> definition, not smart enough to debug it."
>>
>
>

Re: Map job not finishing

Posted by Charles Robertson <ch...@gmail.com>.
Hi Rich,

Default setup, so presumably one. I opted to add a node rather than change
the number of task trackers and it now runs successfully.

Thank you!
Charles


On 5 September 2014 16:44, Rich Haase <rd...@gmail.com> wrote:

> How many tasktrackers do you have setup for your single node cluster?
>  Oozie runs each action as a java program on an arbitrary cluster node, so
> running a workflow requires a minimum of two tasktrackers.
>
>
> On Fri, Sep 5, 2014 at 7:33 AM, Charles Robertson <
> charles.robertson@gmail.com> wrote:
>
>> Hi all,
>>
>> I'm using oozie to run a hive script, but the map job is not completing.
>> The tracking page shows its progress as 100%, and there's no warnings or
>> errors in the logs, it's just sitting there with a state of 'RUNNING'.
>>
>> As best I can make out from the logs, the last statement in the hive
>> script has been successfully parsed and it tries to start the command,
>> saying "launching job 1 of 3". That job is sitting there in the "ACCEPTED"
>> state, but doing nothing.
>>
>> This is on a single-node cluster running Hortonworks Data Platform 2.1.
>> Can anyone suggest what might be the cause, or where else to look for
>> diagnostic information?
>>
>> Thanks,
>> Charles
>>
>
>
>
> --
> *Kernighan's Law*
> "Debugging is twice as hard as writing the code in the first place.
> Therefore, if you write the code as cleverly as possible, you are, by
> definition, not smart enough to debug it."
>

Re: Map job not finishing

Posted by Rich Haase <rd...@gmail.com>.
How many tasktrackers do you have set up for your single-node cluster?
Oozie runs each action as a Java program on an arbitrary cluster node, so
running a workflow requires a minimum of two tasktrackers.


On Fri, Sep 5, 2014 at 7:33 AM, Charles Robertson <
charles.robertson@gmail.com> wrote:

> Hi all,
>
> I'm using oozie to run a hive script, but the map job is not completing.
> The tracking page shows its progress as 100%, and there's no warnings or
> errors in the logs, it's just sitting there with a state of 'RUNNING'.
>
> As best I can make out from the logs, the last statement in the hive
> script has been successfully parsed and it tries to start the command,
> saying "launching job 1 of 3". That job is sitting there in the "ACCEPTED"
> state, but doing nothing.
>
> This is on a single-node cluster running Hortonworks Data Platform 2.1.
> Can anyone suggest what might be the cause, or where else to look for
> diagnostic information?
>
> Thanks,
> Charles
>



-- 
*Kernighan's Law*
"Debugging is twice as hard as writing the code in the first place.
Therefore, if you write the code as cleverly as possible, you are, by
definition, not smart enough to debug it."
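
As background for the two-slot requirement: Oozie submits each action through
a small "launcher" job that itself occupies a task while the action's real job
runs, so even a workflow with a single Hive action needs room for two jobs at
once. A minimal hive-action workflow, sketched here with placeholder names and
paths, looks like this:

    <workflow-app name="hive-wf" xmlns="uri:oozie:workflow:0.4">
      <start to="hive-node"/>
      <action name="hive-node">
        <hive xmlns="uri:oozie:hive-action:0.2">
          <job-tracker>${jobTracker}</job-tracker>
          <name-node>${nameNode}</name-node>
          <script>script.q</script>
        </hive>
        <ok to="end"/>
        <error to="fail"/>
      </action>
      <kill name="fail">
        <message>Hive action failed</message>
      </kill>
      <end name="end"/>
    </workflow-app>

While the launcher for "hive-node" holds its slot, the MapReduce job that Hive
generates must be able to start as well; with only one slot the two block each
other, which matches the hang described in this thread.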
