Posted to user@hive.apache.org by Jihyun Suh <jh...@gmail.com> on 2014/12/05 01:09:54 UTC

# of failed Map Tasks exceeded allowed limit

I have a problem inserting data from an external table into a managed table.

The HQL below works when I add 'limit 10', but it fails when I run it over the full data set.

I get the error '# of failed Map Tasks exceeded allowed limit'.

What should I do?

---------------------------------------------------------------------------------

cause: SQLException:Error while executing statement:



INSERT OVERWRITE TABLE newTbl PARTITION(etl_ymd, etl_hh)
SELECT rec_type,
    hash(substr(svd_name,1,case when instr(svd_name,'F')=0 then length(svd_name) when instr(svd_name,'F')>30 then 30 else instr(svd_name,'F')-1 end)) as imsi,
    hash(substr(svd_nick,1,case when instr(svd_nick,'F')=0 then length(svd_nick) when instr(svd_nick,'F')>0 then 30 else instr(svd_nick,'F')-1 end)) as imei,
    hash(a.private) as private,
    hash(case when substr(a.private,3,2)='82' then concat('0',substr(a.private,5)) else substr(a.private,5) end) as num,
    evt_timestamp,
    msg_ref,
    hash(dest_no) as dest_no,
    system_type,
    a.file_name as file_name,
    concat(substr(a.file_name,2,2),substr(a.file_name,6,1)) as own_no,
    etl_ymd,
    etl_hh
FROM (
    select *,
        substr(svd_private,1,case when instr(svd_private,'F')=0 then length(svd_private) when instr(svd_private,'F')>30 then 30 else instr(svd_private,'F')-1 end) as private,
        split(split(INPUT__FILE__NAME, "etl_hh=[0-9]+/")[1], ".DAT")[0] as file_name
    from oldTbl
) a



by FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask, SQLState = 08S01, ResponseCode = 2

org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:168)
org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:156)
org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:198)
com.nexr.ndap.driver.hive.jdbc.HiveJdbcConnector.doExecute(HiveJdbcConnector.java:169)
com.nexr.ndap.driver.hive.jdbc.HiveJdbcConnector.executeQueryWithSession(HiveJdbcConnector.java:856)
com.nexr.ndap.driver.hive.ConcurrentHiveConnector.executeQueryWithSession(ConcurrentHiveConnector.java:283)
com.nexr.ndap.service.workbench.impl.HiveQueryServiceImpl.runQuery(HiveQueryServiceImpl.java:582)
com.nexr.ndap.service.workbench.impl.HiveQueryServiceImpl.runQuery(HiveQueryServiceImpl.java:433)
sun.reflect.GeneratedMethodAccessor743.invoke(Unknown Source)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
java.lang.reflect.Method.invoke(Method.java:597)
org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:309)
org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110)
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
org.springframework.aop.interceptor.AsyncExecutionInterceptor$1.call(AsyncExecutionInterceptor.java:80)
java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
java.util.concurrent.FutureTask.run(FutureTask.java:138)
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
java.lang.Thread.run(Thread.java:662)
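For reference, the two trickiest expressions in the query are the "truncate at the first 'F' filler character, capped at 30 characters" pattern and the INPUT__FILE__NAME parsing. A minimal Python sketch of both (the function names are mine, not from the query; note that Hive's instr() is 1-based and returns 0 when the character is absent, and Hive's split() takes a regex, so the dot in ".DAT" is a wildcard):

```python
import re

def strip_filler(value: str) -> str:
    """Mirror substr(col, 1, case when instr(col,'F')=0 then length(col)
    when instr(col,'F')>30 then 30 else instr(col,'F')-1 end)."""
    pos = value.find('F') + 1          # instr(): 1-based, 0 if 'F' is absent
    if pos == 0:
        length = len(value)            # no filler: keep the whole value
    elif pos > 30:
        length = 30                    # filler starts late: cap at 30 chars
    else:
        length = pos - 1               # keep everything before the first 'F'
    return value[:length]

def base_file_name(path: str) -> str:
    """Mirror split(split(INPUT__FILE__NAME, "etl_hh=[0-9]+/")[1], ".DAT")[0]:
    keep what follows the etl_hh= partition directory, drop the .DAT suffix."""
    after_partition = re.split(r"etl_hh=[0-9]+/", path)[1]
    # Hive's split() treats ".DAT" as a regex, so the dot is a wildcard here too.
    return re.split(r".DAT", after_partition)[0]
```

Incidentally, the imei branch tests instr(svd_nick,'F')>0 where the imsi and private branches test >30, which makes its else arm unreachable; that may or may not be intended.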



--------------------------------------------------------------------------------

= job detail log

Hadoop job_201406192109_16353 on p-bigai-pk1-a03

User: admin

Job Name: INSERT OVERWRITE TABLE mzn.mzn_cdr_3gsms...a(Stage-1)

Job File: hdfs://p-bigai-pk1-a03:9000/tmp/hadoop-mapred/mapred/staging/admin/.staging/job_201406192109_16353/job.xml

Submit Host: p-bigai-pk1-a01

Submit Host Address: 10.220.78.91

Job-ACLs: All users are allowed

Job Setup: Successful

Status: Failed

Failure Info: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201406192109_16353_m_000006

Started at: Wed Dec 03 14:18:36 KST 2014

Failed at: Wed Dec 03 14:19:04 KST 2014

Failed in: 27sec

Job Cleanup: Successful


Black-listed TaskTrackers: 19

Re: # of failed Map Tasks exceeded allowed limit

Posted by Jihyun Suh <jh...@gmail.com>.
I found the real error in the failed task's log and solved the problem.

"# of failed Map Tasks exceeded allowed limit" is not the real problem; it is only the job-level summary, so the actual cause has to be read from the task log.
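As an illustrative sketch (the helper name is mine, not part of Hadoop), the failed task id can be pulled out of the job's Failure Info line so its per-task log can be looked up in the JobTracker web UI:

```python
import re
from typing import Optional

def failed_task_id(failure_info: str) -> Optional[str]:
    """Extract the task id from a JobTracker 'Failure Info' line,
    e.g. '... LastFailedTask: task_201406192109_16353_m_000006'."""
    m = re.search(r"LastFailedTask:\s*(task_\S+)", failure_info)
    return m.group(1) if m else None
```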
