Posted to user@pig.apache.org by kiranprasad <ki...@imimobile.com> on 2011/10/04 13:53:26 UTC
ERROR 1066: Unable to open iterator for alias A. Backend error : Could not obtain block:
I am getting the below exception when trying to execute a Pig Latin script.
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_201110042009_0005 A MAP_ONLY Message: Job failed! hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019,
Input(s):
Failed to read data from "/data/arpumsisdn.txt"
Output(s):
Failed to produce result in "hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_201110042009_0005
2011-10-04 22:13:53,736 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2011-10-04 22:13:53,745 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias A. Backend error : Could not obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
Details at logfile: /home/kiranprasad.g/pig-0.8.1/pig_1317746514798.log
Regards
Kiran.G
Re: ERROR 1066: Unable to open iterator for alias A. Backend error :
Could not obtain block:
Posted by Dmitriy Ryaboy <dv...@gmail.com>.
Do you have any nodes down? Bad disks?
Try to recover those nodes. What level of replication were you running with?
btw this is really an HDFS issue, not a pig issue, so you'll likely get
better support from the hdfs user list.
D
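
Dmitriy's checks above (dead nodes, bad disks, replication level) can be run from the Hadoop shell. A minimal sketch, assuming the 0.20-era CLI used elsewhere in this thread; these commands need a live cluster, so treat them as illustrative:

```shell
# Summarize datanode health; dead nodes are listed at the
# end of the report.
bin/hadoop dfsadmin -report

# The second column of -ls output is the file's replication factor.
bin/hadoop fs -ls /data/arpumsisdn.txt
```

If the report shows dead datanodes, bringing them back online is usually enough for the NameNode to find the missing replicas again.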
On Thu, Oct 6, 2011 at 11:02 PM, kiranprasad <ki...@imimobile.com> wrote:
>
> Hi Alex
>
> Thanks for your response.
>
> I've checked with the below-mentioned command and I am getting:
>
> [kiranprasad.g@pig4 hadoop-0.20.2]$ bin/hadoop fs -text
> /data/arpumsisdn.txt | tail
> 11/10/07 16:17:18 INFO hdfs.DFSClient: No node available for block:
> blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
> 11/10/07 16:17:18 INFO hdfs.DFSClient: Could not obtain block
> blk_-8354424441116992221_1060 from any node: java.io.IOException: No live
> nodes contain current block
> 11/10/07 16:17:21 INFO hdfs.DFSClient: No node available for block:
> blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
> 11/10/07 16:17:21 INFO hdfs.DFSClient: Could not obtain block
> blk_-8354424441116992221_1060 from any node: java.io.IOException: No live
> nodes contain current block
> 11/10/07 16:17:25 INFO hdfs.DFSClient: No node available for block:
> blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
> 11/10/07 16:17:25 INFO hdfs.DFSClient: Could not obtain block
> blk_-8354424441116992221_1060 from any node: java.io.IOException: No live
> nodes contain current block
> 11/10/07 16:17:29 WARN hdfs.DFSClient: DFS Read: java.io.IOException: Could
> not obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
> at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
> at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
> at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
> at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1695)
> at java.io.DataInputStream.readShort(DataInputStream.java:295)
> at org.apache.hadoop.fs.FsShell.forMagic(FsShell.java:397)
> at org.apache.hadoop.fs.FsShell.access$200(FsShell.java:49)
> at org.apache.hadoop.fs.FsShell$2.process(FsShell.java:420)
> at org.apache.hadoop.fs.FsShell$DelayedExceptionThrowing.globAndProcess(FsShell.java:1898)
> at org.apache.hadoop.fs.FsShell.text(FsShell.java:414)
> at org.apache.hadoop.fs.FsShell.doall(FsShell.java:1563)
> at org.apache.hadoop.fs.FsShell.run(FsShell.java:1763)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at org.apache.hadoop.fs.FsShell.main(FsShell.java:1880)
>
> text: Could not obtain block: blk_-8354424441116992221_1060
> file=/data/arpumsisdn.txt
>
>
> The block is not available. How do I recover the data block?
>
>
> -----Original Message----- From: Alex Rovner
> Sent: Wednesday, October 05, 2011 5:55 PM
> To: user@pig.apache.org
> Subject: Re: ERROR 1066: Unable to open iterator for alias A. Backend error
> : Could not obtain block:
>
>
> You can quickly test whether that's the issue by running the following
> command:
>
> hadoop fs -text /data/arpumsisdn.txt | tail
>
> On Wed, Oct 5, 2011 at 8:24 AM, Alex Rovner <al...@gmail.com> wrote:
>
> Kiran,
>>
>> This looks like your HDFS is missing some blocks. Can you run fsck and see
>> if you have missing blocks and if so for what files?
>>
>> http://hadoop.apache.org/common/docs/r0.17.2/hdfs_user_guide.html#Fsck
>>
>> Alex
>>
>>
>> On Tue, Oct 4, 2011 at 7:53 AM, kiranprasad <ki...@imimobile.com> wrote:
>>
>> I am getting the below exception when trying to execute a Pig Latin script.
>>>
>>> Failed!
>>>
>>> Failed Jobs:
>>> JobId Alias Feature Message Outputs
>>> job_201110042009_0005 A MAP_ONLY Message: Job failed!
>>> hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019,
>>>
>>> Input(s):
>>> Failed to read data from "/data/arpumsisdn.txt"
>>>
>>> Output(s):
>>> Failed to produce result in "hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019"
>>>
>>> Counters:
>>> Total records written : 0
>>> Total bytes written : 0
>>> Spillable Memory Manager spill count : 0
>>> Total bags proactively spilled: 0
>>> Total records proactively spilled: 0
>>>
>>> Job DAG:
>>> job_201110042009_0005
>>>
>>>
>>> 2011-10-04 22:13:53,736 [main] INFO
>>>
>>> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
>>> - Failed!
>>> 2011-10-04 22:13:53,745 [main] ERROR org.apache.pig.tools.grunt.Grunt -
>>> ERROR 1066: Unable to open iterator for alias A. Backend error : Could
>>> not
>>> obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
>>> Details at logfile: /home/kiranprasad.g/pig-0.8.1/pig_1317746514798.log
>>>
>>>
>>>
>>> Regards
>>> Kiran.G
>>>
>>
>>
>>
>>
>
>
Re: ERROR 1066: Unable to open iterator for alias A. Backend error : Could not obtain block:
Posted by kiranprasad <ki...@imimobile.com>.
Hi Alex
Thanks for your response.
I've checked with the below-mentioned command and I am getting:
[kiranprasad.g@pig4 hadoop-0.20.2]$ bin/hadoop fs -text /data/arpumsisdn.txt
| tail
11/10/07 16:17:18 INFO hdfs.DFSClient: No node available for block:
blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
11/10/07 16:17:18 INFO hdfs.DFSClient: Could not obtain block
blk_-8354424441116992221_1060 from any node: java.io.IOException: No live
nodes contain current block
11/10/07 16:17:21 INFO hdfs.DFSClient: No node available for block:
blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
11/10/07 16:17:21 INFO hdfs.DFSClient: Could not obtain block
blk_-8354424441116992221_1060 from any node: java.io.IOException: No live
nodes contain current block
11/10/07 16:17:25 INFO hdfs.DFSClient: No node available for block:
blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
11/10/07 16:17:25 INFO hdfs.DFSClient: Could not obtain block
blk_-8354424441116992221_1060 from any node: java.io.IOException: No live
nodes contain current block
11/10/07 16:17:29 WARN hdfs.DFSClient: DFS Read: java.io.IOException: Could
not obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1695)
at java.io.DataInputStream.readShort(DataInputStream.java:295)
at org.apache.hadoop.fs.FsShell.forMagic(FsShell.java:397)
at org.apache.hadoop.fs.FsShell.access$200(FsShell.java:49)
at org.apache.hadoop.fs.FsShell$2.process(FsShell.java:420)
at org.apache.hadoop.fs.FsShell$DelayedExceptionThrowing.globAndProcess(FsShell.java:1898)
at org.apache.hadoop.fs.FsShell.text(FsShell.java:414)
at org.apache.hadoop.fs.FsShell.doall(FsShell.java:1563)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:1763)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:1880)
text: Could not obtain block: blk_-8354424441116992221_1060
file=/data/arpumsisdn.txt
The block is not available. How do I recover the data block?
-----Original Message-----
From: Alex Rovner
Sent: Wednesday, October 05, 2011 5:55 PM
To: user@pig.apache.org
Subject: Re: ERROR 1066: Unable to open iterator for alias A. Backend error
: Could not obtain block:
You can quickly test whether that's the issue by running the following
command:
hadoop fs -text /data/arpumsisdn.txt | tail
On Wed, Oct 5, 2011 at 8:24 AM, Alex Rovner <al...@gmail.com> wrote:
> Kiran,
>
> This looks like your HDFS is missing some blocks. Can you run fsck and see
> if you have missing blocks and if so for what files?
>
> http://hadoop.apache.org/common/docs/r0.17.2/hdfs_user_guide.html#Fsck
>
> Alex
>
>
> On Tue, Oct 4, 2011 at 7:53 AM, kiranprasad
> <ki...@imimobile.com> wrote:
>
>> I am getting the below exception when trying to execute a Pig Latin script.
>>
>> Failed!
>>
>> Failed Jobs:
>> JobId Alias Feature Message Outputs
>> job_201110042009_0005 A MAP_ONLY Message: Job failed!
>> hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019,
>>
>> Input(s):
>> Failed to read data from "/data/arpumsisdn.txt"
>>
>> Output(s):
>> Failed to produce result in "hdfs://
>> 10.0.0.61/tmp/temp1751671187/tmp-592386019"
>>
>> Counters:
>> Total records written : 0
>> Total bytes written : 0
>> Spillable Memory Manager spill count : 0
>> Total bags proactively spilled: 0
>> Total records proactively spilled: 0
>>
>> Job DAG:
>> job_201110042009_0005
>>
>>
>> 2011-10-04 22:13:53,736 [main] INFO
>>
>> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
>> - Failed!
>> 2011-10-04 22:13:53,745 [main] ERROR org.apache.pig.tools.grunt.Grunt -
>> ERROR 1066: Unable to open iterator for alias A. Backend error : Could
>> not
>> obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
>> Details at logfile: /home/kiranprasad.g/pig-0.8.1/pig_1317746514798.log
>>
>>
>>
>> Regards
>> Kiran.G
>
>
>
Re: ERROR 1066: Unable to open iterator for alias A. Backend error :
Could not obtain block:
Posted by Alex Rovner <al...@gmail.com>.
You can quickly test whether that's the issue by running the following
command:
hadoop fs -text /data/arpumsisdn.txt | tail
On Wed, Oct 5, 2011 at 8:24 AM, Alex Rovner <al...@gmail.com> wrote:
> Kiran,
>
> This looks like your HDFS is missing some blocks. Can you run fsck and see
> if you have missing blocks and if so for what files?
>
> http://hadoop.apache.org/common/docs/r0.17.2/hdfs_user_guide.html#Fsck
>
> Alex
>
>
> On Tue, Oct 4, 2011 at 7:53 AM, kiranprasad <ki...@imimobile.com> wrote:
>
>> I am getting the below exception when trying to execute a Pig Latin script.
>>
>> Failed!
>>
>> Failed Jobs:
>> JobId Alias Feature Message Outputs
>> job_201110042009_0005 A MAP_ONLY Message: Job failed!
>> hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019,
>>
>> Input(s):
>> Failed to read data from "/data/arpumsisdn.txt"
>>
>> Output(s):
>> Failed to produce result in "hdfs://
>> 10.0.0.61/tmp/temp1751671187/tmp-592386019"
>>
>> Counters:
>> Total records written : 0
>> Total bytes written : 0
>> Spillable Memory Manager spill count : 0
>> Total bags proactively spilled: 0
>> Total records proactively spilled: 0
>>
>> Job DAG:
>> job_201110042009_0005
>>
>>
>> 2011-10-04 22:13:53,736 [main] INFO
>> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
>> - Failed!
>> 2011-10-04 22:13:53,745 [main] ERROR org.apache.pig.tools.grunt.Grunt -
>> ERROR 1066: Unable to open iterator for alias A. Backend error : Could not
>> obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
>> Details at logfile: /home/kiranprasad.g/pig-0.8.1/pig_1317746514798.log
>>
>>
>>
>> Regards
>> Kiran.G
>
>
>
Re: ERROR 1066: Unable to open iterator for alias A. Backend error :
Could not obtain block:
Posted by Alex Rovner <al...@gmail.com>.
Kiran,
This looks like your HDFS is missing some blocks. Can you run fsck and see
if you have missing blocks and if so for what files?
http://hadoop.apache.org/common/docs/r0.17.2/hdfs_user_guide.html#Fsck
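
For the file in this thread, the fsck check might look like the following. A sketch against the 0.20-era shell used above; it needs a live cluster, so the commands are illustrative:

```shell
# Check only the affected file; -files -blocks -locations prints each
# block and which datanodes (if any) still hold a replica.
bin/hadoop fsck /data/arpumsisdn.txt -files -blocks -locations

# Whole-filesystem check; look for missing or corrupt blocks
# in the summary at the end.
bin/hadoop fsck /
```

If a block has no surviving replica on any datanode, fsck reports the file as corrupt; the data can then only be recovered from the original source or a backup.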
Alex
On Tue, Oct 4, 2011 at 7:53 AM, kiranprasad <ki...@imimobile.com> wrote:
> I am getting the below exception when trying to execute a Pig Latin script.
>
> Failed!
>
> Failed Jobs:
> JobId Alias Feature Message Outputs
> job_201110042009_0005 A MAP_ONLY Message: Job failed!
> hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019,
>
> Input(s):
> Failed to read data from "/data/arpumsisdn.txt"
>
> Output(s):
> Failed to produce result in "hdfs://
> 10.0.0.61/tmp/temp1751671187/tmp-592386019"
>
> Counters:
> Total records written : 0
> Total bytes written : 0
> Spillable Memory Manager spill count : 0
> Total bags proactively spilled: 0
> Total records proactively spilled: 0
>
> Job DAG:
> job_201110042009_0005
>
>
> 2011-10-04 22:13:53,736 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - Failed!
> 2011-10-04 22:13:53,745 [main] ERROR org.apache.pig.tools.grunt.Grunt -
> ERROR 1066: Unable to open iterator for alias A. Backend error : Could not
> obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
> Details at logfile: /home/kiranprasad.g/pig-0.8.1/pig_1317746514798.log
>
>
>
> Regards
> Kiran.G