Posted to hdfs-user@hadoop.apache.org by lei liu <li...@gmail.com> on 2014/07/07 03:34:18 UTC

java.net.SocketTimeoutException: read(2) error: Resource temporarily unavailable

I am using hbase-0.94 and hadoop-2.2, and I am seeing the exception below:

2014-07-04 12:43:49,700 WARN org.apache.hadoop.hdfs.DFSClient: failed to connect to DomainSocket(fd=322,path=/home/hadoop/hadoop-current/cdh4-dn-socket/dn_socket)
java.net.SocketTimeoutException: read(2) error: Resource temporarily unavailable
        at org.apache.hadoop.net.unix.DomainSocket.readArray0(Native Method)
        at org.apache.hadoop.net.unix.DomainSocket.access$200(DomainSocket.java:47)
        at org.apache.hadoop.net.unix.DomainSocket$DomainInputStream.read(DomainSocket.java:530)
        at java.io.FilterInputStream.read(FilterInputStream.java:66)
        at org.apache.hadoop.hdfs.protocol.HdfsProtoUtil.vintPrefixed(HdfsProtoUtil.java:169)
        at org.apache.hadoop.hdfs.BlockReaderFactory.newShortCircuitBlockReader(BlockReaderFactory.java:187)
        at org.apache.hadoop.hdfs.BlockReaderFactory.newBlockReader(BlockReaderFactory.java:104)
        at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:1060)
        at org.apache.hadoop.hdfs.DFSInputStream.fetchBlockByteRange(DFSInputStream.java:898)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:1148)
        at org.apache.hadoop.fs.FSDataInputStream.read(FSDataInputStream.java:73)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1388)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockDataInternal(HFileBlock.java:1880)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1723)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:365)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.readNextDataBlock(HFileReaderV2.java:633)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:730)
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:128)



Why does the exception "java.net.SocketTimeoutException: read(2) error:
Resource temporarily unavailable" occur?


 Thanks,
LiuLei

Re: java.net.SocketTimeoutException: read(2) error: Resource temporarily unavailable

Posted by Jungi Jeong <jg...@calab.kaist.ac.kr>.
Not sure exactly, but I had similar issues with Hadoop 1.2.1.
If your machines do not have any network-related issues (DNS, ping, etc.),
it can be disk-related (heavy disk usage).
When a machine is under heavy disk I/O, things slow down and the socket
timeout expires, so this error can happen.
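
Also worth noting: your stack trace shows the DFS client timing out on a
read from the DataNode's domain socket (a short-circuit read), so the
client-side read timeout may be the one that expires. A hedged sketch of
raising it in the client's hdfs-site.xml, assuming Hadoop 2.x property
names (120000 ms is only an example value; the default read timeout
should be 60000 ms):

        <property>
          <!-- HDFS client socket read timeout, in milliseconds (example value) -->
          <name>dfs.client.socket-timeout</name>
          <value>120000</value>
        </property>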

I resolved (avoided) this issue by setting
dfs.datanode.socket.write.timeout = 0.
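
A minimal sketch of that workaround in the DataNode's hdfs-site.xml
(Hadoop 2.x property name; a value of 0 disables the write timeout
entirely, instead of the 480000 ms default):

        <property>
          <!-- 0 = never time out writes on the DataNode socket -->
          <name>dfs.datanode.socket.write.timeout</name>
          <value>0</value>
        </property>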

I really hope this helps you,
thanks.

- Jungi


On 7 July 2014 10:34, lei liu <li...@gmail.com> wrote:

> I am using hbase-0.94 and hadoop-2.2, and I am seeing the exception below:
>
> 2014-07-04 12:43:49,700 WARN org.apache.hadoop.hdfs.DFSClient: failed to connect to DomainSocket(fd=322,path=/home/hadoop/hadoop-current/cdh4-dn-socket/dn_socket)
> java.net.SocketTimeoutException: read(2) error: Resource temporarily unavailable
>         at org.apache.hadoop.net.unix.DomainSocket.readArray0(Native Method)
>         at org.apache.hadoop.net.unix.DomainSocket.access$200(DomainSocket.java:47)
>         at org.apache.hadoop.net.unix.DomainSocket$DomainInputStream.read(DomainSocket.java:530)
>         at java.io.FilterInputStream.read(FilterInputStream.java:66)
>         at org.apache.hadoop.hdfs.protocol.HdfsProtoUtil.vintPrefixed(HdfsProtoUtil.java:169)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.newShortCircuitBlockReader(BlockReaderFactory.java:187)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.newBlockReader(BlockReaderFactory.java:104)
>         at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:1060)
>         at org.apache.hadoop.hdfs.DFSInputStream.fetchBlockByteRange(DFSInputStream.java:898)
>         at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:1148)
>         at org.apache.hadoop.fs.FSDataInputStream.read(FSDataInputStream.java:73)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1388)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockDataInternal(HFileBlock.java:1880)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1723)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:365)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.readNextDataBlock(HFileReaderV2.java:633)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:730)
>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:128)
>
> Why does the exception "java.net.SocketTimeoutException: read(2) error:
> Resource temporarily unavailable" occur?
>
> Thanks,
> LiuLei
>



-- 
Jungi Jeong
M.S. Candidate, Computer Architecture Lab.
Div. of Computer Science, KAIST
