Posted to user@hadoop.apache.org by Behroz Sikander <be...@gmail.com> on 2019/05/28 12:45:30 UTC

Client timeout property

Hello,
I have a client application that uses the HDFS API to push data. My HDFS is
set up in HA mode.
My client application has certain timeout requirements, e.g. an upload
shouldn't take longer than 10 seconds; otherwise the operation is considered
a failure.

Sometimes I notice the following exceptions in my client logs:

2019-05-23 14:13:31,356 INFO Thread-11 org.apache.hadoop.hdfs.DFSClient []: Exception in createBlockOutputStream
java.io.IOException: Got error, status message , ack with firstBadLink as xx.xx.xx.xx:50010
        at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:142)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1359)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2019-05-23 14:13:31,357 INFO Thread-11 org.apache.hadoop.hdfs.DFSClient []: Abandoning BP-1672040070-127.0.0.1-1527078068582:blk_1073742599_1775
2019-05-23 14:13:31,384 INFO Thread-11 org.apache.hadoop.hdfs.DFSClient []: Excluding datanode DatanodeInfoWithStorage[xx.xx.xx.xx:50010,DS-4be6740a-bad8-438b-99d6-fbc50d7760dd,DISK]


When this happens the upload still succeeds, but it takes around *20 seconds*.

I am interested in the configuration property that controls this timeout
so that I can adjust it to my client's requirements. Alternatively, is
there a configuration that makes the HDFS API throw an exception if an
operation does not complete within X seconds? A sketch of what I mean
follows below.
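Right now the only approach I can think of is to enforce the deadline
myself on the client side, roughly like the sketch below. The cluster URI,
the paths, and the executor wrapper are just placeholders for illustration;
I am not claiming the HDFS API offers this out of the box.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.net.URI;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class DeadlineUploadSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();

        // Run the upload in a worker thread so the caller can give up after 10 seconds.
        Callable<Void> upload = () -> {
            try (FileSystem fs = FileSystem.get(URI.create("hdfs://mycluster"), new Configuration())) {
                fs.copyFromLocalFile(new Path("/tmp/data.bin"), new Path("/data/data.bin"));
            }
            return null;
        };

        Future<Void> result = pool.submit(upload);
        try {
            result.get(10, TimeUnit.SECONDS);   // my 10-second budget
        } catch (TimeoutException e) {
            // Interrupting does not necessarily stop the DFSClient's internal retries,
            // but from the application's point of view the operation has failed.
            result.cancel(true);
            throw e;
        } finally {
            pool.shutdownNow();
        }
    }
}

Obviously I would prefer a supported configuration property over this kind
of wrapper.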

Could the relevant property be ipc.client.connect.timeout?
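For completeness, this is roughly where I would set such a property on the
client Configuration. The properties below are only candidates I have seen
mentioned for client-side timeouts, and the values are guesses; I am not
sure which of them, if any, actually controls the delay I see around
createBlockOutputStream.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.net.URI;

public class TimeoutConfigSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Candidate knobs -- placeholder values, not recommendations.
        conf.setInt("ipc.client.connect.timeout", 5000);               // RPC connect timeout in ms (default 20000)
        conf.setInt("ipc.client.connect.max.retries.on.timeouts", 3);  // fewer retries after a connect timeout
        conf.setInt("dfs.client.socket-timeout", 5000);                // DataNode socket timeout in ms
        conf.setInt("dfs.client.block.write.retries", 1);              // give up on a bad write pipeline sooner

        try (FileSystem fs = FileSystem.get(URI.create("hdfs://mycluster"), conf)) {
            fs.copyFromLocalFile(new Path("/tmp/data.bin"), new Path("/data/data.bin"));
        }
    }
}

If someone can confirm which of these (or another property) governs the
pipeline setup failure shown in the logs above, that would help a lot.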


Thanks.

Behroz

Re: Client timeout property

Posted by Behroz Sikander <be...@gmail.com>.
Can anyone please help me here?
