Posted to dev@flume.apache.org by "ASF subversion and git services (JIRA)" <ji...@apache.org> on 2014/04/29 02:20:26 UTC

[jira] [Commented] (FLUME-2357) HDFS sink should retry closing files that previously had close errors

    [ https://issues.apache.org/jira/browse/FLUME-2357?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13983777#comment-13983777 ] 

ASF subversion and git services commented on FLUME-2357:
--------------------------------------------------------

Commit a94594dd2c5cb980bc6f82b1fa606a922986569e in flume's branch refs/heads/trunk from [~mpercy]
[ https://git-wip-us.apache.org/repos/asf?p=flume.git;h=a94594d ]

FLUME-2357. HDFS sink should retry closing files that previously had close errors

(Hari Shreedharan via Mike Percy)
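
For reference, the fix as released surfaces this retry behaviour through HDFS
sink configuration. The snippet below is a sketch of how an agent might enable
it, assuming the hdfs.closeTries and hdfs.retryInterval property names and the
hypothetical agent/sink names a1/k1:

    # Retry closing files whose close previously failed, rather than
    # leaving them open until the agent is restarted.
    a1.sinks.k1.type = hdfs
    a1.sinks.k1.hdfs.path = /flume/events
    # Number of close attempts (0 would mean keep retrying)
    a1.sinks.k1.hdfs.closeTries = 3
    # Seconds to wait between consecutive close attempts
    a1.sinks.k1.hdfs.retryInterval = 180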


> HDFS sink should retry closing files that previously had close errors
> ---------------------------------------------------------------------
>
>                 Key: FLUME-2357
>                 URL: https://issues.apache.org/jira/browse/FLUME-2357
>             Project: Flume
>          Issue Type: Bug
>          Components: Sinks+Sources
>    Affects Versions: v1.4.0
>            Reporter: Patrick Dvorak
>            Assignee: Hari Shreedharan
>         Attachments: FLUME-2357-1.patch, FLUME-2357-2.patch, FLUME-2357-3.patch, FLUME-2357.patch
>
>
> When the AbstractHDFSWriter fails to close a file (due to exceeding the callTimeout or other HDFS issues), it leaves the file open and never tries again.  The only way to close the open files is to restart the Flume agent.  There should be a configurable option that allows the sink to retry closing files that previously failed to close.
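
A minimal sketch of the idea described above (not Flume's actual
implementation): if close() fails, reschedule the attempt on a timer instead
of abandoning the file. CloseableFile, MAX_CLOSE_TRIES, and
RETRY_INTERVAL_SECONDS are hypothetical names introduced for illustration.

    import java.io.IOException;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicInteger;

    public class RetryingCloser {
        private static final int MAX_CLOSE_TRIES = 5;           // hypothetical cap
        private static final long RETRY_INTERVAL_SECONDS = 180; // hypothetical delay

        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();

        /** Anything whose close() can fail transiently, e.g. an HDFS writer. */
        public interface CloseableFile {
            void close() throws IOException;
        }

        public void closeWithRetries(CloseableFile file) {
            attemptClose(file, new AtomicInteger(0));
        }

        private void attemptClose(CloseableFile file, AtomicInteger tries) {
            try {
                file.close();
            } catch (IOException e) {
                if (tries.incrementAndGet() < MAX_CLOSE_TRIES) {
                    // The close failed (e.g. callTimeout exceeded); schedule
                    // another attempt instead of leaving the file open until
                    // the agent restarts.
                    scheduler.schedule(() -> attemptClose(file, tries),
                            RETRY_INTERVAL_SECONDS, TimeUnit.SECONDS);
                }
            }
        }
    }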



--
This message was sent by Atlassian JIRA
(v6.2#6252)