Posted to dev@flume.apache.org by "Patrick Dvorak (JIRA)" <ji...@apache.org> on 2014/04/09 23:51:15 UTC

[jira] [Created] (FLUME-2357) HDFS sink should retry closing files that previously had close errors

Patrick Dvorak created FLUME-2357:
-------------------------------------

             Summary: HDFS sink should retry closing files that previously had close errors
                 Key: FLUME-2357
                 URL: https://issues.apache.org/jira/browse/FLUME-2357
             Project: Flume
          Issue Type: Bug
          Components: Sinks+Sources
    Affects Versions: v1.4.0
            Reporter: Patrick Dvorak


When the AbstractHDFSWriter fails to close a file (due to exceeding the callTimeout or other HDFS issues), it leaves the file open and never tries again.  The only way to close the open files is to restart the Flume agent.  There should be a configurable option to allow the sink to retry closing files that previously failed to close.
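
A minimal sketch of the proposed behaviour, assuming a bounded number of close attempts and a retry interval; the names used here (closeWithRetries, maxTries, retryIntervalMillis) are illustrative only and are not the actual AbstractHDFSWriter API or sink configuration keys:

    import java.io.Closeable;
    import java.io.IOException;

    public class CloseRetrySketch {

        // Tries to close the stream up to maxTries times, sleeping
        // retryIntervalMillis between failed attempts, instead of leaving
        // the file open forever after the first close error.
        static boolean closeWithRetries(Closeable stream, int maxTries,
                                        long retryIntervalMillis)
                throws InterruptedException {
            for (int attempt = 1; attempt <= maxTries; attempt++) {
                try {
                    stream.close();
                    return true;              // closed successfully
                } catch (IOException e) {
                    if (attempt == maxTries) {
                        return false;         // give up; caller may reschedule
                    }
                    Thread.sleep(retryIntervalMillis);
                }
            }
            return false;
        }
    }

With maxTries exposed as a sink property, operators could choose between the current behaviour (a single attempt) and repeated retries without restarting the agent.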




--
This message was sent by Atlassian JIRA
(v6.2#6252)