Posted to dev@flume.apache.org by "Erik Jutemar (JIRA)" <ji...@apache.org> on 2011/08/10 18:03:27 UTC

[jira] [Created] (FLUME-735) Using diskFailover with roll yields unexpected behavior

Using diskFailover with roll yields unexpected behavior
-------------------------------------------------------

                 Key: FLUME-735
                 URL: https://issues.apache.org/jira/browse/FLUME-735
             Project: Flume
          Issue Type: Bug
          Components: Sinks+Sources
    Affects Versions: v0.9.3
         Environment: Flume 0.9.3 on Debian Lenny
            Reporter: Erik Jutemar


Using diskFailover together with roll, with the config < roll(1500) { escapedCustomDfs("hdfs") } ? diskFailover mask("rolltag") insistentAppend stubbornAppend insistentOpen roll(1500) { escapedCustomDfs("hdfs") } >, correctly saves data to disk while hdfs is unavailable. However, once hdfs comes back, the buffered data is never sent.
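One plausible reading of the "Event already had an event with attribute rolltag" lines in the log below: the inner roll sink tags each event with a rolltag before the append, the append to hdfs fails, and the retry decorators re-send the same event object, which still carries the tag from the failed attempt, so every retry is rejected (mask("rolltag") sits outside the retry loop and only strips the tag once on entry). A minimal sketch of that failure mode, assuming hypothetical names (Event, roll_sink_append, insistent_append) that are not Flume's actual API:

```python
# Hypothetical simulation of the retry behavior seen in the log below.
# Names and signatures are illustrative only, not Flume's real classes.

class Event:
    def __init__(self):
        self.attrs = {}

def roll_sink_append(event, hdfs_up):
    # A roll sink tags each event with a unique "rolltag" before handing
    # it to the wrapped sink, and refuses events that are already tagged.
    if "rolltag" in event.attrs:
        raise ValueError("Event already had an event with attribute rolltag")
    event.attrs["rolltag"] = "log.00000014"   # the tag stays on the event
    if not hdfs_up:
        raise OSError("Connection refused")   # append to hdfs fails

def insistent_append(event, attempts=3):
    # Retries the append, but passes the *same* event object, which still
    # carries the rolltag added during the failed first attempt.
    for i in range(attempts):
        try:
            roll_sink_append(event, hdfs_up=False)
            return True
        except Exception as e:
            print(f"append attempt {i} failed: {e}")
    return False

insistent_append(Event())
# attempt 0 fails with "Connection refused";
# attempts 1+ fail with the rolltag error, matching the log.
```

If this reading is right, the tag would need to be stripped (or re-masked) on every retry, not just once before the retry decorators.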

Scenario:
1. Continuously send data.
2. Shut down hdfs.
3. Bring hdfs back up.
4. Stop sending data.
5. Wait a while.
6. Shut down Flume.

This generates the following log:


2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: double close 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:57:42,557 INFO com.cloudera.flume.handlers.rolling.RollSink: Created RollSink: trigger=[TimeTrigger: maxAge=10000 tagger=com.cloudera.flume.handlers.rolling.ProcessTagger@da0225b] checkPeriodMs = 250 spec='NaiveFileFailover'
2011-08-10 16:57:42,558 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'NaiveFileFailover'
2011-08-10 16:57:42,561 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
2011-08-10 16:57:42,591 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
2011-08-10 16:57:42,652 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
2011-08-10 16:57:42,655 INFO com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 0e7005136e4160ed4cc157c4ddd7f4f1c6e11ffa]
2011-08-10 16:57:43,152 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
2011-08-10 16:57:43,890 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
2011-08-10 16:57:43,891 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
2011-08-10 16:57:43,891 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
2011-08-10 16:57:44,379 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
2011-08-10 16:57:44,389 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
2011-08-10 16:57:46,380 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
2011-08-10 16:57:46,389 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
2011-08-10 16:57:52,636 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
2011-08-10 16:57:52,636 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
2011-08-10 16:57:52,637 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
2011-08-10 16:57:52,638 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000010.20110810-165742560+0200.445115919474417.seq
2011-08-10 16:57:52,712 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@11c7865b
2011-08-10 16:58:02,642 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
2011-08-10 16:58:02,642 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
2011-08-10 16:58:02,642 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
2011-08-10 16:58:02,643 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165752637+0200.445125995972908.seq
2011-08-10 16:58:02,678 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@67071c84
2011-08-10 16:58:12,646 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
2011-08-10 16:58:12,646 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
2011-08-10 16:58:12,647 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
2011-08-10 16:58:12,648 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165802642+0200.445136001273694.seq
2011-08-10 16:58:12,678 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@b3e15f7
2011-08-10 16:58:15,905 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165815689+0200.445149048504058.seq
2011-08-10 16:58:16,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
2011-08-10 16:58:17,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
2011-08-10 16:58:18,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
2011-08-10 16:58:19,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
2011-08-10 16:58:20,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
2011-08-10 16:58:21,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
2011-08-10 16:58:22,651 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
2011-08-10 16:58:22,651 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
2011-08-10 16:58:22,652 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
2011-08-10 16:58:22,652 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165812647+0200.445146005897644.seq
2011-08-10 16:58:22,676 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@4c6504bc
2011-08-10 16:58:22,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
2011-08-10 16:58:23,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
2011-08-10 16:58:24,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
2011-08-10 16:58:25,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
2011-08-10 16:58:25,918 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
2011-08-10 16:58:27,908 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:27,909 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:58:27,909 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:27,914 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165827910+0200.445161269239691.seq
2011-08-10 16:58:28,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
2011-08-10 16:58:29,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
2011-08-10 16:58:30,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
2011-08-10 16:58:31,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
2011-08-10 16:58:32,657 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
2011-08-10 16:58:32,657 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
2011-08-10 16:58:32,658 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
2011-08-10 16:58:32,658 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165822652+0200.445156010693063.seq
2011-08-10 16:58:32,683 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:32,684 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
2011-08-10 16:58:32,685 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000016.20110810-165832683+0200.445166041918942.seq
2011-08-10 16:58:32,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
2011-08-10 16:58:33,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
2011-08-10 16:58:34,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
2011-08-10 16:58:35,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
2011-08-10 16:58:36,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
2011-08-10 16:58:37,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
2011-08-10 16:58:37,928 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
2011-08-10 16:58:38,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
2011-08-10 16:58:39,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
2011-08-10 16:58:40,911 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:40,912 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:58:40,912 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:40,915 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165840912+0200.445174271130316.seq
2011-08-10 16:58:40,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
2011-08-10 16:58:41,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
2011-08-10 16:58:42,664 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
2011-08-10 16:58:42,664 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
2011-08-10 16:58:42,664 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
2011-08-10 16:58:42,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
2011-08-10 16:58:43,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
2011-08-10 16:58:44,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
2011-08-10 16:58:45,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
2011-08-10 16:58:46,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
2011-08-10 16:58:47,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
2011-08-10 16:58:47,940 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
2011-08-10 16:58:47,940 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:47,941 INFO com.cloudera.flume.handlers.rolling.RollSink: TriggerThread shutdown
2011-08-10 16:58:47,941 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:47,943 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
2011-08-10 16:58:47,944 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 0 failed, backoff (1000ms): Event already had an event with attribute rolltag
2011-08-10 16:58:48,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
2011-08-10 16:58:48,944 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
2011-08-10 16:58:48,944 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:48,944 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:58:48,946 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:48,947 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
2011-08-10 16:58:48,947 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 1 failed, backoff (2000ms): Event already had an event with attribute rolltag
2011-08-10 16:58:49,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
2011-08-10 16:58:50,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
2011-08-10 16:58:50,948 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
2011-08-10 16:58:50,948 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:50,948 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:58:50,966 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:50,967 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
2011-08-10 16:58:50,967 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 2 failed, backoff (4000ms): Event already had an event with attribute rolltag
2011-08-10 16:58:51,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
2011-08-10 16:58:52,668 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
2011-08-10 16:58:52,668 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
2011-08-10 16:58:52,669 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
2011-08-10 16:58:52,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
2011-08-10 16:58:53,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
2011-08-10 16:58:54,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
2011-08-10 16:58:54,968 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
2011-08-10 16:58:54,968 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:54,968 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:58:54,969 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:58:54,970 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
2011-08-10 16:58:54,970 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 3 failed, backoff (8000ms): Event already had an event with attribute rolltag
2011-08-10 16:58:55,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
2011-08-10 16:58:56,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
2011-08-10 16:58:57,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
2011-08-10 16:58:57,955 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
2011-08-10 16:59:02,675 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
2011-08-10 16:59:02,675 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
2011-08-10 16:59:02,675 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
2011-08-10 16:59:02,916 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:02,916 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:59:02,917 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:02,919 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165902917+0200.445196275742099.seq
2011-08-10 16:59:02,970 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
2011-08-10 16:59:02,970 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:02,971 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:59:02,971 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:02,972 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
2011-08-10 16:59:02,972 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 4 failed, backoff (16000ms): Event already had an event with attribute rolltag
2011-08-10 16:59:03,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
2011-08-10 16:59:04,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
2011-08-10 16:59:05,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
2011-08-10 16:59:06,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
2011-08-10 16:59:07,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
2011-08-10 16:59:08,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
2011-08-10 16:59:09,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
2011-08-10 16:59:10,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
2011-08-10 16:59:11,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
2011-08-10 16:59:12,679 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
2011-08-10 16:59:12,679 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
2011-08-10 16:59:12,680 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
2011-08-10 16:59:12,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
2011-08-10 16:59:12,930 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:18,973 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:18,975 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
2011-08-10 16:59:18,976 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 5 failed, backoff (32000ms): Event already had an event with attribute rolltag
2011-08-10 16:59:21,921 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:21,922 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:59:21,922 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:21,968 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165921922+0200.445215281259212.seq
2011-08-10 16:59:22,719 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
2011-08-10 16:59:22,719 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
2011-08-10 16:59:22,720 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
2011-08-10 16:59:22,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
2011-08-10 16:59:23,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
2011-08-10 16:59:24,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
2011-08-10 16:59:25,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
2011-08-10 16:59:26,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
2011-08-10 16:59:27,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
2011-08-10 16:59:28,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
2011-08-10 16:59:29,304 INFO com.cloudera.flume.core.BackOffFailOverSink: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create file/path/2011-08-10-1700/part-log.00000010.20110810-165921922+0200.445215281259212.seq. Name node is in safe mode.
The ratio of reported blocks 0.0000 has not reached the threshold 0.9990. Safe mode will be turned off automatically.
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1019)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:989)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:377)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:961)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:957)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:955)

2011-08-10 16:59:32,726 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
2011-08-10 16:59:32,726 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
2011-08-10 16:59:32,727 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
2011-08-10 16:59:42,732 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
2011-08-10 16:59:42,732 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
2011-08-10 16:59:42,732 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
2011-08-10 16:59:45,926 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:45,927 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:59:45,927 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:45,929 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
2011-08-10 16:59:45,964 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
2011-08-10 16:59:47,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
2011-08-10 16:59:47,938 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
2011-08-10 16:59:49,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
2011-08-10 16:59:49,934 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
2011-08-10 16:59:50,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
2011-08-10 16:59:50,933 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:50,976 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
2011-08-10 16:59:50,978 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
2011-08-10 16:59:50,978 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 6 failed, backoff (60000ms): Event already had an event with attribute rolltag
2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
2011-08-10 16:59:52,737 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
2011-08-10 16:59:52,737 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
2011-08-10 16:59:52,737 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
2011-08-10 17:00:02,741 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
2011-08-10 17:00:02,741 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
2011-08-10 17:00:02,742 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
2011-08-10 17:00:12,745 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
2011-08-10 17:00:12,745 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
2011-08-10 17:00:12,746 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
2011-08-10 17:00:22,768 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
2011-08-10 17:00:22,769 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
2011-08-10 17:00:22,769 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
2011-08-10 17:00:32,774 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
2011-08-10 17:00:32,774 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
2011-08-10 17:00:32,774 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
2011-08-10 17:00:42,779 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
2011-08-10 17:00:42,779 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
2011-08-10 17:00:42,780 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170042779+0200.445296138546943.seq
2011-08-10 17:00:45,294 INFO com.cloudera.flume.watchdog.Watchdog: Watchdog shutdown hook
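The InsistentAppendDecorator lines in the log (attempt 0 backs off 1000 ms, attempt 1 backs off 2000 ms, attempt 6 already reports 60000 ms) are consistent with a doubling backoff capped at 60 s; since the rolltag collision never resolves, the decorator just keeps retrying at the cap instead of ever draining the failover data. A minimal sketch of that schedule, assuming doubling-with-cap (the function and parameter names here are illustrative, not Flume's):

```python
def backoff_ms(attempt, initial=1000, cap=60000):
    """Exponential backoff: initial * 2**attempt, capped at `cap`.

    Matches the log: attempt 0 -> 1000 ms, attempt 1 -> 2000 ms,
    and by attempt 6 the 60000 ms cap has been reached.
    """
    return min(initial * (2 ** attempt), cap)

print([backoff_ms(a) for a in range(7)])
# [1000, 2000, 4000, 8000, 16000, 32000, 60000]
```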

This run leaves the following files in the dfo_* directories:
ls -lah *
dfo_error:
total 8.0K
drwxr-xr-x 2 user nogroup 4.0K 2011-07-18 11:11 .
drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..

dfo_import:
total 8.0K
drwxr-xr-x 2 user nogroup 4.0K 2011-07-18 11:11 .
drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..

dfo_logged:
total 120K
drwxr-xr-x 2 user nogroup  12K 2011-08-10 17:00 .
drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
-rw-r--r-- 1 user nogroup  12K 2011-08-10 16:58 log.00000015.20110810-165832658+0200.445166016694026.seq
-rw-r--r-- 1 user nogroup  125 2011-08-10 16:58 log.00000015.20110810-165842664+0200.445176023166287.seq
-rw-r--r-- 1 user nogroup  19K 2011-08-10 16:59 log.00000015.20110810-165852669+0200.445186027755160.seq
-rw-r--r-- 1 user nogroup  125 2011-08-10 16:59 log.00000015.20110810-165902675+0200.445196034139189.seq
-rw-r--r-- 1 user nogroup  17K 2011-08-10 16:59 log.00000015.20110810-165912680+0200.445206038676079.seq
-rw-r--r-- 1 user nogroup 9.6K 2011-08-10 16:59 log.00000015.20110810-165922719+0200.445216078556307.seq
-rw-r--r-- 1 user nogroup 8.8K 2011-08-10 16:59 log.00000015.20110810-165932726+0200.445226085601122.seq
-rw-r--r-- 1 user nogroup 2.8K 2011-08-10 16:59 log.00000015.20110810-165942732+0200.445236091341764.seq
-rw-r--r-- 1 user nogroup  125 2011-08-10 16:59 log.00000015.20110810-165952737+0200.445246096329911.seq
-rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170002741+0200.445256100500909.seq
-rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170012745+0200.445266104587375.seq
-rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170022769+0200.445276127976521.seq

dfo_sending:
total 20K
drwxr-xr-x 2 user nogroup 4.0K 2011-08-10 16:58 .
drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
-rw-r--r-- 1 user nogroup  11K 2011-08-10 16:58 log.00000015.20110810-165822652+0200.445156010693063.seq

dfo_writing:
total 12K
drwxr-xr-x 2 user nogroup 4.0K 2011-08-10 17:00 .
drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
-rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170032774+0200.445286133232984.seq
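The listings above show the stranded events sitting in dfo_logged (plus one file stuck in dfo_sending), with nothing ever reaching an error state. A small helper like the following (illustrative, not part of Flume; tally_seqfiles is a made-up name) can be pointed at the agent's journal directory, e.g. /tmp/var/flume/journal/flume/AGG, to quantify what is left behind:

```python
import glob
import os

def tally_seqfiles(journal_dir="."):
    """Return {state_dir: (file_count, total_bytes)} for each dfo_* directory."""
    states = ("dfo_error", "dfo_import", "dfo_logged", "dfo_sending", "dfo_writing")
    result = {}
    for d in states:
        # glob returns [] for directories that do not exist
        files = glob.glob(os.path.join(journal_dir, d, "*.seq"))
        result[d] = (len(files), sum(os.path.getsize(f) for f in files))
    return result
```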

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira

        

[jira] [Reopened] (FLUME-735) Using diskFailover with roll yields unexpected behavior

Posted by "Alexander Lorenz-Alten (Reopened) (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/FLUME-735?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alexander Lorenz-Alten reopened FLUME-735:
------------------------------------------

    
> Using diskFailover with roll yields unexpected behavior
> -------------------------------------------------------
>
>                 Key: FLUME-735
>                 URL: https://issues.apache.org/jira/browse/FLUME-735
>             Project: Flume
>          Issue Type: Bug
>          Components: Sinks+Sources
>    Affects Versions: v0.9.3
>         Environment: Flume 0.9.3 on Debian Lenny
>            Reporter: Erik Jutemar
>
> Trying to use diskFailover and roll with the config < roll(1500) { escapedCustomDfs("hdfs") } ? diskFailover mask("rolltag") insistentAppend stubbornAppend insistentOpen roll(1500) { escapedCustomDfs("hdfs") } > saves data to disk if hdfs is unavailable. However when hdfs comes back, the data is never sent.
> Scenario:
> Continuously sending data.
> Shutting down hdfs.
> Enable hdfs.
> Stop sending data.
> Wait a while.
> Shut down flume.
> This generates the following log:
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: double close 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,557 INFO com.cloudera.flume.handlers.rolling.RollSink: Created RollSink: trigger=[TimeTrigger: maxAge=10000 tagger=com.cloudera.flume.handlers.rolling.ProcessTagger@da0225b] checkPeriodMs = 250 spec='NaiveFileFailover'
> 2011-08-10 16:57:42,558 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'NaiveFileFailover'
> 2011-08-10 16:57:42,561 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:42,591 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:42,652 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
> 2011-08-10 16:57:42,655 INFO com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 0e7005136e4160ed4cc157c4ddd7f4f1c6e11ffa]
> 2011-08-10 16:57:43,152 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,890 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,891 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,891 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:44,379 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:44,389 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:46,380 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:46,389 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:52,636 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,636 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,637 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:57:52,638 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,712 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@11c7865b
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:02,643 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,678 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@67071c84
> 2011-08-10 16:58:12,646 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,646 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,647 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:12,648 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,678 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@b3e15f7
> 2011-08-10 16:58:15,905 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165815689+0200.445149048504058.seq
> 2011-08-10 16:58:16,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:17,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:18,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:19,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:20,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:21,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:22,651 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,651 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,652 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:22,652 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,676 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@4c6504bc
> 2011-08-10 16:58:22,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:23,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:24,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:25,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:25,918 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:27,908 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:27,909 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:27,909 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:27,914 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165827910+0200.445161269239691.seq
> 2011-08-10 16:58:28,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:29,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:30,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:31,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:32,657 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,657 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,658 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:32,658 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,683 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:32,684 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:32,685 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000016.20110810-165832683+0200.445166041918942.seq
> 2011-08-10 16:58:32,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:33,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:34,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:35,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:36,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:37,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:37,928 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:38,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:39,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:40,911 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:40,912 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:40,912 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:40,915 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165840912+0200.445174271130316.seq
> 2011-08-10 16:58:40,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:41,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:42,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:43,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:44,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:45,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:46,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:47,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:47,940 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:47,940 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:47,941 INFO com.cloudera.flume.handlers.rolling.RollSink: TriggerThread shutdown
> 2011-08-10 16:58:47,941 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:47,943 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:47,944 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 0 failed, backoff (1000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:48,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:48,944 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:48,944 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:48,944 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:48,946 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:48,947 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:48,947 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 1 failed, backoff (2000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:49,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:50,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:50,948 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:50,948 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:50,948 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:50,966 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:50,967 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:50,967 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 2 failed, backoff (4000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:51,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:52,668 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:52,668 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:52,669 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:58:52,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:53,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:54,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:54,968 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:54,968 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:54,968 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:54,969 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:54,970 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:54,970 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 3 failed, backoff (8000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:55,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:56,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:57,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:57,955 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:02,916 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,916 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:02,917 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,919 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165902917+0200.445196275742099.seq
> 2011-08-10 16:59:02,970 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:02,970 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,971 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:02,971 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,972 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:02,972 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 4 failed, backoff (16000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:03,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:59:04,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:59:05,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:59:06,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:59:07,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:59:08,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:59:09,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:59:10,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:59:11,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:59:12,679 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:12,679 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:12,680 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:12,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:59:12,930 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:18,973 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:18,975 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:18,976 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 5 failed, backoff (32000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:21,921 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:21,922 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:21,922 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:21,968 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165921922+0200.445215281259212.seq
> 2011-08-10 16:59:22,719 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:22,719 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:22,720 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:22,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:59:23,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:59:24,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:59:25,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:59:26,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:59:27,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:59:28,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:59:29,304 INFO com.cloudera.flume.core.BackOffFailOverSink: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create file/path/2011-08-10-1700/part-log.00000010.20110810-165921922+0200.445215281259212.seq. Name node is in safe mode.
> The ratio of reported blocks 0.0000 has not reached the threshold 0.9990. Safe mode will be turned off automatically.
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1019)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:989)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:377)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:961)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:957)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:955)
> 2011-08-10 16:59:32,726 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:32,726 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:32,727 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:45,926 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:45,927 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:45,927 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:45,929 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:45,964 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:47,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:47,938 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:49,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:49,934 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:50,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:50,933 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:50,976 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:50,978 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:50,978 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 6 failed, backoff (60000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,741 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,741 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,742 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,745 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,745 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,746 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,768 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,769 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,769 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,779 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,779 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,780 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170042779+0200.445296138546943.seq
> 2011-08-10 17:00:45,294 INFO com.cloudera.flume.watchdog.Watchdog: Watchdog shutdown hook
> This results in the following files being written to the dfo_* directories:
> ls -lah *
> dfo_error:
> total 8.0K
> drwxr-xr-x 2 user nogroup 4.0K 2011-07-18 11:11 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> dfo_import:
> total 8.0K
> drwxr-xr-x 2 user nogroup 4.0K 2011-07-18 11:11 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> dfo_logged:
> total 120K
> drwxr-xr-x 2 user nogroup  12K 2011-08-10 17:00 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  12K 2011-08-10 16:58 log.00000015.20110810-165832658+0200.445166016694026.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:58 log.00000015.20110810-165842664+0200.445176023166287.seq
> -rw-r--r-- 1 user nogroup  19K 2011-08-10 16:59 log.00000015.20110810-165852669+0200.445186027755160.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:59 log.00000015.20110810-165902675+0200.445196034139189.seq
> -rw-r--r-- 1 user nogroup  17K 2011-08-10 16:59 log.00000015.20110810-165912680+0200.445206038676079.seq
> -rw-r--r-- 1 user nogroup 9.6K 2011-08-10 16:59 log.00000015.20110810-165922719+0200.445216078556307.seq
> -rw-r--r-- 1 user nogroup 8.8K 2011-08-10 16:59 log.00000015.20110810-165932726+0200.445226085601122.seq
> -rw-r--r-- 1 user nogroup 2.8K 2011-08-10 16:59 log.00000015.20110810-165942732+0200.445236091341764.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:59 log.00000015.20110810-165952737+0200.445246096329911.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170002741+0200.445256100500909.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170012745+0200.445266104587375.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170022769+0200.445276127976521.seq
> dfo_sending:
> total 20K
> drwxr-xr-x 2 user nogroup 4.0K 2011-08-10 16:58 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  11K 2011-08-10 16:58 log.00000015.20110810-165822652+0200.445156010693063.seq
> dfo_writing:
> total 12K
> drwxr-xr-x 2 user nogroup 4.0K 2011-08-10 17:00 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170032774+0200.445286133232984.seq
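For reference, the InsistentAppendDecorator delays in the log above (1000, 2000, 4000, 8000, 16000, 32000, then 60000 ms) follow a capped exponential-doubling schedule. A minimal sketch of that schedule; the 60000 ms cap is an assumption inferred from the "backoff (60000ms)" line at attempt 6, not taken from Flume's source:

```python
# Capped exponential backoff, matching the InsistentAppendDecorator delays
# observed in the log: 1000, 2000, 4000, 8000, 16000, 32000, 60000 ms.
# The 60000 ms cap is inferred from attempt 6's "backoff (60000ms)" log line.
def backoff_schedule(initial_ms=1000, cap_ms=60000, attempts=7):
    delays = []
    delay = initial_ms
    for _ in range(attempts):
        delays.append(min(delay, cap_ms))
        delay *= 2  # double the delay after each failed append attempt
    return delays

print(backoff_schedule())  # [1000, 2000, 4000, 8000, 16000, 32000, 60000]
```

Because the rolltag collision fails every attempt, the decorator walks this whole schedule without ever draining the failover queue, which matches the files accumulating in dfo_logged above.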

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira

        

[jira] [Resolved] (FLUME-735) Using diskFailover with roll yields unexpected behavior

Posted by "Alexander Lorenz-Alten (Resolved) (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/FLUME-735?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alexander Lorenz-Alten resolved FLUME-735.
------------------------------------------

    Resolution: Won't Fix
    
> Using diskFailover with roll yields unexpected behavior
> -------------------------------------------------------
>
>                 Key: FLUME-735
>                 URL: https://issues.apache.org/jira/browse/FLUME-735
>             Project: Flume
>          Issue Type: Bug
>          Components: Sinks+Sources
>    Affects Versions: v0.9.3
>         Environment: Flume 0.9.3 on Debian Lenny
>            Reporter: Erik Jutemar
>
> Using diskFailover together with roll, with the config < roll(1500) { escapedCustomDfs("hdfs") } ? diskFailover mask("rolltag") insistentAppend stubbornAppend insistentOpen roll(1500) { escapedCustomDfs("hdfs") } >, saves data to disk while hdfs is unavailable. However, when hdfs comes back, the data is never sent.
> Scenario:
> 1. Send data continuously.
> 2. Shut down hdfs.
> 3. Re-enable hdfs.
> 4. Stop sending data.
> 5. Wait a while.
> 6. Shut down flume.
> This generates the following log:
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: double close 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,557 INFO com.cloudera.flume.handlers.rolling.RollSink: Created RollSink: trigger=[TimeTrigger: maxAge=10000 tagger=com.cloudera.flume.handlers.rolling.ProcessTagger@da0225b] checkPeriodMs = 250 spec='NaiveFileFailover'
> 2011-08-10 16:57:42,558 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'NaiveFileFailover'
> 2011-08-10 16:57:42,561 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:42,591 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:42,652 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
> 2011-08-10 16:57:42,655 INFO com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 0e7005136e4160ed4cc157c4ddd7f4f1c6e11ffa]
> 2011-08-10 16:57:43,152 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,890 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,891 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,891 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:44,379 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:44,389 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:46,380 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:46,389 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:52,636 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,636 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,637 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:57:52,638 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,712 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@11c7865b
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:02,643 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,678 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@67071c84
> 2011-08-10 16:58:12,646 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,646 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,647 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:12,648 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,678 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@b3e15f7
> 2011-08-10 16:58:15,905 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165815689+0200.445149048504058.seq
> 2011-08-10 16:58:16,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:17,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:18,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:19,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:20,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:21,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:22,651 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,651 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,652 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:22,652 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,676 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@4c6504bc
> 2011-08-10 16:58:22,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:23,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:24,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:25,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:25,918 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:27,908 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:27,909 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:27,909 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:27,914 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165827910+0200.445161269239691.seq
> 2011-08-10 16:58:28,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:29,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:30,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:31,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:32,657 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,657 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,658 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:32,658 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,683 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:32,684 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:32,685 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000016.20110810-165832683+0200.445166041918942.seq
> 2011-08-10 16:58:32,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:33,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:34,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:35,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:36,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:37,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:37,928 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:38,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:39,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:40,911 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:40,912 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:40,912 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:40,915 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165840912+0200.445174271130316.seq
> 2011-08-10 16:58:40,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:41,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:42,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:43,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:44,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:45,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:46,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:47,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:47,940 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:47,940 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:47,941 INFO com.cloudera.flume.handlers.rolling.RollSink: TriggerThread shutdown
> 2011-08-10 16:58:47,941 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:47,943 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:47,944 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 0 failed, backoff (1000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:48,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:48,944 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:48,944 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:48,944 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:48,946 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:48,947 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:48,947 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 1 failed, backoff (2000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:49,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:50,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:50,948 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:50,948 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:50,948 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:50,966 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:50,967 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:50,967 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 2 failed, backoff (4000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:51,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:52,668 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:52,668 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:52,669 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:58:52,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:53,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:54,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:54,968 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:54,968 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:54,968 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:54,969 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:54,970 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:54,970 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 3 failed, backoff (8000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:55,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:56,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:57,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:57,955 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:02,916 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,916 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:02,917 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,919 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165902917+0200.445196275742099.seq
> 2011-08-10 16:59:02,970 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:02,970 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,971 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:02,971 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,972 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:02,972 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 4 failed, backoff (16000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:03,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:59:04,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:59:05,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:59:06,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:59:07,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:59:08,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:59:09,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:59:10,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:59:11,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:59:12,679 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:12,679 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:12,680 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:12,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:59:12,930 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:18,973 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:18,975 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:18,976 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 5 failed, backoff (32000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:21,921 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:21,922 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:21,922 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:21,968 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165921922+0200.445215281259212.seq
> 2011-08-10 16:59:22,719 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:22,719 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:22,720 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:22,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:59:23,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:59:24,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:59:25,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:59:26,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:59:27,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:59:28,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:59:29,304 INFO com.cloudera.flume.core.BackOffFailOverSink: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create file/path/2011-08-10-1700/part-log.00000010.20110810-165921922+0200.445215281259212.seq. Name node is in safe mode.
> The ratio of reported blocks 0.0000 has not reached the threshold 0.9990. Safe mode will be turned off automatically.
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1019)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:989)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:377)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:961)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:957)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:955)
> 2011-08-10 16:59:32,726 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:32,726 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:32,727 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:45,926 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:45,927 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:45,927 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:45,929 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:45,964 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:47,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:47,938 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:49,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:49,934 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:50,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:50,933 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:50,976 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:50,978 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:50,978 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 6 failed, backoff (60000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,741 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,741 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,742 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,745 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,745 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,746 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,768 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,769 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,769 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,779 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,779 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,780 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170042779+0200.445296138546943.seq
> 2011-08-10 17:00:45,294 INFO com.cloudera.flume.watchdog.Watchdog: Watchdog shutdown hook
> This results in the following files being written to the dfo_* directories:
> ls -lah *
> dfo_error:
> total 8.0K
> drwxr-xr-x 2 user nogroup 4.0K 2011-07-18 11:11 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> dfo_import:
> total 8.0K
> drwxr-xr-x 2 user nogroup 4.0K 2011-07-18 11:11 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> dfo_logged:
> total 120K
> drwxr-xr-x 2 user nogroup  12K 2011-08-10 17:00 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  12K 2011-08-10 16:58 log.00000015.20110810-165832658+0200.445166016694026.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:58 log.00000015.20110810-165842664+0200.445176023166287.seq
> -rw-r--r-- 1 user nogroup  19K 2011-08-10 16:59 log.00000015.20110810-165852669+0200.445186027755160.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:59 log.00000015.20110810-165902675+0200.445196034139189.seq
> -rw-r--r-- 1 user nogroup  17K 2011-08-10 16:59 log.00000015.20110810-165912680+0200.445206038676079.seq
> -rw-r--r-- 1 user nogroup 9.6K 2011-08-10 16:59 log.00000015.20110810-165922719+0200.445216078556307.seq
> -rw-r--r-- 1 user nogroup 8.8K 2011-08-10 16:59 log.00000015.20110810-165932726+0200.445226085601122.seq
> -rw-r--r-- 1 user nogroup 2.8K 2011-08-10 16:59 log.00000015.20110810-165942732+0200.445236091341764.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:59 log.00000015.20110810-165952737+0200.445246096329911.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170002741+0200.445256100500909.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170012745+0200.445266104587375.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170022769+0200.445276127976521.seq
> dfo_sending:
> total 20K
> drwxr-xr-x 2 user nogroup 4.0K 2011-08-10 16:58 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  11K 2011-08-10 16:58 log.00000015.20110810-165822652+0200.445156010693063.seq
> dfo_writing:
> total 12K
> drwxr-xr-x 2 user nogroup 4.0K 2011-08-10 17:00 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170032774+0200.445286133232984.seq
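
The stall is visible in the log above: once an event replayed from the DFO journal reaches the inner roll(), StubbornAppendSink reports "Event already had an event with attribute rolltag" and InsistentAppendDecorator backs off indefinitely, so the files parked in dfo_logged are never drained. A minimal Python sketch of the suspected mechanism (all names and the tag value are made up for illustration; this is not Flume's actual code):

```python
# Hypothetical sketch of the suspected failure, not Flume's actual code.
# RollSink refuses to re-tag an event that already carries "rolltag", and
# events replayed from the DFO journal were tagged by the first roll().

class RolltagCollision(Exception):
    pass

def roll_append(event):
    """Stand-in for RollSink.append: tag the event, fail if already tagged."""
    if "rolltag" in event:
        raise RolltagCollision("Event already had an event with attribute rolltag")
    tagged = dict(event)
    tagged["rolltag"] = "log.00000056.placeholder"  # made-up tag value
    return tagged

def mask(event, attr="rolltag"):
    """Stand-in for the mask() decorator: drop the attribute, pass event on."""
    stripped = dict(event)
    stripped.pop(attr, None)
    return stripped

# Primary path tags the event; HDFS is down, so the tagged event is journaled.
journaled = roll_append({"body": b"payload"})

# Failover drain without mask() in front of the inner roll(): the stale tag
# collides, every append fails, and insistentAppend just keeps backing off.
try:
    roll_append(journaled)
    collided = False
except RolltagCollision:
    collided = True

# With mask("rolltag") before the inner roll(), the replayed event is
# re-tagged cleanly and the append can succeed.
drained = roll_append(mask(journaled))
```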

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira

        

[jira] [Commented] (FLUME-735) Using diskFailover with roll yields unexpected behavior

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/FLUME-735?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13083189#comment-13083189 ] 

Jonathan Hsieh commented on FLUME-735:
--------------------------------------

So we have a workaround now. I'll try to spend some time adding this to the FAQ as a recipe, and possibly consider adding a new collector type that simplifies this.

04:11 <igno> jmhsieh, good news, I managed to get it working with: < mask("rolltag") roll(1500) { escapedCustomDfs("hdfs") } ? mask("rolltag") diskFailover insistentAppend stubbornAppend insistentOpen mask("rolltag") roll(1500) { escapedCustomDfs("hdfs") } >
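
The ordering in that config matters because each sink decorator wraps the one to its right: in the working spec, every roll() sees events whose stale rolltag has already been stripped by the mask("rolltag") in front of it. A hypothetical Python sketch of that composition (placeholder names and tag values; not Flume's API):

```python
# Hypothetical composition sketch of the working config, not Flume's API.
# Each roll() is preceded by a mask("rolltag") that strips the stale tag an
# event picked up before it was journaled by diskFailover.

def mask_rolltag(next_sink):
    def append(event):
        cleaned = dict(event)
        cleaned.pop("rolltag", None)  # strip stale tag before re-rolling
        return next_sink(cleaned)
    return append

def roll(next_sink):
    def append(event):
        if "rolltag" in event:
            raise ValueError("Event already had an event with attribute rolltag")
        return next_sink(dict(event, rolltag="fresh-tag"))  # made-up tag
    return append

def escaped_custom_dfs(event):
    return event  # stand-in for the terminal escapedCustomDfs sink

# mask("rolltag") roll(1500) { escapedCustomDfs("hdfs") } -- failover branch
branch = mask_rolltag(roll(escaped_custom_dfs))

# An event replayed from the DFO journal still carries the old rolltag;
# the mask strips it, so the inner roll can tag it afresh instead of failing.
out = branch({"body": b"payload", "rolltag": "stale"})
```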

> Using diskFailover with roll yields unexpected behavior
> -------------------------------------------------------
>
>                 Key: FLUME-735
>                 URL: https://issues.apache.org/jira/browse/FLUME-735
>             Project: Flume
>          Issue Type: Bug
>          Components: Sinks+Sources
>    Affects Versions: v0.9.3
>         Environment: Flume 0.9.3 on Debian Lenny
>            Reporter: Erik Jutemar
>
> Trying to use diskFailover and roll with the config < roll(1500) { escapedCustomDfs("hdfs") } ? diskFailover mask("rolltag") insistentAppend stubbornAppend insistentOpen roll(1500) { escapedCustomDfs("hdfs") } > saves data to disk if hdfs is unavailable. However when hdfs comes back, the data is never sent.
> Scenario:
> Continuously sending data.
> Shutting down hdfs.
> Enable hdfs.
> Stop sending data.
> Wait a while.
> Shut down flume.
> This generates the following log:
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: double close 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,557 INFO com.cloudera.flume.handlers.rolling.RollSink: Created RollSink: trigger=[TimeTrigger: maxAge=10000 tagger=com.cloudera.flume.handlers.rolling.ProcessTagger@da0225b] checkPeriodMs = 250 spec='NaiveFileFailover'
> 2011-08-10 16:57:42,558 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'NaiveFileFailover'
> 2011-08-10 16:57:42,561 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:42,591 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:42,652 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
> 2011-08-10 16:57:42,655 INFO com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 0e7005136e4160ed4cc157c4ddd7f4f1c6e11ffa]
> 2011-08-10 16:57:43,152 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,890 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,891 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,891 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:44,379 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:44,389 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:46,380 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:46,389 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:52,636 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,636 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,637 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:57:52,638 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,712 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@11c7865b
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:02,643 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,678 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@67071c84
> 2011-08-10 16:58:12,646 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,646 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,647 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:12,648 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,678 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@b3e15f7
> 2011-08-10 16:58:15,905 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165815689+0200.445149048504058.seq
> 2011-08-10 16:58:16,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:17,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:18,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:19,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:20,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:21,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:22,651 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,651 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,652 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:22,652 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,676 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@4c6504bc
> 2011-08-10 16:58:22,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:23,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:24,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:25,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:25,918 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:27,908 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:27,909 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:27,909 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:27,914 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165827910+0200.445161269239691.seq
> 2011-08-10 16:58:28,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:29,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:30,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:31,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:32,657 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,657 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,658 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:32,658 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,683 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:32,684 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:32,685 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000016.20110810-165832683+0200.445166041918942.seq
> 2011-08-10 16:58:32,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:33,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:34,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:35,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:36,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:37,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:37,928 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:38,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:39,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:40,911 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:40,912 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:40,912 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:40,915 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165840912+0200.445174271130316.seq
> 2011-08-10 16:58:40,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:41,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:42,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:43,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:44,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:45,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:46,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:47,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:47,940 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:47,940 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:47,941 INFO com.cloudera.flume.handlers.rolling.RollSink: TriggerThread shutdown
> 2011-08-10 16:58:47,941 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:47,943 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:47,944 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 0 failed, backoff (1000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:48,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:48,944 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:48,944 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:48,944 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:48,946 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:48,947 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:48,947 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 1 failed, backoff (2000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:49,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:50,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:50,948 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:50,948 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:50,948 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:50,966 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:50,967 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:50,967 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 2 failed, backoff (4000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:51,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:52,668 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:52,668 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:52,669 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:58:52,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:53,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:54,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:54,968 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:54,968 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:54,968 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:54,969 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:54,970 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:54,970 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 3 failed, backoff (8000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:55,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:56,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:57,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:57,955 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:02,916 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,916 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:02,917 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,919 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165902917+0200.445196275742099.seq
> 2011-08-10 16:59:02,970 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:02,970 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,971 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:02,971 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,972 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:02,972 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 4 failed, backoff (16000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:03,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:59:04,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:59:05,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:59:06,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:59:07,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:59:08,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:59:09,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:59:10,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:59:11,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:59:12,679 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:12,679 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:12,680 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:12,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:59:12,930 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:18,973 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:18,975 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:18,976 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 5 failed, backoff (32000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:21,921 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:21,922 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:21,922 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:21,968 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165921922+0200.445215281259212.seq
> 2011-08-10 16:59:22,719 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:22,719 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:22,720 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:22,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:59:23,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:59:24,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:59:25,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:59:26,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:59:27,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:59:28,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:59:29,304 INFO com.cloudera.flume.core.BackOffFailOverSink: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create file/path/2011-08-10-1700/part-log.00000010.20110810-165921922+0200.445215281259212.seq. Name node is in safe mode.
> The ratio of reported blocks 0.0000 has not reached the threshold 0.9990. Safe mode will be turned off automatically.
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1019)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:989)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:377)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:961)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:957)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:955)
> 2011-08-10 16:59:32,726 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:32,726 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:32,727 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:45,926 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:45,927 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:45,927 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:45,929 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:45,964 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:47,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:47,938 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:49,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:49,934 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:50,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:50,933 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:50,976 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:50,978 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:50,978 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 6 failed, backoff (60000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,741 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,741 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,742 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,745 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,745 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,746 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,768 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,769 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,769 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,779 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,779 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,780 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170042779+0200.445296138546943.seq
> 2011-08-10 17:00:45,294 INFO com.cloudera.flume.watchdog.Watchdog: Watchdog shutdown hook
> This results in the following files being written to dfo_* :
> ls -lah *
> dfo_error:
> total 8.0K
> drwxr-xr-x 2 user nogroup 4.0K 2011-07-18 11:11 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> dfo_import:
> total 8.0K
> drwxr-xr-x 2 user nogroup 4.0K 2011-07-18 11:11 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> dfo_logged:
> total 120K
> drwxr-xr-x 2 user nogroup  12K 2011-08-10 17:00 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  12K 2011-08-10 16:58 log.00000015.20110810-165832658+0200.445166016694026.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:58 log.00000015.20110810-165842664+0200.445176023166287.seq
> -rw-r--r-- 1 user nogroup  19K 2011-08-10 16:59 log.00000015.20110810-165852669+0200.445186027755160.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:59 log.00000015.20110810-165902675+0200.445196034139189.seq
> -rw-r--r-- 1 user nogroup  17K 2011-08-10 16:59 log.00000015.20110810-165912680+0200.445206038676079.seq
> -rw-r--r-- 1 user nogroup 9.6K 2011-08-10 16:59 log.00000015.20110810-165922719+0200.445216078556307.seq
> -rw-r--r-- 1 user nogroup 8.8K 2011-08-10 16:59 log.00000015.20110810-165932726+0200.445226085601122.seq
> -rw-r--r-- 1 user nogroup 2.8K 2011-08-10 16:59 log.00000015.20110810-165942732+0200.445236091341764.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:59 log.00000015.20110810-165952737+0200.445246096329911.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170002741+0200.445256100500909.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170012745+0200.445266104587375.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170022769+0200.445276127976521.seq
> dfo_sending:
> total 20K
> drwxr-xr-x 2 user nogroup 4.0K 2011-08-10 16:58 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  11K 2011-08-10 16:58 log.00000015.20110810-165822652+0200.445156010693063.seq
> dfo_writing:
> total 12K
> drwxr-xr-x 2 user nogroup 4.0K 2011-08-10 17:00 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170032774+0200.445286133232984.seq

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira

        

[jira] [Commented] (FLUME-735) Using diskFailover with roll yields unexpected behavior

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/FLUME-735?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13082451#comment-13082451 ] 

Jonathan Hsieh commented on FLUME-735:
--------------------------------------

The main goal of the specified configuration was to have a single node that listens for data and, via disk failover, spools to disk if hdfs goes down.

One odd issue is that the mask("rolltag") on the failover side doesn't seem to succeed at removing the rolltag attribute after the diskFailover, specifically on this line:

2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag 
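For readability, here is the sink spec under discussion laid out one decorator per line (this is only a reformatting of the reporter's config, with the "hdfs" placeholder kept as-is; the role noted for mask("rolltag") is my reading of the intent, not confirmed behavior):

```
< roll(1500) { escapedCustomDfs("hdfs") }
  ? diskFailover
      mask("rolltag")
      insistentAppend stubbornAppend insistentOpen
      roll(1500) { escapedCustomDfs("hdfs") } >
```

The primary roll(1500) writes to HDFS; on failure, diskFailover spools events to local disk, mask("rolltag") is presumably meant to strip the outer roll's rolltag so the inner roll(1500) can re-tag replayed events, and the insistent/stubborn decorators retry the inner HDFS path. The "Event already had an event with attribute rolltag" errors above suggest that stripping is not happening on the replay path.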

> Using diskFailover with roll yields unexpected behavior
> -------------------------------------------------------
>
>                 Key: FLUME-735
>                 URL: https://issues.apache.org/jira/browse/FLUME-735
>             Project: Flume
>          Issue Type: Bug
>          Components: Sinks+Sources
>    Affects Versions: v0.9.3
>         Environment: Flume 0.9.3 on Debian Lenny
>            Reporter: Erik Jutemar
>
> Trying to use diskFailover and roll with the config < roll(1500) { escapedCustomDfs("hdfs") } ? diskFailover mask("rolltag") insistentAppend stubbornAppend insistentOpen roll(1500) { escapedCustomDfs("hdfs") } > saves data to disk if hdfs is unavailable. However when hdfs comes back, the data is never sent.
> Scenario:
> Continuously sending data.
> Shutting down hdfs.
> Enable hdfs.
> Stop sending data.
> Wait a while.
> Shut down flume.
> This generates the following log:
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: double close 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,384 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:57:42,557 INFO com.cloudera.flume.handlers.rolling.RollSink: Created RollSink: trigger=[TimeTrigger: maxAge=10000 tagger=com.cloudera.flume.handlers.rolling.ProcessTagger@da0225b] checkPeriodMs = 250 spec='NaiveFileFailover'
> 2011-08-10 16:57:42,558 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'NaiveFileFailover'
> 2011-08-10 16:57:42,561 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:42,591 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:42,652 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
> 2011-08-10 16:57:42,655 INFO com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 0e7005136e4160ed4cc157c4ddd7f4f1c6e11ffa]
> 2011-08-10 16:57:43,152 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,890 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,891 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000010.20110810-165742384+0200.445115743357603.seq
> 2011-08-10 16:57:43,891 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:44,379 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:44,389 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165743890+0200.445117248899762.seq
> 2011-08-10 16:57:45,558 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:46,380 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:46,389 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1600/part-log.00000014.20110810-165745558+0200.445118916703188.seq
> 2011-08-10 16:57:47,085 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:57:52,636 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,636 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,637 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:57:52,638 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000010.20110810-165742560+0200.445115919474417.seq
> 2011-08-10 16:57:52,712 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@11c7865b
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,642 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:02,643 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165752637+0200.445125995972908.seq
> 2011-08-10 16:58:02,678 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@67071c84
> 2011-08-10 16:58:12,646 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,646 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,647 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:12,648 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165802642+0200.445136001273694.seq
> 2011-08-10 16:58:12,678 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@b3e15f7
> 2011-08-10 16:58:15,905 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000014.20110810-165815689+0200.445149048504058.seq
> 2011-08-10 16:58:16,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:17,908 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:18,909 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:19,910 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:20,911 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:21,912 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:22,651 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,651 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,652 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:22,652 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165812647+0200.445146005897644.seq
> 2011-08-10 16:58:22,676 INFO com.cloudera.flume.agent.diskfailover.DiskFailoverSource: end of file com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager$StateChangeDeco@4c6504bc
> 2011-08-10 16:58:22,913 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:23,914 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:24,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:25,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:25,918 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:27,908 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:27,909 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:27,909 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:27,914 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165827910+0200.445161269239691.seq
> 2011-08-10 16:58:28,916 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:29,917 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:30,918 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:31,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:32,657 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,657 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,658 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:32,658 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: opening new file for log.00000015.20110810-165822652+0200.445156010693063.seq
> 2011-08-10 16:58:32,683 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:32,684 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:32,685 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000016.20110810-165832683+0200.445166041918942.seq
> 2011-08-10 16:58:32,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:33,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:34,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:35,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:36,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:37,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:37,928 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:38,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:39,930 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:40,911 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:40,912 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:40,912 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:40,915 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1600/part-log.00000010.20110810-165840912+0200.445174271130316.seq
> 2011-08-10 16:58:40,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:41,932 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165832658+0200.445166016694026.seq
> 2011-08-10 16:58:42,664 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:42,933 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:43,934 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:44,935 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:45,936 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:46,938 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:47,939 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:47,940 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:58:47,940 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:47,941 INFO com.cloudera.flume.handlers.rolling.RollSink: TriggerThread shutdown
> 2011-08-10 16:58:47,941 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:47,943 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:47,944 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 0 failed, backoff (1000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:48,941 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:58:48,944 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:48,944 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:48,944 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:48,946 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:48,947 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:48,947 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 1 failed, backoff (2000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:49,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:58:50,943 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:58:50,948 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:50,948 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:50,948 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:50,966 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:50,967 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:50,967 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 2 failed, backoff (4000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:51,944 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:58:52,668 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:52,668 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165842664+0200.445176023166287.seq
> 2011-08-10 16:58:52,669 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:58:52,945 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:58:53,947 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:58:54,949 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:58:54,968 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:58:54,968 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:54,968 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:58:54,969 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:58:54,970 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:58:54,970 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 3 failed, backoff (8000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:58:55,950 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:58:56,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:58:57,952 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:58:57,955 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165852669+0200.445186027755160.seq
> 2011-08-10 16:59:02,675 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:02,916 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,916 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:02,917 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,919 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165902917+0200.445196275742099.seq
> 2011-08-10 16:59:02,970 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:02,970 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,971 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:02,971 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:02,972 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:02,972 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 4 failed, backoff (16000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:03,920 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:59:04,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:59:05,922 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:59:06,923 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:59:07,924 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:59:08,925 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:59:09,926 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:59:10,927 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 7 time(s).
> 2011-08-10 16:59:11,928 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 8 time(s).
> 2011-08-10 16:59:12,679 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:12,679 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165902675+0200.445196034139189.seq
> 2011-08-10 16:59:12,680 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:12,929 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 9 time(s).
> 2011-08-10 16:59:12,930 INFO com.cloudera.flume.core.BackOffFailOverSink: Call to host-vip-1/10.0.62.210:9000 failed on connection exception: java.net.ConnectException: Connection refused
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:18,973 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:18,973 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:18,975 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:18,976 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 5 failed, backoff (32000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:21,921 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:21,922 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:21,922 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:21,968 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165921922+0200.445215281259212.seq
> 2011-08-10 16:59:22,719 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:22,719 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165912680+0200.445206038676079.seq
> 2011-08-10 16:59:22,720 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:22,969 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 0 time(s).
> 2011-08-10 16:59:23,970 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 1 time(s).
> 2011-08-10 16:59:24,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 2 time(s).
> 2011-08-10 16:59:25,972 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 3 time(s).
> 2011-08-10 16:59:26,974 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 4 time(s).
> 2011-08-10 16:59:27,975 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 5 time(s).
> 2011-08-10 16:59:28,976 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: host-vip-1/10.0.62.210:9000. Already tried 6 time(s).
> 2011-08-10 16:59:29,304 INFO com.cloudera.flume.core.BackOffFailOverSink: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create file/path/2011-08-10-1700/part-log.00000010.20110810-165921922+0200.445215281259212.seq. Name node is in safe mode.
> The ratio of reported blocks 0.0000 has not reached the threshold 0.9990. Safe mode will be turned off automatically.
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1019)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:989)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:377)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:961)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:957)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:955)
> 2011-08-10 16:59:32,726 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:32,726 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165922719+0200.445216078556307.seq
> 2011-08-10 16:59:32,727 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165932726+0200.445226085601122.seq
> 2011-08-10 16:59:42,732 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:45,926 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:45,927 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:45,927 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:45,929 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:45,964 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000010.20110810-165945928+0200.445239286795624.seq
> 2011-08-10 16:59:47,429 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:47,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:47,938 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165947429+0200.445240787825417.seq
> 2011-08-10 16:59:49,121 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:49,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:49,934 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165949121+0200.445242480222436.seq
> 2011-08-10 16:59:50,727 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:50,927 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:50,933 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Creating HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.debug.StubbornAppendSink: Event already had an event with attribute rolltag
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink 'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:50,976 WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread interrupted
> 2011-08-10 16:59:50,976 INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink  'escapedCustomDfs( "hdfs://host-vip-1:9000//legacy/incoming/%{module}/%Y-%m-%{dateday}-%{datehr}00/", "part-%{rolltag}", "raw" )'
> 2011-08-10 16:59:50,978 INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator: Opened Roll on try 0
> 2011-08-10 16:59:50,978 INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator: append attempt 6 failed, backoff (60000ms): Event already had an event with attribute rolltag
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Closing hdfs://host-vip-1:9000//path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: Closing HDFS file: hdfs://host-vip-1:9000/path/2011-08-10-1700/part-log.00000056.20110810-165950727+0200.445244085712823.seq
> 2011-08-10 16:59:52,278 INFO com.cloudera.flume.handlers.hdfs.CustomDfsSink: done writing raw file to hdfs
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165942732+0200.445236091341764.seq
> 2011-08-10 16:59:52,737 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,741 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,741 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-165952737+0200.445246096329911.seq
> 2011-08-10 17:00:02,742 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,745 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,745 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170002741+0200.445256100500909.seq
> 2011-08-10 17:00:12,746 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,768 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,769 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170012745+0200.445266104587375.seq
> 2011-08-10 17:00:22,769 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170022769+0200.445276127976521.seq
> 2011-08-10 17:00:32,774 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,779 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: closed /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,779 INFO com.cloudera.flume.agent.diskfailover.NaiveFileFailoverManager: File lives in /tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170032774+0200.445286133232984.seq
> 2011-08-10 17:00:42,780 INFO com.cloudera.flume.handlers.hdfs.SeqfileEventSink: constructed new seqfile event sink: file=/tmp/var/flume/journal/flume/AGG/dfo_writing/log.00000015.20110810-170042779+0200.445296138546943.seq
> 2011-08-10 17:00:45,294 INFO com.cloudera.flume.watchdog.Watchdog: Watchdog shutdown hook
> This results in the following files being written to the dfo_* directories:
> ls -lah *
> dfo_error:
> total 8.0K
> drwxr-xr-x 2 user nogroup 4.0K 2011-07-18 11:11 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> dfo_import:
> total 8.0K
> drwxr-xr-x 2 user nogroup 4.0K 2011-07-18 11:11 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> dfo_logged:
> total 120K
> drwxr-xr-x 2 user nogroup  12K 2011-08-10 17:00 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  12K 2011-08-10 16:58 log.00000015.20110810-165832658+0200.445166016694026.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:58 log.00000015.20110810-165842664+0200.445176023166287.seq
> -rw-r--r-- 1 user nogroup  19K 2011-08-10 16:59 log.00000015.20110810-165852669+0200.445186027755160.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:59 log.00000015.20110810-165902675+0200.445196034139189.seq
> -rw-r--r-- 1 user nogroup  17K 2011-08-10 16:59 log.00000015.20110810-165912680+0200.445206038676079.seq
> -rw-r--r-- 1 user nogroup 9.6K 2011-08-10 16:59 log.00000015.20110810-165922719+0200.445216078556307.seq
> -rw-r--r-- 1 user nogroup 8.8K 2011-08-10 16:59 log.00000015.20110810-165932726+0200.445226085601122.seq
> -rw-r--r-- 1 user nogroup 2.8K 2011-08-10 16:59 log.00000015.20110810-165942732+0200.445236091341764.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 16:59 log.00000015.20110810-165952737+0200.445246096329911.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170002741+0200.445256100500909.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170012745+0200.445266104587375.seq
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170022769+0200.445276127976521.seq
> dfo_sending:
> total 20K
> drwxr-xr-x 2 user nogroup 4.0K 2011-08-10 16:58 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  11K 2011-08-10 16:58 log.00000015.20110810-165822652+0200.445156010693063.seq
> dfo_writing:
> total 12K
> drwxr-xr-x 2 user nogroup 4.0K 2011-08-10 17:00 .
> drwxr-xr-x 7 user nogroup 4.0K 2011-07-18 11:11 ..
> -rw-r--r-- 1 user nogroup  125 2011-08-10 17:00 log.00000015.20110810-170032774+0200.445286133232984.seq
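A toy model of what the log above suggests (this is NOT Flume source code, and names here are illustrative): events replayed from the disk-failover store appear to still carry a "rolltag" attribute, the inner roll sink refuses to overwrite an existing attribute, so every retry fails with "Event already had an event with attribute rolltag" and backs off, leaving the backed-up data stranded in dfo_logged. Whether mask("rolltag") is actually being applied on this path is the open question — the config includes it, yet the error persists.

```python
# Toy model of the suspected decorator interaction (hypothetical, not Flume code).
# Assumption: the roll sink stamps each event with a fresh "rolltag" attribute and
# raises if one is already present; mask("rolltag") is supposed to strip the stale
# tag from replayed events before they reach the inner roll sink.

class AttributeExists(Exception):
    pass

def roll_sink_append(event):
    # The roll sink tags events so output files can be named part-%{rolltag}.
    if "rolltag" in event:
        raise AttributeExists("Event already had an event with attribute rolltag")
    event["rolltag"] = "log.00000056"
    return event

def mask(event, attr):
    # mask(attr) removes the attribute so the event can be re-tagged downstream.
    return {k: v for k, v in event.items() if k != attr}

# An event replayed from the disk-failover store still carries its old rolltag:
replayed = {"body": b"payload", "rolltag": "log.00000042"}

# Without masking, every retry fails -> data stays in dfo_logged forever:
try:
    roll_sink_append(dict(replayed))
    failed = None
except AttributeExists as e:
    failed = str(e)

# With the mask applied first, the append succeeds and the event is re-tagged:
ok = roll_sink_append(mask(replayed, "rolltag"))
```

If this model matches the real behavior, the bug would be that the mask decorator is bypassed (or applied in the wrong position) on the failover replay path, so the retry loop can never make progress.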

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira