Posted to user@flume.apache.org by George Blazer <gb...@gmail.com> on 2015/08/11 20:05:22 UTC

The channel is full, and cannot write data now (Flume 1.5)

Hello,

I'm getting "Channel is full" error because ChannelFillPercentage is nearly
at 100%.

What can I look at?

2015-08-11 17:55:51,173 (pool-5-thread-1) [INFO -
org.apache.flume.client.avro.ReliableSpoolingFileEventReader.readEvents(ReliableSpoolingFileEventReader.java:238)]
Last read was never committed - resetting mark position.
2015-08-11 17:55:54,195 (pool-5-thread-1) [WARN -
org.apache.flume.source.SpoolDirectorySource$SpoolDirectoryRunnable.run(SpoolDirectorySource.java:239)]
*The channel is full, and cannot write data now. The source will try again
after 4000 milliseconds*

curl localhost:5653/metrics
{"SINK.s3-sink":{"BatchCompleteCount":"0","ConnectionFailedCount":"1","EventDrainAttemptCount":"0","ConnectionCreatedCount":"0","Type":"SINK","BatchEmptyCount":"0","ConnectionClosedCount":"0","EventDrainSuccessCount":"0","StopTime":"0","StartTime":"1439315713336","BatchUnderflowCount":"0"},"SOURCE.spooling-directory":{"OpenConnectionCount":"0","Type":"SOURCE","AppendBatchAcceptedCount":"0","AppendBatchReceivedCount":"10","EventAcceptedCount":"0","AppendReceivedCount":"0","StopTime":"0","StartTime":"1439315713378","EventReceivedCount":"1000","AppendAcceptedCount":"0"},"CHANNEL.fileChannel":{"EventPutSuccessCount":"0","ChannelFillPercentage":"99.9956","Type":"CHANNEL","StopTime":"0","EventPutAttemptCount":"450","ChannelSize":"999956","StartTime":"1439315713327","EventTakeSuccessCount":"0","ChannelCapacity":"1000000","EventTakeAttemptCount":"1"}}

My current config is:

agent_messaging.sinks = s3-sink
agent_messaging.sources = spooling-directory
agent_messaging.channels = fileChannel

agent_messaging.channels.fileChannel.type = file
agent_messaging.channels.fileChannel.checkpointDir = /mnt/flume/checkpoint
agent_messaging.channels.fileChannel.dataDirs = /mnt/flume/data

agent_messaging.sources.spooling-directory.channels = fileChannel
agent_messaging.sources.spooling-directory.type = spooldir
agent_messaging.sources.spooling-directory.deletePolicy = immediate
agent_messaging.sources.spooling-directory.spoolDir = /mnt/flume/spool

agent_messaging.sources.spooling-directory.interceptors = sr i1 ts
agent_messaging.sources.spooling-directory.interceptors.sr.type = org.apache.flume.interceptor.SearchAndReplaceInterceptor$Builder
agent_messaging.sources.spooling-directory.interceptors.sr.searchPattern = (\\d{4})/(\\d{2})/(\\d{2}) (\\d{2}):(\\d{2}):(\\d{2})
agent_messaging.sources.spooling-directory.interceptors.sr.replaceString = $1-$2-$3 $4:$5:$6

agent_messaging.sources.spooling-directory.interceptors.i1.type = regex_extractor
agent_messaging.sources.spooling-directory.interceptors.i1.regex = ^(?:\\n)?(\\d\\d\\d\\d-\\d\\d-\\d\\d\\s\\d\\d:\\d\\d:\\d\\d)
agent_messaging.sources.spooling-directory.interceptors.i1.serializers = s1
agent_messaging.sources.spooling-directory.interceptors.i1.serializers.s1.type = org.apache.flume.interceptor.RegexExtractorInterceptorMillisSerializer
agent_messaging.sources.spooling-directory.interceptors.i1.serializers.s1.name = timestamp
agent_messaging.sources.spooling-directory.interceptors.i1.serializers.s1.pattern = yyyy-MM-dd HH:mm:ss

agent_messaging.sources.spooling-directory.interceptors.ts.type = org.apache.flume.interceptor.TimestampInterceptor$Builder
agent_messaging.sources.spooling-directory.interceptors.ts.preserveExisting = true

agent_messaging.sources.spooling-directory.deserializer.maxLineLength = 10000

agent_messaging.sinks.s3-sink.channel = fileChannel
agent_messaging.sinks.s3-sink.type = hdfs
agent_messaging.sinks.s3-sink.hdfs.path = s3n://SECRETS@logs/events/%Y-%m-%d
agent_messaging.sinks.s3-sink.hdfs.filePrefix = prod-role-8599226c
agent_messaging.sinks.s3-sink.hdfs.rollInterval = 0
agent_messaging.sinks.s3-sink.hdfs.rollSize = 67108864
agent_messaging.sinks.s3-sink.hdfs.rollCount = 0
agent_messaging.sinks.s3-sink.hdfs.serializer = TEXT
agent_messaging.sinks.s3-sink.hdfs.fileType = DataStream
agent_messaging.sinks.s3-sink.hdfs.closeTries = 3
agent_messaging.sinks.s3-sink.hdfs.idleTimeout = 0
agent_messaging.sinks.s3-sink.hdfs.callTimeout = 180000

Re: The channel is full, and cannot write data now (Flume 1.5)

Posted by Shady Xu <sh...@gmail.com>.
A Flume source cannot put data into a channel unless there is free space. You
can try increasing the capacity of the channel.
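
For the file channel that means raising capacity (and, if needed,
transactionCapacity) in the agent config. A minimal sketch against the
fileChannel from your config; the numbers are only illustrative and should be
sized to your disk space and expected backlog:

agent_messaging.channels.fileChannel.type = file
agent_messaging.channels.fileChannel.checkpointDir = /mnt/flume/checkpoint
agent_messaging.channels.fileChannel.dataDirs = /mnt/flume/data
# capacity = maximum number of events held in the channel (default 1000000,
# which matches the ChannelCapacity reported by your metrics endpoint)
agent_messaging.channels.fileChannel.capacity = 5000000
# transactionCapacity = maximum events per put/take transaction (default 10000)
agent_messaging.channels.fileChannel.transactionCapacity = 10000

Keep in mind the channel will only stay below capacity if the sink drains it,
so the bigger capacity mainly buys you more buffering headroom.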
