Posted to user@flume.apache.org by Nitin Pawar <ni...@gmail.com> on 2012/09/18 10:32:08 UTC
doubt on hdfs sink
hello,
I have a working Flume setup that writes into HDFS.
I am using Flume 1.2.0.
I have following settings.
agent1.sinks.HDFS.hdfs.file.Type = SequenceFile
agent1.sinks.HDFS.hdfs.writeFormat = Text
agent1.sinks.hdfs.rollInterval = 86400
agent1.channels.MemoryChannel-2.type = memory
My doubt is: why does Flume create one HDFS file per event? I
want it to write a single HDFS file per day for the log.
Can someone please help me find out what I have done wrong?
Thanks
--
Nitin Pawar
Re: doubt on hdfs sink
Posted by Nitin Pawar <ni...@gmail.com>.
I got this working by setting all three roll properties:
rollInterval
rollCount
rollSize
Also realized that rollSize applies to the uncompressed data.
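For reference, a minimal sketch of sink settings that should yield one file per day. This assumes Flume 1.2.x, where rollCount and rollSize also trigger rolls unless explicitly set to 0; note also that the canonical property name is hdfs.fileType (not hdfs.file.Type), and that rollInterval must be scoped to the sink name:

```properties
# Sketch, assuming a sink named HDFS as in the original config.
# Setting rollCount and rollSize to 0 disables the event-count and
# byte-size triggers, so only the 24-hour time-based roll fires.
agent1.sinks.HDFS.type = hdfs
agent1.sinks.HDFS.hdfs.fileType = SequenceFile
agent1.sinks.HDFS.hdfs.writeFormat = Text
agent1.sinks.HDFS.hdfs.rollInterval = 86400
agent1.sinks.HDFS.hdfs.rollCount = 0
agent1.sinks.HDFS.hdfs.rollSize = 0
```

Without the last two lines, the defaults (roll every 10 events or 1024 bytes) keep producing many small files regardless of rollInterval.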
Thanks,
Nitin
On Tue, Sep 18, 2012 at 2:02 PM, Nitin Pawar <ni...@gmail.com> wrote:
> hello,
>
> I have a working Flume setup that writes into HDFS.
> I am using Flume 1.2.0.
>
> I have following settings.
>
> agent1.sinks.HDFS.hdfs.file.Type = SequenceFile
> agent1.sinks.HDFS.hdfs.writeFormat = Text
> agent1.sinks.hdfs.rollInterval = 86400
> agent1.channels.MemoryChannel-2.type = memory
>
> My doubt is: why does Flume create one HDFS file per event? I
> want it to write a single HDFS file per day for the log.
> Can someone please help me find out what I have done wrong?
>
> Thanks
>
> --
> Nitin Pawar
--
Nitin Pawar