Posted to user@flume.apache.org by 荆棘鸟 <li...@qq.com> on 2012/08/02 11:47:35 UTC
About My Flume Requirement
Hello everyone,
I am going to write a custom Flume source, but I have a small problem. I want to monitor a source path, but that path contains many different log files. I would like to merge log files of the same kind while keeping different kinds separate, and finally send these log files to Hadoop HDFS through the sink configuration. I would like to discuss a good approach with everyone.
Thanks very much!
My name: Yanzhi Liu.
Re: About My Flume Requirement
Posted by Denny Ye <de...@gmail.com>.
hi Yanzhi,
May I guess your requirement as "sending the original files to HDFS, organized
by path"? If my understanding is correct, a possible solution may look like
this:
When you get an Event from the source (log files), you could set the log file
name or another identifier in the Event header, possibly the folder as well.
Then the HDFS path can be set with a pattern like '/topfolder/${your log file
folder/name}/'. Thus, different log files go into different HDFS files.
Could that satisfy your requirement?
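As a minimal sketch of the header-setting side: Flume's Event exposes its headers as a mutable Map<String, String> via getHeaders(), so a custom source can tag each event with the name of the log file it came from. The helper below is hypothetical; its rotation-suffix stripping is one possible way to make rotated copies of the same log (access.log.1, access.log.2, ...) share a single header value, so they are merged into the same HDFS path while different logs stay separate:

```java
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

public class HeaderDemo {
    // Hypothetical helper: derive a header value identifying the
    // originating log file. Rotation suffixes such as ".1" or
    // ".2012-08-02" are stripped so rotated copies of the same log
    // share one header value (and thus one HDFS target path).
    static String logFileKey(String path) {
        String name = Paths.get(path).getFileName().toString();
        return name.replaceAll("\\.(\\d[\\d.-]*)$", "");
    }

    public static void main(String[] args) {
        // In a real custom source this map would be event.getHeaders().
        Map<String, String> headers = new HashMap<>();
        headers.put("logfile", logFileKey("/var/log/app/access.log.2"));
        System.out.println(headers.get("logfile")); // access.log
    }
}
```

Inside an actual source you would call event.getHeaders().put("logfile", logFileKey(path)) before handing the event to the channel.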
-Regards
Denny Ye
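For reference, here is a sketch of what the agent configuration could look like, assuming the custom source puts the originating file name into a header named "logfile". The agent, source, channel, sink, and class names below are all hypothetical; note that the Flume HDFS sink substitutes event headers into hdfs.path using the %{...} escape syntax:

```properties
# Hypothetical agent "a1" with a custom source and an HDFS sink.
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Assumed custom source that monitors the log directory and sets
# the "logfile" header on every event it emits.
a1.sources.r1.type = com.example.MyLogDirSource
a1.sources.r1.channels = c1

a1.channels.c1.type = memory

# %{logfile} is replaced by the header value at write time, so events
# from different log files land in different HDFS directories.
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = hdfs://namenode/topfolder/%{logfile}
```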
2012/8/2 荆棘鸟 <li...@qq.com>
> Hello everyone,
> I am going to write a custom Flume source, but I have a small problem. I
> want to monitor a source path, but that path contains many different log
> files. I would like to merge log files of the same kind while keeping
> different kinds separate, and finally send these log files to Hadoop HDFS
> through the sink configuration. I would like to discuss a good approach
> with everyone.
> Thanks very much!
> My name: Yanzhi Liu.
>