Posted to common-user@hadoop.apache.org by Mike Anderson <mi...@mit.edu> on 2009/08/20 01:32:02 UTC

syslog-ng and hadoop

Has anybody had any luck setting up the log4j.properties file to send logs
to a syslog-ng server?
My log4j.properties excerpt:
log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
log4j.appender.SYSLOG.syslogHost=10.0.20.164
log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
log4j.appender.SYSLOG.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
log4j.appender.SYSLOG.Facility=HADOOP

and my syslog-ng.conf file running on 10.0.20.164

source s_hadoop {
        # message generated by Syslog-NG
        internal();
        # standard Linux log source (this is the default place for the syslog()
        # function to send logs to)
        unix-stream("/dev/log");
        udp();
};
destination df_hadoop { file("/var/log/hadoop/hadoop.log");};
filter f_hadoop {facility(hadoop);};
log {
source(s_hadoop);
filter(f_hadoop);
destination(df_hadoop);
};


Thanks in advance,
Mike

Re: syslog-ng and hadoop

Posted by mike anderson <sa...@gmail.com>.
I got it working! Fantastic. One thing that hung me up for a while was how
picky the log4j.properties files are about syntax. For future reference,
this is what I used in log4j.properties:
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hadoop.root.logger}, EventCounter, Socket
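
The Socket appender referenced above is not defined in this excerpt; a
minimal definition along the lines of log4j 1.2's SocketAppender might look
roughly like the following (the RemoteHost and Port values are placeholders,
not taken from this thread):

# Hypothetical Socket appender pointing at a remote log4j SocketServer
log4j.appender.Socket=org.apache.log4j.net.SocketAppender
# Placeholder host; port 4560 is the one used in the log4j SocketServer examples
log4j.appender.Socket.RemoteHost=10.0.20.164
log4j.appender.Socket.Port=4560
# Retry the connection every 10 seconds if the server goes away
log4j.appender.Socket.ReconnectionDelay=10000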


On Thu, Aug 20, 2009 at 11:16 AM, Edward Capriolo <ed...@gmail.com>wrote:

> On Thu, Aug 20, 2009 at 10:49 AM, mike anderson<sa...@gmail.com>
> wrote:
> > Yeah, that is interesting Edward. I don't need syslog-ng for any
> particular
> > reason, other than that I'm familiar with it. If there were another way
> to
> > get all my logs collated into one log file that would be great.
> > mike
> >
> > On Thu, Aug 20, 2009 at 10:44 AM, Edward Capriolo <edlinuxguru@gmail.com
> >wrote:
> >
> >> On Wed, Aug 19, 2009 at 11:50 PM, Brian Bockelman<bb...@cse.unl.edu>
> >> wrote:
> >> > Hey Mike,
> >> >
> >> > Yup.  We find the stock log4j needs two things:
> >> >
> >> > 1) Set the rootLogger manually.  The way 0.19.x has the root logger
> set
> >> up
> >> > breaks when adding new appenders.  I.e., do:
> >> >
> >> > log4j.rootLogger=INFO,SYSLOG,console,DRFA,EventCounter
> >> >
> >> > 2) Add the headers; otherwise log4j is not compatible with syslog:
> >> >
> >> > log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
> >> > log4j.appender.SYSLOG.facility=local0
> >> > log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
> >> > log4j.appender.SYSLOG.layout.ConversionPattern=%p %c{2}: %m%n
> >> > log4j.appender.SYSLOG.SyslogHost=red
> >> > log4j.appender.SYSLOG.threshold=ERROR
> >> > log4j.appender.SYSLOG.Header=true
> >> > log4j.appender.SYSLOG.FacilityPrinting=true
> >> >
> >> > Brian
> >> >
> >> > On Aug 19, 2009, at 6:32 PM, Mike Anderson wrote:
> >> >
> >> >> Has anybody had any luck setting up the log4j.properties file to send
> >> logs
> >> >> to a syslog-ng server?
> >> >> My log4j.properties excerpt:
> >> >> log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
> >> >> log4j.appender.SYSLOG.syslogHost=10.0.20.164
> >> >> log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
> >> >> log4j.appender.SYSLOG.layout.ConversionPattern=%d{ISO8601} %p %c:
> %m%n
> >> >> log4j.appender.SYSLOG.Facility=HADOOP
> >> >>
> >> >> and my syslog-ng.conf file running on 10.0.20.164
> >> >>
> >> >> source s_hadoop {
> >> >>       # message generated by Syslog-NG
> >> >>       internal();
> >> >>       # standard Linux log source (this is the default place for the syslog()
> >> >>       # function to send logs to)
> >> >>       unix-stream("/dev/log");
> >> >>       udp();
> >> >> };
> >> >> destination df_hadoop { file("/var/log/hadoop/hadoop.log");};
> >> >> filter f_hadoop {facility(hadoop);};
> >> >> log {
> >> >> source(s_hadoop);
> >> >> filter(f_hadoop);
> >> >> destination(df_hadoop);
> >> >> };
> >> >>
> >> >>
> >> >> Thanks in advance,
> >> >> Mike
> >> >
> >> >
> >>
> >> Mike, slightly off topic, but you can also run a Log4j server, which
> >> transports the messages fired off by Log4j as-is. The Log4j->syslog
> >> route loses or changes some information. If anyone is interested in
> >> this, let me know and I will write something up about it.
> >>
> >
>
> Mike,
> I just put this up for you.
> http://www.edwardcapriolo.com/wiki/en/Log4j_Server
>
> All of the functionality is in the class
> org.apache.log4j.net.SocketServer which ships as part of Log4j.
>
> I pretty much followed this http://timarcher.com/node/10
>
> I started with the syslog appender, but it had some quirks. Mainly, the
> syslog appender can only write a syslog message, so it loses some
> information. The Log4j server transfers the log.error("whatever") call
> as-is, and you can handle it on the server end through the server's
> logging properties. Cool stuff.
>

Re: syslog-ng and hadoop

Posted by Edward Capriolo <ed...@gmail.com>.
On Thu, Aug 20, 2009 at 10:49 AM, mike anderson<sa...@gmail.com> wrote:
> Yeah, that is interesting Edward. I don't need syslog-ng for any particular
> reason, other than that I'm familiar with it. If there were another way to
> get all my logs collated into one log file that would be great.
> mike
>
> On Thu, Aug 20, 2009 at 10:44 AM, Edward Capriolo <ed...@gmail.com>wrote:
>
>> On Wed, Aug 19, 2009 at 11:50 PM, Brian Bockelman<bb...@cse.unl.edu>
>> wrote:
>> > Hey Mike,
>> >
>> > Yup.  We find the stock log4j needs two things:
>> >
>> > 1) Set the rootLogger manually.  The way 0.19.x has the root logger set
>> up
>> > breaks when adding new appenders.  I.e., do:
>> >
>> > log4j.rootLogger=INFO,SYSLOG,console,DRFA,EventCounter
>> >
>> > 2) Add the headers; otherwise log4j is not compatible with syslog:
>> >
>> > log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
>> > log4j.appender.SYSLOG.facility=local0
>> > log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
>> > log4j.appender.SYSLOG.layout.ConversionPattern=%p %c{2}: %m%n
>> > log4j.appender.SYSLOG.SyslogHost=red
>> > log4j.appender.SYSLOG.threshold=ERROR
>> > log4j.appender.SYSLOG.Header=true
>> > log4j.appender.SYSLOG.FacilityPrinting=true
>> >
>> > Brian
>> >
>> > On Aug 19, 2009, at 6:32 PM, Mike Anderson wrote:
>> >
>> >> Has anybody had any luck setting up the log4j.properties file to send
>> logs
>> >> to a syslog-ng server?
>> >> My log4j.properties excerpt:
>> >> log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
>> >> log4j.appender.SYSLOG.syslogHost=10.0.20.164
>> >> log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
>> >> log4j.appender.SYSLOG.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
>> >> log4j.appender.SYSLOG.Facility=HADOOP
>> >>
>> >> and my syslog-ng.conf file running on 10.0.20.164
>> >>
>> >> source s_hadoop {
>> >>       # message generated by Syslog-NG
>> >>       internal();
>> >>       # standard Linux log source (this is the default place for the syslog()
>> >>       # function to send logs to)
>> >>       unix-stream("/dev/log");
>> >>       udp();
>> >> };
>> >> destination df_hadoop { file("/var/log/hadoop/hadoop.log");};
>> >> filter f_hadoop {facility(hadoop);};
>> >> log {
>> >> source(s_hadoop);
>> >> filter(f_hadoop);
>> >> destination(df_hadoop);
>> >> };
>> >>
>> >>
>> >> Thanks in advance,
>> >> Mike
>> >
>> >
>>
>> Mike, slightly off topic, but you can also run a Log4j server, which
>> transports the messages fired off by Log4j as-is. The Log4j->syslog
>> route loses or changes some information. If anyone is interested in
>> this, let me know and I will write something up about it.
>>
>

Mike,
I just put this up for you.
http://www.edwardcapriolo.com/wiki/en/Log4j_Server

All of the functionality is in the class
org.apache.log4j.net.SocketServer which ships as part of Log4j.

I pretty much followed this http://timarcher.com/node/10

I started with the syslog appender, but it had some quirks. Mainly, the
syslog appender can only write a syslog message, so it loses some
information. The Log4j server transfers the log.error("whatever") call
as-is, and you can handle it on the server end through the server's
logging properties. Cool stuff.
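
For anyone who wants to try this before reading the write-up: the server side
is started from the command line, and the log4j 1.2 SocketServer expects a
port, its own config file, and a directory of per-client config files. The jar
name, port, and paths below are illustrative only:

java -classpath log4j-1.2.jar org.apache.log4j.net.SocketServer \
    4560 server.properties /etc/log4j/clients/

server.properties is an ordinary log4j config that decides what the server
does with the incoming events, for example collating everything into one file:

# Illustrative server-side config: write every received event to one file
log4j.rootLogger=DEBUG, R
log4j.appender.R=org.apache.log4j.FileAppender
log4j.appender.R.File=/var/log/hadoop/collated.log
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n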

Re: syslog-ng and hadoop

Posted by mike anderson <sa...@gmail.com>.
Yeah, that is interesting Edward. I don't need syslog-ng for any particular
reason, other than that I'm familiar with it. If there were another way to
get all my logs collated into one log file that would be great.
mike

On Thu, Aug 20, 2009 at 10:44 AM, Edward Capriolo <ed...@gmail.com>wrote:

> On Wed, Aug 19, 2009 at 11:50 PM, Brian Bockelman<bb...@cse.unl.edu>
> wrote:
> > Hey Mike,
> >
> > Yup.  We find the stock log4j needs two things:
> >
> > 1) Set the rootLogger manually.  The way 0.19.x has the root logger set
> up
> > breaks when adding new appenders.  I.e., do:
> >
> > log4j.rootLogger=INFO,SYSLOG,console,DRFA,EventCounter
> >
> > 2) Add the headers; otherwise log4j is not compatible with syslog:
> >
> > log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
> > log4j.appender.SYSLOG.facility=local0
> > log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
> > log4j.appender.SYSLOG.layout.ConversionPattern=%p %c{2}: %m%n
> > log4j.appender.SYSLOG.SyslogHost=red
> > log4j.appender.SYSLOG.threshold=ERROR
> > log4j.appender.SYSLOG.Header=true
> > log4j.appender.SYSLOG.FacilityPrinting=true
> >
> > Brian
> >
> > On Aug 19, 2009, at 6:32 PM, Mike Anderson wrote:
> >
> >> Has anybody had any luck setting up the log4j.properties file to send
> logs
> >> to a syslog-ng server?
> >> My log4j.properties excerpt:
> >> log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
> >> log4j.appender.SYSLOG.syslogHost=10.0.20.164
> >> log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
> >> log4j.appender.SYSLOG.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
> >> log4j.appender.SYSLOG.Facility=HADOOP
> >>
> >> and my syslog-ng.conf file running on 10.0.20.164
> >>
> >> source s_hadoop {
> >>       # message generated by Syslog-NG
> >>       internal();
> >>       # standard Linux log source (this is the default place for the syslog()
> >>       # function to send logs to)
> >>       unix-stream("/dev/log");
> >>       udp();
> >> };
> >> destination df_hadoop { file("/var/log/hadoop/hadoop.log");};
> >> filter f_hadoop {facility(hadoop);};
> >> log {
> >> source(s_hadoop);
> >> filter(f_hadoop);
> >> destination(df_hadoop);
> >> };
> >>
> >>
> >> Thanks in advance,
> >> Mike
> >
> >
>
> Mike, slightly off topic, but you can also run a Log4j server, which
> transports the messages fired off by Log4j as-is. The Log4j->syslog
> route loses or changes some information. If anyone is interested in
> this, let me know and I will write something up about it.
>

Re: syslog-ng and hadoop

Posted by Edward Capriolo <ed...@gmail.com>.
On Wed, Aug 19, 2009 at 11:50 PM, Brian Bockelman<bb...@cse.unl.edu> wrote:
> Hey Mike,
>
> Yup.  We find the stock log4j needs two things:
>
> 1) Set the rootLogger manually.  The way 0.19.x has the root logger set up
> breaks when adding new appenders.  I.e., do:
>
> log4j.rootLogger=INFO,SYSLOG,console,DRFA,EventCounter
>
> 2) Add the headers; otherwise log4j is not compatible with syslog:
>
> log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
> log4j.appender.SYSLOG.facility=local0
> log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
> log4j.appender.SYSLOG.layout.ConversionPattern=%p %c{2}: %m%n
> log4j.appender.SYSLOG.SyslogHost=red
> log4j.appender.SYSLOG.threshold=ERROR
> log4j.appender.SYSLOG.Header=true
> log4j.appender.SYSLOG.FacilityPrinting=true
>
> Brian
>
> On Aug 19, 2009, at 6:32 PM, Mike Anderson wrote:
>
>> Has anybody had any luck setting up the log4j.properties file to send logs
>> to a syslog-ng server?
>> My log4j.properties excerpt:
>> log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
>> log4j.appender.SYSLOG.syslogHost=10.0.20.164
>> log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
>> log4j.appender.SYSLOG.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
>> log4j.appender.SYSLOG.Facility=HADOOP
>>
>> and my syslog-ng.conf file running on 10.0.20.164
>>
>> source s_hadoop {
>>       # message generated by Syslog-NG
>>       internal();
>>       # standard Linux log source (this is the default place for the syslog()
>>       # function to send logs to)
>>       unix-stream("/dev/log");
>>       udp();
>> };
>> destination df_hadoop { file("/var/log/hadoop/hadoop.log");};
>> filter f_hadoop {facility(hadoop);};
>> log {
>> source(s_hadoop);
>> filter(f_hadoop);
>> destination(df_hadoop);
>> };
>>
>>
>> Thanks in advance,
>> Mike
>
>

Mike, slightly off topic, but you can also run a Log4j server, which
transports the messages fired off by Log4j as-is. The Log4j->syslog
route loses or changes some information. If anyone is interested in
this, let me know and I will write something up about it.

Re: syslog-ng and hadoop

Posted by Brian Bockelman <bb...@cse.unl.edu>.
Hey Mike,

Yup.  We find the stock log4j needs two things:

1) Set the rootLogger manually.  The way 0.19.x has the root logger  
set up breaks when adding new appenders.  I.e., do:

log4j.rootLogger=INFO,SYSLOG,console,DRFA,EventCounter

2) Add the headers; otherwise log4j is not compatible with syslog:

log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
log4j.appender.SYSLOG.facility=local0
log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
log4j.appender.SYSLOG.layout.ConversionPattern=%p %c{2}: %m%n
log4j.appender.SYSLOG.SyslogHost=red
log4j.appender.SYSLOG.threshold=ERROR
log4j.appender.SYSLOG.Header=true
log4j.appender.SYSLOG.FacilityPrinting=true
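
As a side note, the syslog-ng configuration on the receiving host has to
accept the local0 facility used above. A sketch adapting Mike's original
config (the listen address and UDP port below are assumptions, not values
from this thread) would be roughly:

# Accept UDP syslog traffic on the standard syslog port
source s_hadoop { udp(ip(0.0.0.0) port(514)); };
# Match the local0 facility set on the log4j side
filter f_hadoop { facility(local0); };
destination df_hadoop { file("/var/log/hadoop/hadoop.log"); };
log { source(s_hadoop); filter(f_hadoop); destination(df_hadoop); };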

Brian

On Aug 19, 2009, at 6:32 PM, Mike Anderson wrote:

> Has anybody had any luck setting up the log4j.properties file to  
> send logs
> to a syslog-ng server?
> My log4j.properties excerpt:
> log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
> log4j.appender.SYSLOG.syslogHost=10.0.20.164
> log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
> log4j.appender.SYSLOG.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
> log4j.appender.SYSLOG.Facility=HADOOP
>
> and my syslog-ng.conf file running on 10.0.20.164
>
> source s_hadoop {
>        # message generated by Syslog-NG
>        internal();
>        # standard Linux log source (this is the default place for the syslog()
>        # function to send logs to)
>        unix-stream("/dev/log");
>        udp();
> };
> destination df_hadoop { file("/var/log/hadoop/hadoop.log");};
> filter f_hadoop {facility(hadoop);};
> log {
> source(s_hadoop);
> filter(f_hadoop);
> destination(df_hadoop);
> };
>
>
> Thanks in advance,
> Mike