Posted to log4j-user@logging.apache.org by Praveen Sripati <pr...@gmail.com> on 2013/10/23 12:37:58 UTC
Not able to get the events from Log4J into Flume
Hi,
I am trying to get events from Log4J into HDFS through Flume using the
Log4J appender. I created two appenders, FILE and flume. The FILE appender
works, but with the flume appender the program just hangs in Eclipse. I
don't see any exception, only the following in log.out.
Batch size string = null
Using Netty bootstrap options: {tcpNoDelay=true, connectTimeoutMillis=20000}
Connecting to localhost/127.0.0.1:41414
[id: 0x52a00770] OPEN
Flume itself works properly: I am able to send messages to the Avro source
using the Avro client and see them in HDFS. But it is not getting
integrated with Log4J. How do I get around this problem?
-----
Here is the Java program
import java.io.IOException;
import java.sql.SQLException;

import org.apache.log4j.Logger;

public class log4jExample {

    // Root logger, so it writes to both the FILE and flume appenders
    static Logger log = Logger.getRootLogger();

    public static void main(String[] args) throws IOException, SQLException {
        log.debug("Hello this is a debug message");
    }
}
-----
Here is the log4j.properties
# Define the root logger with appender file
log = /home/vm4learning/WorkSpace/BigData/Log4J-Example/log
log4j.rootLogger = DEBUG, FILE, flume
# Define the file appender
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.File=${log}/log.out
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%m%n
# Define the flume appender
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414
log4j.appender.flume.UnsafeMode = false
log4j.appender.flume.layout=org.apache.log4j.PatternLayout
log4j.appender.flume.layout.ConversionPattern=%m%n
-----
Here are the dependencies
<classpathentry kind="lib" path="flume-ng-log4jappender-1.4.0.jar"/>
<classpathentry kind="lib" path="log4j-1.2.17.jar"/>
<classpathentry kind="lib" path="flume-ng-sdk-1.4.0.jar"/>
<classpathentry kind="lib" path="avro-1.7.3.jar"/>
<classpathentry kind="lib" path="netty-3.4.0.Final.jar"/>
<classpathentry kind="lib" path="avro-ipc-1.7.3.jar"/>
<classpathentry kind="lib" path="slf4j-api-1.6.1.jar"/>
<classpathentry kind="lib" path="slf4j-log4j12-1.6.1.jar"/>
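These are plain jars on the Eclipse build path; for anyone using Maven, the
equivalent coordinates should be roughly the following (group IDs from memory,
not verified; flume-ng-log4jappender normally pulls most of the others in
transitively):

<!-- illustrative Maven equivalents of the jars above -->
<dependency>
  <groupId>org.apache.flume.flume-ng-clients</groupId>
  <artifactId>flume-ng-log4jappender</artifactId>
  <version>1.4.0</version>
</dependency>
<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>1.2.17</version>
</dependency>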
Thanks,
Praveen
Re: Not able to get the events from Log4J into Flume
Posted by Ralph Goers <rg...@apache.org>.
You appear to be using log4j 1.x, which does not come with a Flume appender. The Flume project provides one, so if you are having problems with it you would need to raise the issue with them. Log4j 2, however, does include a Flume Appender.
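For example, a minimal Log4j 2 configuration using its Flume Appender, pointed at your existing Avro source, would look roughly like this (an untested sketch; host, port, and pattern taken from your setup):

<Configuration status="warn">
  <Appenders>
    <!-- Flume Appender sending events to the Avro source -->
    <Flume name="flume" compress="false">
      <Agent host="localhost" port="41414"/>
      <PatternLayout pattern="%m%n"/>
    </Flume>
  </Appenders>
  <Loggers>
    <Root level="debug">
      <AppenderRef ref="flume"/>
    </Root>
  </Loggers>
</Configuration>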
Ralph
Re: Not able to get the events from Log4J into Flume
Posted by Praveen Sripati <pr...@gmail.com>.
Hi,
If it helps, I ran the program in debug mode, and when it hung I suspended
the thread and took the stack trace below. I tried to look into the code,
but I am not sure why the program hangs with the flume appender.
Daemon Thread [Avro NettyTransceiver I/O Worker 1] (Suspended)
    Logger(Category).callAppenders(LoggingEvent) line: 205
    Logger(Category).forcedLog(String, Priority, Object, Throwable) line: 391
    Logger(Category).log(String, Priority, Object, Throwable) line: 856
    Log4jLoggerAdapter.debug(String) line: 209
    NettyTransceiver$NettyClientAvroHandler.handleUpstream(ChannelHandlerContext, ChannelEvent) line: 491
    DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline$DefaultChannelHandlerContext, ChannelEvent) line: 564
    DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(ChannelEvent) line: 792
    NettyTransportCodec$NettyFrameDecoder(SimpleChannelUpstreamHandler).channelBound(ChannelHandlerContext, ChannelStateEvent) line: 166
    NettyTransportCodec$NettyFrameDecoder(SimpleChannelUpstreamHandler).handleUpstream(ChannelHandlerContext, ChannelEvent) line: 98
    DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline$DefaultChannelHandlerContext, ChannelEvent) line: 564
    DefaultChannelPipeline.sendUpstream(ChannelEvent) line: 559
    Channels.fireChannelBound(Channel, SocketAddress) line: 199
    NioWorker$RegisterTask.run() line: 191
    NioWorker(AbstractNioWorker).processRegisterTaskQueue() line: 329
    NioWorker(AbstractNioWorker).run() line: 235
    NioWorker.run() line: 38
    DeadLockProofWorker$1.run() line: 42
    ThreadPoolExecutor.runWorker(ThreadPoolExecutor$Worker) line: 1145
    ThreadPoolExecutor$Worker.run() line: 615
    Thread.run() line: 744
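From the trace it looks like the Avro/Netty client's own DEBUG logging is being
routed back through the root logger, and therefore into the flume appender,
while the transceiver is still being set up. As an experiment (not verified
yet), I am thinking of keeping the Flume, Avro and Netty internal loggers out
of the flume appender, along these lines:

# Keep Flume/Avro/Netty internals on the FILE appender only, so their
# DEBUG output never re-enters the flume appender during connection setup
log4j.logger.org.apache.flume = INFO, FILE
log4j.additivity.org.apache.flume = false
log4j.logger.org.apache.avro = INFO, FILE
log4j.additivity.org.apache.avro = false
log4j.logger.org.jboss.netty = INFO, FILE
log4j.additivity.org.jboss.netty = false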
Thanks,
Praveen