Posted to users@kafka.apache.org by "Preston, Dale" <Da...@conocophillips.com> on 2017/11/06 13:59:26 UTC

RE: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

(Answering now from my work email so please don't be confused.)

The topic already exists.

-----Original Message-----
From: Jaikiran Pai [mailto:jai.forums2013@gmail.com] 
Sent: Sunday, November 5, 2017 10:56 PM
To: users@kafka.apache.org
Subject: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

Is the topic to which the message is being produced, already present or is it auto created?

-Jaikiran


On 05/11/17 3:43 PM, Dale wrote:
> I am using the 0.9.0.0 log4j appender for Kafka because I have a lot of apps dependent on log4j 1.2.x that cannot be upgraded to use newer versions of log4j.  It appears that the appender has since become part of the log4j codebase in later versions of both tools.
>
> When I start my test app, the first message takes an exact and consistent 60 seconds plus a couple milliseconds to go out.  The second message takes right around 200 milliseconds, and all the messages after that take a couple of milliseconds.  The timing from message 1 to 2 could be tolerated but the 60 seconds will never work since the production use case app would typically run for 20 to 30 seconds.
>
> For testing, I brought the appender code into my project and added some additional console messages so I could see what is going on.  Here’s a snippet of the console output:
>
> ********START LOG SNIPPET***********
> G:\kafkademoworkspace\testlog4jgenerator>java -Dlog4j.debug -Dlog4j.configuration=file:///g:\kafkademoworkspace\testlog4jgenerator\log4j.properties -cp .\;G:\kafkademoworkspace\testlog4jgenerator\target\testlog4jgenerator.jar;g:\kafkademoworkspace\testlog4jgenerator\target\libs\log4j-1.2.17.jar;g:\kafkademoworkspace\testlog4jgenerator\target\libs\*;g:\kafkademoworkspace\testlog4jgenerator\target\libs\kafka-clients-0.9.0.0.jar com.mydomainname.messaging.testlog4jgenerator.LogGenerator
> log4j: Using URL [file:/g:/kafkademoworkspace/testlog4jgenerator/log4j.properties] for automatic log4j configuration.
> log4j: Reading configuration from URL file:/g:/kafkademoworkspace/testlog4jgenerator/log4j.properties
> log4j: Parsing for [root] with value=[DEBUG,file,KAFKA].
> log4j: Level token is [DEBUG].
> log4j: Category root set to DEBUG
> log4j: Parsing appender named "file".
> log4j: Parsing layout options for "file".
> log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c{1}:%L - %m%n].
> log4j: End of parsing for "file".
> log4j: Setting property [file] to [/apps/logs/logtest.log].
> log4j: Setting property [maxBackupIndex] to [10].
> log4j: Setting property [maxFileSize] to [10MB].
> log4j: setFile called: /apps/logs/logtest.log, true
> log4j: setFile ended
> log4j: Parsed "file" options.
> log4j: Parsing appender named "KAFKA".
> log4j: Parsing layout options for "KAFKA".
> log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c{1}:%L - %m%n].
> log4j: End of parsing for "KAFKA".
> log4j: Setting property [compressionType] to [none].
> log4j: Setting property [topic] to [test].
> log4j: Setting property [brokerList] to [localhost:9092].
> log4j: Setting property [syncSend] to [false].
> DPLOG: 2017-11-05T09:56:16.072Z - in Producer - creating new KafkaProducer
> log4j: Kafka producer connected to localhost:9092
> log4j: Logging for topic: test
> log4j: Parsed "KAFKA" options.
> log4j: Finished configuring.
> ****************************************************************
> DPLOG: 2017-11-05T09:56:16.338Z - append START
> DPLOG: 2017-11-05T09:56:16.339Z - after subAppend.  Message is: 2017-11-05 03:56:16,333 DEBUG Sender:123 - Starting Kafka producer I/O thread.
>
> log4j: [Sun Nov 05 03:56:16 CST 2017]2017-11-05 03:56:16,333 DEBUG Sender:123 - Starting Kafka producer I/O thread.
>
> DPLOG: 2017-11-05T09:56:16.342Z - getting ready to send to producer.
> DPLOG: 2017-11-05T09:57:16.347Z - after send to producer.
> DPLOG: 2017-11-05T09:57:16.348Z - append END
> ****************************************************************
> DPLOG: 2017-11-05T09:57:16.352Z - append START
> DPLOG: 2017-11-05T09:57:16.353Z - after subAppend.  Message is: 2017-11-05 03:56:16,338 INFO  root:36 - Logging message: x=0
>
> log4j: [Sun Nov 05 03:56:16 CST 2017]2017-11-05 03:56:16,338 INFO  root:36 - Logging message: x=0
>
> DPLOG: 2017-11-05T09:57:16.361Z - getting ready to send to producer.
> DPLOG: 2017-11-05T09:57:16.526Z - after send to producer.
> DPLOG: 2017-11-05T09:57:16.526Z - append END
> ****************************************************************
> DPLOG: 2017-11-05T09:57:16.527Z - append START
> DPLOG: 2017-11-05T09:57:16.528Z - after subAppend.  Message is: 2017-11-05 03:57:16,527 INFO  root:36 - Logging message: x=1
>
> log4j: [Sun Nov 05 03:57:16 CST 2017]2017-11-05 03:57:16,527 INFO  root:36 - Logging message: x=1
>
> DPLOG: 2017-11-05T09:57:16.529Z - getting ready to send to producer.
> DPLOG: 2017-11-05T09:57:16.530Z - after send to producer.
> DPLOG: 2017-11-05T09:57:16.530Z - append END
> ****************************************************************
> DPLOG: 2017-11-05T09:57:16.531Z - append START
> DPLOG: 2017-11-05T09:57:16.531Z - after subAppend.  Message is: 2017-11-05 03:57:16,531 INFO  root:36 - Logging message: x=2
>
> log4j: [Sun Nov 05 03:57:16 CST 2017]2017-11-05 03:57:16,531 INFO  root:36 - Logging message: x=2
>
> DPLOG: 2017-11-05T09:57:16.532Z - getting ready to send to producer.
> DPLOG: 2017-11-05T09:57:16.533Z - after send to producer.
> DPLOG: 2017-11-05T09:57:16.535Z - append END
> ****************************************************************
> DPLOG: 2017-11-05T09:57:16.536Z - append START
> DPLOG: 2017-11-05T09:57:16.536Z - after subAppend.  Message is: 2017-11-05 03:57:16,536 INFO  root:36 - Logging message: x=3
>
> log4j: [Sun Nov 05 03:57:16 CST 2017]2017-11-05 03:57:16,536 INFO  root:36 - Logging message: x=3
>
> DPLOG: 2017-11-05T09:57:16.538Z - getting ready to send to producer.
> DPLOG: 2017-11-05T09:57:16.538Z - after send to producer.
> DPLOG: 2017-11-05T09:57:16.538Z - append END
> ****************************************************************
> DPLOG: 2017-11-05T09:57:16.539Z - append START
> DPLOG: 2017-11-05T09:57:16.540Z - after subAppend.  Message is: 2017-11-05 03:57:16,539 INFO  root:36 - Logging message: x=4
>
> log4j: [Sun Nov 05 03:57:16 CST 2017]2017-11-05 03:57:16,539 INFO  root:36 - Logging message: x=4
>
> DPLOG: 2017-11-05T09:57:16.541Z - getting ready to send to producer.
> DPLOG: 2017-11-05T09:57:16.542Z - after send to producer.
> DPLOG: 2017-11-05T09:57:16.542Z - append END
> ****************************************************************
> ********END LOG SNIPPET***********
>
>
>
> The code, modified for logging, for sending to the producer is:
> ********START CODE SNIPPET***********
> System.out.println("DPLOG: " + Instant.now().toString() + " - getting ready to send to producer.");
> Future<RecordMetadata> response = producer.send(new ProducerRecord<byte[], byte[]>(topic, message.getBytes()));
> System.out.println("DPLOG: " + Instant.now().toString() + " - after send to producer.");
> ********END CODE SNIPPET***********
>
> I have not been able to find any timeout of 60 seconds (60000 ms) anywhere, but it is clear that something is taking exactly 60 seconds at the start of the run.  Whatever the timeout is, it is not failing, because the messages are delivered to Kafka.  I've searched the KafkaProducer code and the Kafka server configuration for any 60-second timeout reference and cannot find one.  In the few examples of log output using the producer that I could find online, the "Starting Kafka producer I/O thread" entry is followed by the next log entries within the same second, so the 60-second delay does not always appear to be there, but it is always there in my case.  Any ideas what could be causing it and/or how I can get rid of it?
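[Editor's note: the 60-second figure can be checked straight from the DPLOG timestamps bracketing the first send. A quick java.time calculation (a sketch, not part of the original post; the timestamps are copied from the log snippet above) shows the gap is 60 seconds plus 5 ms — which, for what it's worth, matches the 0.9 producer client's default max.block.ms of 60000 ms.]

```java
import java.time.Duration;
import java.time.Instant;

public class DelayCheck {
    public static void main(String[] args) {
        // Timestamps copied from the DPLOG lines around the first send
        Instant beforeSend = Instant.parse("2017-11-05T09:56:16.342Z");
        Instant afterSend  = Instant.parse("2017-11-05T09:57:16.347Z");
        long millis = Duration.between(beforeSend, afterSend).toMillis();
        System.out.println(millis + " ms stall on the first send"); // prints "60005 ms stall on the first send"
    }
}
```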
>
> Thanks,
>
> Dale
>


RE: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

Posted by "Preston, Dale" <Da...@conocophillips.com>.
I have found a solution to the problem.  To give credit, the fix came from this: https://stackoverflow.com/questions/35628706/kafka-log4j-appender-0-9-does-not-work.  His problem may not have been exactly the same as mine, but following his solution fixed my issue.

The problem appears to come from attaching the KafkaAppender to the rootLogger.  I moved the appender to a new, dedicated logger, first by just changing the assignment in the log4j properties, and then again with programmatic appender configuration.  Both approaches worked.  This is the log4j.properties:

log4j.rootLogger=DEBUG,file
log4j.logger.kLogger=DEBUG,KAFKA
# Direct log messages to a log file
log4j.appender.file=org.apache.log4j.RollingFileAppender
# RotatingFile Appender
log4j.appender.file.File=/apps/logs/logtest.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=10
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c{1}:%L - %m%n
# Kafka Appender
log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
log4j.appender.KAFKA.brokerList=localhost:9092
log4j.appender.KAFKA.compressionType=none
log4j.appender.KAFKA.syncSend=false
log4j.appender.KAFKA.Topic=test
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c{1}:%L - %m%n
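[Editor's note, an assumption not stated in the thread: in log4j 1.x a named logger inherits the root logger's appenders through additivity, so events routed to the Kafka-bound logger will also land in the root file appender unless additivity is turned off for it:]

```
# Hypothetical addition: keep Kafka-bound events from also reaching the root appenders
log4j.additivity.kLogger=false
```

Application code then logs through the named logger explicitly, e.g. Logger.getLogger("kLogger").info("..."), rather than through the root logger.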

-----Original Message-----
From: Preston, Dale [mailto:Dale.Preston@conocophillips.com] 
Sent: Monday, November 6, 2017 10:46 AM
To: users@kafka.apache.org
Subject: RE: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

Thanks, Jaikiran.  I will do that today or tonight.

Dale

-----Original Message-----
From: Jaikiran Pai [mailto:jai.forums2013@gmail.com]
Sent: Monday, November 6, 2017 10:43 AM
To: users@kafka.apache.org
Subject: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

Can you take a couple of thread dumps with an interval of around 5 seconds each when that 60 second delay occurs? You can use a tool like jstack to do that. That might give some hint on what’s going on.

-Jaikiran


RE: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

Posted by "Preston, Dale" <Da...@conocophillips.com>.
Thanks, Jaikiran.  I will do that today or tonight.

Dale

-----Original Message-----
From: Jaikiran Pai [mailto:jai.forums2013@gmail.com] 
Sent: Monday, November 6, 2017 10:43 AM
To: users@kafka.apache.org
Subject: [EXTERNAL]Re: 0.9.0.0 Log4j appender slow startup

Can you take a couple of thread dumps with an interval of around 5 seconds each when that 60 second delay occurs? You can use a tool like jstack to do that. That might give some hint on what’s going on.

-Jaikiran


Re: 0.9.0.0 Log4j appender slow startup

Posted by Jaikiran Pai <ja...@gmail.com>.
Can you take a couple of thread dumps with an interval of around 5 seconds
each when that 60 second delay occurs? You can use a tool like jstack to do
that. That might give some hint on what’s going on.
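[Editor's note: the thread-dump suggestion above can be scripted with jstack from the JDK. A sketch — the PID is a placeholder; find the real one with jps:]

```shell
PID=12345   # placeholder: replace with the Java process id reported by `jps`
for i in 1 2; do
  jstack -l "$PID" > "threaddump_$i.txt" 2>/dev/null || true
  sleep 5
done
# Compare the two dumps to see which thread stays blocked across the 60-second stall
```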

-Jaikiran

> >
> > log4j: [Sun Nov 05 03:57:16 CST 2017]2017-11-05 03:57:16,536 INFO
> > root:36 - Logging message: x=3
> >
> > DPLOG: 2017-11-05T09:57:16.538Z - getting ready to send to producer.
> > DPLOG: 2017-11-05T09:57:16.538Z - after send to producer.
> > DPLOG: 2017-11-05T09:57:16.538Z - append END
> > ****************************************************************
> > DPLOG: 2017-11-05T09:57:16.539Z - append START
> > DPLOG: 2017-11-05T09:57:16.540Z - after subAppend.  Message is:
> > 2017-11-05 03:57:16,539 INFO  root:36 - Logging message: x=4
> >
> > log4j: [Sun Nov 05 03:57:16 CST 2017]2017-11-05 03:57:16,539 INFO
> > root:36 - Logging message: x=4
> >
> > DPLOG: 2017-11-05T09:57:16.541Z - getting ready to send to producer.
> > DPLOG: 2017-11-05T09:57:16.542Z - after send to producer.
> > DPLOG: 2017-11-05T09:57:16.542Z - append END
> > ****************************************************************
> > ********END LOG SNIPPET***********
> >
> >
> >
> > The code for sending to the producer, modified to add logging, is:
> > ********START CODE SNIPPET***********
> > System.out.println("DPLOG: " + Instant.now().toString()
> >         + " - getting ready to send to producer.");
> > Future<RecordMetadata> response = producer.send(
> >         new ProducerRecord<byte[], byte[]>(topic, message.getBytes()));
> > System.out.println("DPLOG: " + Instant.now().toString()
> >         + " - after send to producer.");
> > ********END CODE SNIPPET***********
> >
> > I have not been able to find any timeout of 60 seconds (60000 ms)
> > anywhere, but it is clear that something taking exactly 60 seconds
> > happens at the start of the run. Whatever the timeout is, it does not
> > fail the send, because the messages do reach Kafka. I have searched
> > the KafkaProducer code and the Kafka server configuration for any
> > 60-second timeout reference and cannot find one. In the few examples
> > of producer log output I could find online, the "Starting Kafka
> > producer I/O thread" entry is followed by the next log entries within
> > the same second, so the 60-second delay does not always appear, but it
> > is always there in my case. Any ideas what could be causing it and/or
> > how I can get rid of it?
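> >
> > One way to narrow this down would be to override candidate client-side
> > timeouts when constructing the producer and see whether the initial
> > delay shrinks. A hedged sketch follows; "max.block.ms" (which in the
> > 0.9 producer caps how long send() may block waiting for metadata, with
> > a documented 60000 ms default) seems worth testing first, but treat the
> > property names below as assumptions to verify against the producer
> > configuration docs:

```java
import java.util.Properties;

public class ProducerTimeoutProbe {
    // Build producer properties with a candidate timeout shortened, so a
    // 60-second client-side default would surface as a much earlier
    // failure instead of a silent 60 s wait. Property names are taken
    // from the 0.9 producer configuration and should be double-checked.
    static Properties probeProps(String brokerList) {
        Properties props = new Properties();
        props.put("bootstrap.servers", brokerList);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        // Shorten the suspected 60000 ms default to 5 s for the experiment.
        props.put("max.block.ms", "5000");
        return props;
    }

    public static void main(String[] args) {
        // The properties would normally be passed to new KafkaProducer<>(...);
        // here we only print the override to keep the sketch broker-free.
        Properties p = probeProps("localhost:9092");
        System.out.println("max.block.ms=" + p.getProperty("max.block.ms"));
    }
}
```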
> >
> > Thanks,
> >
> > Dale
> >
>
>