Posted to user@flume.apache.org by yogendra reddy <yo...@gmail.com> on 2015/12/01 11:13:41 UTC

Re: Flume log4j Appender issue

Update:

I ran the Flume agent first and then made the changes to the Hadoop log4j
properties file, and after the restart this started working fine. But now
the Hive service is not coming up because of the avro-ipc jar that I had
to add to the hadoop-hdfs lib to get the Flume log4j appender working.

I would like to know if anybody here has used Flume to copy Hadoop
daemon/service logs.

Thanks,
Yogendra

On Wed, Nov 25, 2015 at 2:03 PM, yogendra reddy <yo...@gmail.com>
wrote:

> Hi All,
>
> I'm trying to configure Flume to write Hadoop service logs to a common
> sink.
>
> Here's what I have added to the hdfs log4j.properties:
>
> # Define the root logger to the system property "hadoop.root.logger".
> log4j.rootLogger=${hadoop.root.logger}, flume
>
> #Flume Appender
> log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
> log4j.appender.flume.Hostname = localhost
> log4j.appender.flume.Port = 41414
>
> and when I run the sample pi job I get this error:
>
> $ hadoop jar hadoop-mapreduce-examples.jar pi 10 10
> log4j:ERROR Could not find value for key log4j.appender.flume.layout
> 15/11/25 07:23:26 WARN api.NettyAvroRpcClient: Using default maxIOWorkers
> log4j:ERROR RPC client creation failed! NettyAvroRpcClient { host:
> localhost, port: 41414 }: RPC connection error
> Exception in thread "main" java.lang.ExceptionInInitializerError
>         at org.apache.hadoop.util.RunJar.run(RunJar.java:200)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: org.apache.commons.logging.LogConfigurationException:
> User-specified log class 'org.apache.commons.logging.impl.Log4JLogger'
> cannot be found or is not useable.
>         at
> org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:804)
>         at
> org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:541)
>         at
> org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:292)
>         at
> org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:269)
>         at
> org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
>         at
> org.apache.hadoop.util.ShutdownHookManager.<clinit>(ShutdownHookManager.java:44)
>         ... 2 more
>
> I have added these jars to the hadoop-hdfs lib:
>
> avro-ipc-1.7.3.jar
> flume-avro-source-1.5.2.2.2.7.1-33.jar
> flume-ng-log4jappender-1.5.2.2.2.7.1-33.jar
> flume-hdfs-sink-1.5.2.2.2.7.1-33.jar
> flume-ng-sdk-1.5.2.2.2.7.1-33.jar
>
> and I do have the commons-logging (commons-logging-1.1.3.jar) and
> log4j (1.2.17) jars present in the hdfs lib. Any pointers to debug this
> issue?
>
> Thanks,
> Yogendra
>
>
>
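
For context on the errors above: the "RPC client creation failed ... RPC
connection error" generally means no Flume agent was listening on
localhost:41414 when the log4j client tried to connect, which matches the
update at the top of this thread (things started working once the agent was
run first). A minimal agent able to receive events from the Log4jAppender
might look like the sketch below (the agent, channel, and sink names are
illustrative, and the logger sink is only for testing):

  a1.sources = r1
  a1.channels = c1
  a1.sinks = k1

  # Avro source matching the appender's Hostname/Port settings
  a1.sources.r1.type = avro
  a1.sources.r1.bind = 0.0.0.0
  a1.sources.r1.port = 41414
  a1.sources.r1.channels = c1

  # In-memory channel and a logger sink, for testing only
  a1.channels.c1.type = memory
  a1.sinks.k1.type = logger
  a1.sinks.k1.channel = c1

The "Could not find value for key log4j.appender.flume.layout" line is
log4j reporting that no layout was configured for the appender; adding
something like

  log4j.appender.flume.layout = org.apache.log4j.PatternLayout

should silence that particular error.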

Re: Flume log4j Appender issue

Posted by Gonzalo Herreros <gh...@gmail.com>.
Ok, I see now.
If you want to collect logs, what I would do is use a spooling directory
source on the namenode instead of messing with the hdfs libs; that sounds
too risky.
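
A spooling-directory setup along those lines might look like the sketch
below (agent name, channel, and paths are illustrative). Note that a
spooldir source expects files to be complete and immutable once they land
in the watched directory, so it suits rotated log files rather than the
live .log file:

  agent.sources = spool
  agent.channels = ch
  agent.sinks = k

  # Watch a directory of rotated Hadoop logs on the namenode
  agent.sources.spool.type = spooldir
  agent.sources.spool.spoolDir = /var/log/hadoop/hdfs/rotated
  agent.sources.spool.channels = ch

  # Durable file channel, delivering to an HDFS sink
  agent.channels.ch.type = file
  agent.sinks.k.type = hdfs
  agent.sinks.k.hdfs.path = /flume/hadoop-logs
  agent.sinks.k.channel = ch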




Re: Flume log4j Appender issue

Posted by yogendra reddy <yo...@gmail.com>.
My use case is to copy the Hadoop service logs, and that is why I'm adding
the Flume log4j appender to the Hadoop log4j properties.

# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hadoop.root.logger}, flume

#Flume Appender
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414

Now, for this to work I need to add the Flume log4j appender related
libraries to the Hadoop classpath, right? If not, this will throw a
ClassNotFoundException for
org.apache.flume.clients.log4jappender.Log4jAppender.
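
One way to satisfy that requirement without copying jars into the
hadoop-hdfs lib directory is to extend the Hadoop classpath from
hadoop-env.sh and point it at the jars where Flume already ships them. A
sketch, assuming a Flume install under /usr/lib/flume (the path and jar
versions will differ per installation):

  # hadoop-env.sh: expose the appender and its client-side dependencies
  # without touching hadoop-hdfs/lib
  export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:/usr/lib/flume/lib/flume-ng-log4jappender-1.5.2.jar:/usr/lib/flume/lib/flume-ng-sdk-1.5.2.jar:/usr/lib/flume/lib/avro-ipc-1.7.3.jar"

Only the client-side jars (the log4j appender, the Flume SDK, and their
Avro/Netty dependencies) should be needed on the Hadoop side; the Flume
source and sink jars belong to the agent, not to the logging client.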



Re: Flume log4j Appender issue

Posted by Gonzalo Herreros <gh...@gmail.com>.
That doesn't sound right:
- Flume should use its own log4j.properties in the conf directory
- Never update the hdfs libs to add stuff you need for Flume; each product
has its own library directories

Regards,
Gonzalo



Re: Flume log4j Appender issue

Posted by yogendra reddy <yo...@gmail.com>.
I didn't follow. I'm adding the Flume libraries to the Hadoop classpath,
i.e. the hadoop-hdfs lib folder, and this is causing the issue. I need
these jars to be in the hdfs lib because I have added the log4j appender
to the hdfs log4j properties.


Re: Flume log4j Appender issue

Posted by Gonzalo Herreros <gh...@gmail.com>.
Adding a library to Flume shouldn't affect Hive or any other tools.
You can add the jar to Flume's lib or plugins.d directories.

Regards,
Gonzalo
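
For reference, Flume NG documents a plugins.d convention under the Flume
home directory for exactly this; a layout for dropping in extra jars might
look roughly like the following (the plugin name is illustrative):

  $FLUME_HOME/plugins.d/
    my-plugin/
      lib/      <- the plugin's own jar(s)
      libext/   <- the plugin's dependency jars
      native/   <- any native libraries, e.g. .so files

Jars placed there are picked up automatically by the flume-ng startup
script, which keeps custom additions out of Flume's own lib directory.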
