Posted to dev@hbase.apache.org by Jean-Marc Spaggiari <je...@spaggiari.org> on 2013/12/14 14:42:53 UTC

HBase 0.96 LogLevel?

Hi there,

I'm trying this tool:
bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt
snappy

And I want to set the log level to debug to see why it fails. But it seems
that it's not taking the log4j.properties into consideration. I tried removing
it, same result. I tried setting it to debug, same result.

Any idea how to change the log level, and why it's not taking our default
config file into consideration?

JM

Re: HBase 0.96 LogLevel?

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Finally, I copied the Hadoop 2.2.0 log4j.properties in place of the HBase
one and got what I was looking for ;)
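
For the record, roughly what I did (the paths are from my setup, so adjust to
yours; the Hadoop 2.2.0 tarball keeps its log4j.properties under etc/hadoop):

cp /home/hbase/conf/log4j.properties /home/hbase/conf/log4j.properties.orig
cp /opt/hadoop-2.2.0/etc/hadoop/log4j.properties /home/hbase/conf/   # adjust to your Hadoop install dir
bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy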

Thanks for your recommendations.



Re: HBase 0.96 LogLevel?

Posted by Ted Yu <yu...@gmail.com>.
TRACE,RFA was used in the command you posted - it overrode DEBUG,console in
your properties file.

I should have mentioned that I got the snippet from
/grid/0/var/log/hbase/hbase.log
where /grid/0/var/log/hbase is the log dir.

You can try specifying TRACE,console
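
Something along these lines should do it (same mechanism as the RFA variant,
just targeting the console appender, so the output goes to stderr instead of
hbase.log):

bin/hbase -Dhbase.root.logger=TRACE,console \
  org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy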

Cheers



Re: HBase 0.96 LogLevel?

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
I have updated the log4j.properties.

Here is the output with log4j debug enabled:
hbase@hbasetest1:~$ bin/hbase  -Dlog4j.debug
org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
log4j: Trying to find [log4j.xml] using context classloader
sun.misc.Launcher$AppClassLoader@713c817.
log4j: Trying to find [log4j.xml] using
sun.misc.Launcher$AppClassLoader@713c817 class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader
sun.misc.Launcher$AppClassLoader@713c817.
log4j: Using URL [file:/home/hbase/conf/log4j.properties] for automatic
log4j configuration.
log4j: Reading configuration from URL file:/home/hbase/conf/log4j.properties
log4j: Hierarchy threshold set to [ALL].
log4j: Parsing for [root] with value=[INFO,console].
log4j: Level token is [INFO].
log4j: Category root set to INFO
log4j: Parsing appender named "console".
log4j: Parsing layout options for "console".
log4j: Setting property [conversionPattern] to [%d{ISO8601} %-5p [%t]
%c{2}: %m%n].
log4j: End of parsing for "console".
log4j: Setting property [target] to [System.err].
log4j: Parsed "console" options.
log4j: Parsing for [SecurityLogger] with value=[INFO,NullAppender].
log4j: Level token is [INFO].
log4j: Category SecurityLogger set to INFO
log4j: Parsing appender named "NullAppender".
log4j: Parsed "NullAppender" options.
log4j: Handling log4j.additivity.SecurityLogger=[false]
log4j: Setting additivity for "SecurityLogger" to false
log4j: Finished configuring.
2013-12-14 10:36:45,985 INFO  [main] Configuration.deprecation:
hadoop.native.lib is deprecated. Instead, use io.native.lib.available
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library
/home/hbase/lib/native/Linux-amd64-64/libhadoop.so which might have
disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c
<libfile>', or link it with '-z noexecstack'.
2013-12-14 10:36:46,577 WARN  [main] util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes where
applicable
2013-12-14 10:36:46,881 INFO  [main] util.ChecksumType: Checksum using
org.apache.hadoop.util.PureJavaCrc32
2013-12-14 10:36:46,883 INFO  [main] util.ChecksumType: Checksum can use
org.apache.hadoop.util.PureJavaCrc32C
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:62)
    at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:131)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:147)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:162)
    at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:312)
    at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:79)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:719)
    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:131)
    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:122)
    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:105)
    at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:426)
    at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:115)
    at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:145)


As you can see, it's reading from the default log4j.properties file.

Inside of which I have this:
# Define some default values that can be overridden by system properties
hbase.root.logger=DEBUG,console
hbase.security.logger=DEBUG,console
hbase.log.dir=.
hbase.log.file=hbase.log
root.logger=DEBUG.console
log4j.rootLogger=DEBUG,console


However, as you can see, no debug info is displayed.
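
A guess on my side: further down, the stock file seems to wire the root logger
through a variable, roughly

hbase.root.logger=INFO,console
log4j.rootLogger=${hbase.root.logger}

and log4j resolves ${hbase.root.logger} from system properties before falling
back to the file, while bin/hbase appears to pass -Dhbase.root.logger=INFO,console
on the java command line by default. If that's right, the DEBUG defaults I
edited above are never consulted and only the -D flag matters.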

Using your parameter seems to be working:
hbase@hbasetest1:~$ bin/hbase  -Dlog4j.debug -Dhbase.root.logger=TRACE,RFA
org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
/home/hbase/bin/../lib/native/Linux-amd64-64
log4j: Trying to find [log4j.xml] using context classloader
sun.misc.Launcher$AppClassLoader@713c817.
log4j: Trying to find [log4j.xml] using
sun.misc.Launcher$AppClassLoader@713c817 class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader
sun.misc.Launcher$AppClassLoader@713c817.
log4j: Using URL [file:/home/hbase/conf/log4j.properties] for automatic
log4j configuration.
log4j: Reading configuration from URL file:/home/hbase/conf/log4j.properties
log4j: Hierarchy threshold set to [ALL].
log4j: Parsing for [root] with value=[TRACE,RFA].
log4j: Level token is [TRACE].
log4j: Category root set to TRACE
log4j: Parsing appender named "RFA".
log4j: Parsing layout options for "RFA".
log4j: Setting property [conversionPattern] to [%d{ISO8601} %-5p [%t]
%c{2}: %m%n].
log4j: End of parsing for "RFA".
log4j: Setting property [maxBackupIndex] to [20].
log4j: Setting property [file] to [/home/hbase/bin/../logs/hbase.log].
log4j: Setting property [maxFileSize] to [256MB].
log4j: setFile called: /home/hbase/bin/../logs/hbase.log, true
log4j: setFile ended
log4j: Parsed "RFA" options.
log4j: Parsing for [SecurityLogger] with value=[INFO,NullAppender].
log4j: Level token is [INFO].
log4j: Category SecurityLogger set to INFO
log4j: Parsing appender named "NullAppender".
log4j: Parsed "NullAppender" options.
log4j: Handling log4j.additivity.SecurityLogger=[false]
log4j: Setting additivity for "SecurityLogger" to false
log4j: Finished configuring.
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library
/home/hbase/lib/native/Linux-amd64-64/libhadoop.so which might have
disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c
<libfile>', or link it with '-z noexecstack'.
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:62)
    at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:131)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:147)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:162)
    at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:312)
    at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:79)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:719)
    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:131)
    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:122)
    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:105)
    at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:426)
    at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:115)
    at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:145)


But still no debug info is displayed. My goal is to see the debug logs from
NativeCodeLoader's static initializer, but so far, no luck...
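
The targeted way, I suppose, would be a per-logger override in log4j.properties,
something like:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG

so that only the NativeCodeLoader static-initializer logging (i.e. why the
native library failed to load) gets bumped to DEBUG without turning everything
else up. But of course that only helps once the root logger/appender actually
points somewhere visible.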



Re: HBase 0.96 LogLevel?

Posted by Ted Yu <yu...@gmail.com>.
How did you set the log level to DEBUG?

I tried the following command and it worked:

hbase -Dhbase.root.logger=DEBUG,RFA
org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy

Snippet of log file:

2013-12-14 15:23:37,853 DEBUG [main] hdfs.BlockReaderLocal: The
short-circuit local reads feature is enabled.
2013-12-14 15:23:37,900 INFO  [main] compress.CodecPool: Got brand-new
decompressor [.snappy]
2013-12-14 15:23:37,901 DEBUG [main] compress.CodecPool: Got recycled
decompressor

Cheers

