Posted to user@hbase.apache.org by Kristoffer Sjögren <st...@gmail.com> on 2013/12/16 11:01:27 UTC
Guava 15
Hi
At the moment HFileWriterV2.close breaks at startup when using Guava 15.
This is not a client problem - it happens because we start a master node to
do integration tests.
This is a bit precarious, and I wonder if there are any plans to support Guava 15,
or if there is a clever way around this?
Cheers,
-Kristoffer
org.apache.hadoop.hbase.DroppedSnapshotException: region: -ROOT-,,0
at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:1646)
at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:1514)
at org.apache.hadoop.hbase.regionserver.HRegion.doClose(HRegion.java:1032)
at org.apache.hadoop.hbase.regionserver.HRegion.close(HRegion.java:980)
at org.apache.hadoop.hbase.regionserver.HRegion.close(HRegion.java:951)
at org.apache.hadoop.hbase.master.MasterFileSystem.bootstrap(MasterFileSystem.java:523)
at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:463)
at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:148)
at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:133)
at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:549)
at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:408)
at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.run(HMasterCommandLine.java:226)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.NoClassDefFoundError: com/google/common/io/NullOutputStream
at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.close(HFileWriterV2.java:375)
at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.close(StoreFile.java:1299)
at org.apache.hadoop.hbase.regionserver.Store.internalFlushCache(Store.java:897)
at org.apache.hadoop.hbase.regionserver.Store.flushCache(Store.java:778)
at org.apache.hadoop.hbase.regionserver.Store$StoreFlusherImpl.flushCache(Store.java:2290)
at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:1621)
... 12 more
Caused by: java.lang.ClassNotFoundException: com.google.common.io.NullOutputStream
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 18 more
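For context on the trace: Guava deprecated com.google.common.io.NullOutputStream in 14.0 (in favor of ByteStreams.nullOutputStream()) and removed it in 15.0, so HBase 0.94's HFileWriterV2.close, which still references the old class, fails at class-load time. The underlying pattern is serializing a structure into a discarding sink just to learn its encoded size. A dependency-free sketch of that pattern, using the JDK's OutputStream.nullOutputStream() (Java 11+) as a stand-in for Guava's class; the writeTrailer fields are hypothetical, not HBase's actual trailer layout:

```java
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Sketch of the pattern behind HFileWriterV2.close: write a structure
// into a sink that discards everything, then read the byte count from
// the wrapping DataOutputStream to learn the serialized size.
public class TrailerSizeDemo {
    // Hypothetical trailer serialization, for illustration only.
    static void writeTrailer(DataOutputStream out) throws IOException {
        out.writeLong(42L);      // e.g. a file offset  (8 bytes)
        out.writeInt(2);         // e.g. a version      (4 bytes)
        out.writeUTF("TRABLK");  // e.g. a magic marker (2-byte length + 6 bytes)
    }

    static int serializedSize() throws IOException {
        // Guava < 15 used: new DataOutputStream(new NullOutputStream())
        DataOutputStream counting =
                new DataOutputStream(OutputStream.nullOutputStream());
        writeTrailer(counting);
        return counting.size();  // bytes written, nothing stored
    }

    public static void main(String[] args) throws IOException {
        System.out.println("trailer size = " + serializedSize() + " bytes");
    }
}
```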
Re: Guava 15
Posted by Pradeep Gollakota <pr...@gmail.com>.
This is kinda tangential, but for very very common dependencies such as
guava, jackson, etc. would it make sense to use a shaded jar so as not to
affect user dependencies?
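The shaded-jar idea above would amount to relocating Guava under an HBase-private package at build time, so the version HBase compiles against can never clash with the user's. A minimal, illustrative maven-shade-plugin sketch (the relocation prefix is a hypothetical choice, not something HBase 0.94 actually ships):

```xml
<!-- Illustrative only: bundle and relocate Guava inside the jar so
     HBase's copy is invisible to user classpaths. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>org.apache.hbase.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```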
On Mon, Dec 16, 2013 at 7:47 PM, Ted Yu <yu...@gmail.com> wrote:
> Please try out patch v2 from HBASE-10174
>
> Thanks
Re: Guava 15
Posted by Ted Yu <yu...@gmail.com>.
Please try out patch v2 from HBASE-10174
Thanks
On Dec 16, 2013, at 11:42 AM, Kristoffer Sjögren <st...@gmail.com> wrote:
> Oh thank you very much Ted! :-)
>
> Ill give it a try tomorrow.
>
> Cheers!
Re: Guava 15
Posted by Kristoffer Sjögren <st...@gmail.com>.
Oh thank you very much Ted! :-)
I'll give it a try tomorrow.
Cheers!
On Mon, Dec 16, 2013 at 6:05 PM, Ted Yu <yu...@gmail.com> wrote:
> I created HBASE-10174 and attached a patch there.
>
> Running 0.94 test suite now.
Re: Guava 15
Posted by Ted Yu <yu...@gmail.com>.
I created HBASE-10174 and attached a patch there.
Running 0.94 test suite now.
On Mon, Dec 16, 2013 at 7:05 AM, Nicolas Liochon <nk...@gmail.com> wrote:
> That means more or less backporting the patch to the 0.94, no?
> It should work imho.
Re: Guava 15
Posted by Nicolas Liochon <nk...@gmail.com>.
That means more or less backporting the patch to the 0.94 branch, no?
It should work imho.
On Mon, Dec 16, 2013 at 3:16 PM, Kristoffer Sjögren <st...@gmail.com> wrote:
> Thanks! But we cant really upgrade to HBase 0.96 right now, but we need to
> go to Guava 15 :-(
>
> I was thinking of overriding the classes fixed in the patch in our test
> environment.
>
> Could this work maybe?
Re: Guava 15
Posted by Kristoffer Sjögren <st...@gmail.com>.
Thanks! But we can't really upgrade to HBase 0.96 right now, and we need to
go to Guava 15 :-(
I was thinking of overriding the classes fixed in the patch in our test
environment.
Could this work maybe?
On Mon, Dec 16, 2013 at 11:01 AM, Kristoffer Sjögren <st...@gmail.com> wrote:
> Hi
>
> At the moment HFileWriterV2.close breaks at startup when using Guava 15.
> This is not a client problem - it happens because we start a master node to
> do integration tests.
>
> A bit precarious and wonder if there are any plans to support Guava 15, or
> if there are clever way around this?
>
> Cheers,
> -Kristoffer
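The idea of overriding the patched classes in the test environment can work because the missing class is only looked up at runtime: a shim class under the original package name, kept on the test classpath ahead of (or alongside) Guava 15, lets HBase 0.94 load it again. A sketch of that workaround, with the file path hypothetical and the approach not HBase-endorsed, plus a small loadability check to confirm which classpath wins:

```java
// The shim itself would live in the test sources, e.g. (hypothetical path)
// src/test/java/com/google/common/io/NullOutputStream.java:
//
//   package com.google.common.io;
//
//   import java.io.OutputStream;
//
//   public final class NullOutputStream extends OutputStream {
//       @Override public void write(int b) { /* discard */ }
//       @Override public void write(byte[] b, int off, int len) { /* discard */ }
//   }
//
// Quick diagnostic: check whether a class is resolvable before running
// the integration tests, so a missing shim fails fast and visibly.
public class GuavaShimCheck {
    static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("NullOutputStream present: "
                + isLoadable("com.google.common.io.NullOutputStream"));
    }
}
```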
Re: Guava 15
Posted by Nicolas Liochon <nk...@gmail.com>.
Hi,
It's fixed in HBase 0.96 (by HBASE-9667).
Cheers,
Nicolas
On Mon, Dec 16, 2013 at 11:01 AM, Kristoffer Sjögren <st...@gmail.com> wrote:
> Hi
>
> At the moment HFileWriterV2.close breaks at startup when using Guava 15.
> This is not a client problem - it happens because we start a master node to
> do integration tests.
>
> A bit precarious and wonder if there are any plans to support Guava 15, or
> if there are clever way around this?
>
> Cheers,
> -Kristoffer