Posted to dev@phoenix.apache.org by "larsh@apache.org" <la...@apache.org> on 2021/03/02 01:46:40 UTC
Running tests locally
My apologies if this is a dumb question - I've been working on other projects for a while and am now coming back to some Phoenix work.
I'm trying to simply run mvn test -Dtest=LocalIndexIT locally.
And the test always fails with:
[ERROR] org.apache.phoenix.end2end.index.LocalIndexIT Time elapsed: 39.412 s <<< ERROR!
java.lang.RuntimeException: java.io.IOException: Shutting down
at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:549)
at org.apache.phoenix.query.BaseTest.setUpTestCluster(BaseTest.java:449)
at org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:435)
at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:517)
at org.apache.phoenix.end2end.index.BaseLocalIndexIT.doSetup(BaseLocalIndexIT.java:65)
And in the logs I see this:
2021-03-01 17:37:56,572 ERROR [master/think:0:becomeActiveMaster] org.slf4j.helpers.MarkerIgnoringBase(159): Failed to become active master
java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was expected
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:536)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.access$400(FanOutOneBlockAsyncDFSOutputHelper.java:112)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$8.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:616)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$8.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:611)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:624)
at org.apache.hadoop.hbase.io.asyncfs.AsyncFSOutputHelper.createOutput(AsyncFSOutputHelper.java:53)
at org.apache.hadoop.hbase.regionserver.wal.AsyncProtobufLogWriter.initOutput(AsyncProtobufLogWriter.java:180)
at org.apache.hadoop.hbase.regionserver.wal.AbstractProtobufLogWriter.init(AbstractProtobufLogWriter.java:166)
at org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createAsyncWriter(AsyncFSWALProvider.java:113)
at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:662)
at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:130)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:848)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:551)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.init(AbstractFSWAL.java:492)
at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:161)
at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:63)
at org.apache.hadoop.hbase.wal.WALFactory.getWAL(WALFactory.java:296)
at org.apache.hadoop.hbase.master.region.MasterRegion.createWAL(MasterRegion.java:187)
at org.apache.hadoop.hbase.master.region.MasterRegion.bootstrap(MasterRegion.java:207)
at org.apache.hadoop.hbase.master.region.MasterRegion.create(MasterRegion.java:307)
at org.apache.hadoop.hbase.master.region.MasterRegionFactory.create(MasterRegionFactory.java:104)
at org.apache.hadoop.hbase.master.HMaster.finishActiveMasterInitialization(HMaster.java:827)
at org.apache.hadoop.hbase.master.HMaster.startActiveMasterManager(HMaster.java:2082)
at org.apache.hadoop.hbase.master.HMaster.lambda$run$0(HMaster.java:506)
at java.base/java.lang.Thread.run(Thread.java:834)
So it looks like it's pulling the wrong version of Hadoop when running the tests.
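One quick way to check which Hadoop artifacts the build actually resolves is the standard maven-dependency-plugin tree goal (a sketch, not from the thread; the command is printed as a string here rather than executed, since it needs a Phoenix checkout):

```shell
# Sketch: inspect which Hadoop version the Phoenix build pulls in.
# dependency:tree is the standard maven-dependency-plugin goal; the
# -Dincludes filter narrows the output to org.apache.hadoop artifacts.
check_cmd='mvn dependency:tree -Dincludes=org.apache.hadoop:*'
echo "$check_cmd"
```

Running it in the checkout shows whether hadoop-hdfs resolves to a 2.x or 3.x version.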
Am I the only one seeing this?
As I said, my apologies if I'm missing something - it has been a while.
Cheers.
-- Lars
Re: Running tests locally
Posted by "larsh@apache.org" <la...@apache.org>.
I see. Thanks. I thought at least the tests would work.
On Monday, March 1, 2021, 10:00:16 PM PST, Viraj Jasani <vj...@apache.org> wrote:
Lars, the IncompatibleClassChangeError is expected unless
HBase 2.x is built specifically with the Hadoop 3.0 profile (-Dhadoop.profile=3.0).
https://github.com/apache/phoenix/blob/master/BUILDING.md
Re: Running tests locally
Posted by Viraj Jasani <vj...@apache.org>.
Lars, the IncompatibleClassChangeError is expected unless
HBase 2.x is built specifically with the Hadoop 3.0 profile (-Dhadoop.profile=3.0).
https://github.com/apache/phoenix/blob/master/BUILDING.md
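Concretely, the sequence this advice implies (the profile flag is taken from BUILDING.md; the exact test invocation back in Phoenix is an assumption) would look roughly like:

```shell
# Sketch of the fix described above: rebuild HBase 2.x locally against
# Hadoop 3, then re-run the Phoenix test. The commands are printed rather
# than executed here, since each needs its own full source checkout.
hbase_build='mvn clean install -DskipTests -Dhadoop.profile=3.0'  # in the HBase 2.x checkout
phoenix_test='mvn test -Dtest=LocalIndexIT'                       # back in the Phoenix checkout
printf '%s\n%s\n' "$hbase_build" "$phoenix_test"
```

With the locally rebuilt Hadoop-3 HBase artifacts in the local Maven repository, the minicluster no longer hits the HdfsFileStatus interface/class mismatch.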
Re: Running tests locally
Posted by "larsh@apache.org" <la...@apache.org>.
AHA...
Looks like hadoop-ci has been failing with the same exception since Feb 18th; PHOENIX-6359 looks like the culprit.
So it's not me.