Posted to dev@phoenix.apache.org by Mark Jens <ma...@gmail.com> on 2021/11/17 09:33:02 UTC

Test failure on Linux ARM64

Hello,

I am trying to build Phoenix on Ubuntu 20.04.3 ARM64.

Phoenix Core module fails with:

[ERROR] org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache.testMultipleAddsForSingleRegion  Time elapsed: 0.025 s  <<< ERROR!
java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was expected
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:536)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.access$400(FanOutOneBlockAsyncDFSOutputHelper.java:112)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$8.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:616)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$8.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:611)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:624)
at org.apache.hadoop.hbase.io.asyncfs.AsyncFSOutputHelper.createOutput(AsyncFSOutputHelper.java:53)
at org.apache.hadoop.hbase.regionserver.wal.AsyncProtobufLogWriter.initOutput(AsyncProtobufLogWriter.java:180)
at org.apache.hadoop.hbase.regionserver.wal.AbstractProtobufLogWriter.init(AbstractProtobufLogWriter.java:166)
at org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createAsyncWriter(AsyncFSWALProvider.java:113)
at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:669)
at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:130)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:841)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:548)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.init(AbstractFSWAL.java:489)
at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:160)
at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:62)
at org.apache.hadoop.hbase.wal.WALFactory.getWAL(WALFactory.java:296)
at org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache.setUp(TestPerRegionIndexWriteCache.java:109)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
....
[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR]   TestPerRegionIndexWriteCache.setUp:109 » IncompatibleClassChange Found interfa...
[ERROR]   TestPerRegionIndexWriteCache.setUp:109 » IncompatibleClassChange Found interfa...
[ERROR]   TestPerRegionIndexWriteCache.setUp:109 » IncompatibleClassChange Found interfa...
[INFO]
[ERROR] Tests run: 1909, Failures: 0, Errors: 3, Skipped: 6
[INFO]
[INFO]
------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Phoenix 5.2.0-SNAPSHOT:
[INFO]
[INFO] Apache Phoenix ..................................... SUCCESS [  2.034 s]
[INFO] Phoenix Hbase 2.4.1 compatibility .................. SUCCESS [  5.358 s]
[INFO] Phoenix Hbase 2.4.0 compatibility .................. SUCCESS [  3.946 s]
[INFO] Phoenix Hbase 2.3.0 compatibility .................. SUCCESS [  4.437 s]
[INFO] Phoenix Hbase 2.2.5 compatibility .................. SUCCESS [  4.004 s]
[INFO] Phoenix Hbase 2.1.6 compatibility .................. SUCCESS [  3.966 s]
[INFO] Phoenix Core ....................................... FAILURE [01:25 min]
[INFO] Phoenix - Pherf .................................... SKIPPED
...

Any idea why this breaks?

It does not look ARM64-specific to me. I will try on x64 too.

Thanks!

Mark

Re: Test failure on Linux ARM64

Posted by Istvan Toth <st...@cloudera.com.INVALID>.
In the meantime, you can try to download the HBase 2.x sources and apply
that patch to them before rebuilding.
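
A rough sketch of what that would look like (untested; the commit hash is the
one from the link quoted below, it may not apply cleanly to branch-2.4, and
BUILDING.md in the HBase repo has the exact build flags):

git clone https://github.com/apache/hbase.git && cd hbase
git checkout branch-2.4
# bring in the ARM64 protoc handling from master
git cherry-pick 5480493f5f7b01b496f54215334543f2a82c6ba7
mvn clean install -Dhadoop.profile=3.0 -DskipTests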

On Thu, Nov 18, 2021 at 9:38 AM Istvan Toth <st...@cloudera.com> wrote:

> Phoenix probably won't work with HBase 3.x, but looking at the linked
> commit, it should be fairly straightforward to apply that to HBase 2.4.
>
> I'm not sure why that hasn't been backported to 2.x, perhaps HBase doesn't
> have test infra set up for ARM.
> You may want to discuss backporting that change to the active Hbase 2.x
> branches with the HBase project.
>
>
> On Wed, Nov 17, 2021 at 12:35 PM Mark Jens <ma...@gmail.com> wrote:
>
>> Hi Istvan,
>>
>> It seems HBase support for ARM64 is available only in 3.x:
>>
>> https://github.com/apache/hbase/commit/5480493f5f7b01b496f54215334543f2a82c6ba7
>> Would Phoenix work with HBase 3.x ?
>>
>> On Wed, 17 Nov 2021 at 13:28, Mark Jens <ma...@gmail.com> wrote:
>>
>> > Thanks for the hint!
>> >
>> > Unfortunately HBase-2.4.8 build fails with:
>> >
>> > [INFO] BUILD FAILURE
>> > [INFO]
>> > ------------------------------------------------------------------------
>> > [INFO] Total time:  01:42 min
>> > [INFO] Finished at: 2021-11-17T11:23:19Z
>> > [INFO]
>> > ------------------------------------------------------------------------
>> > [ERROR] Failed to execute goal
>> > org.xolstice.maven.plugins:protobuf-maven-plugin:0.6.1:compile
>> > (compile-protoc) on project hbase-protocol: Unable to resolve artifact:
>> > Missing:
>> > [ERROR] ----------
>> > [ERROR] 1) com.google.protobuf:protoc:exe:linux-aarch_64:2.5.0
>> > [ERROR]
>> > [ERROR]   Try downloading the file manually from the project website.
>> > [ERROR]
>> > [ERROR]   Then, install it using the command:
>> > [ERROR]       mvn install:install-file -DgroupId=com.google.protobuf
>> > -DartifactId=protoc -Dversion=2.5.0 -Dclassifier=linux-aarch_64
>> > -Dpackaging=exe -Dfile=/path/to/file
>> > [ERROR]
>> > [ERROR]   Alternatively, if you host your own repository you can deploy
>> > the file there:
>> > [ERROR]       mvn deploy:deploy-file -DgroupId=com.google.protobuf
>> > -DartifactId=protoc -Dversion=2.5.0 -Dclassifier=linux-aarch_64
>> > -Dpackaging=exe -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
>> > [ERROR]
>> > [ERROR]   Path to dependency:
>> > [ERROR]   1) org.apache.hbase:hbase-protocol:jar:2.4.8
>> > [ERROR]   2) com.google.protobuf:protoc:exe:linux-aarch_64:2.5.0
>> >
>> >
>> >
>> >
>> > On Wed, 17 Nov 2021 at 12:27, Istvan Toth <st...@cloudera.com.invalid>
>> > wrote:
>> >
>> >> You need to recompile HBase.
>> >> See BUILDING.md
>> >>
>> >> > On Wed, Nov 17, 2021 at 10:33 AM Mark Jens <ma...@gmail.com> wrote:
>> >>
>> >> > Hello,
>> >> >
>> >> > I am trying to build Phoenix on Ubuntu 20.04.3 ARM64.
>> >> >
>> >> > Any idea why this breaks ?
>> >> >
>> >> > It does not look ARM64 specific to me. I will try on x64 too.
>> >> >
>> >> > Thanks!
>> >> >
>> >> > Mark
>> >> >


-- 
*István Tóth* | Staff Software Engineer
stoty@cloudera.com

Re: Test failure on Linux ARM64

Posted by Istvan Toth <st...@cloudera.com.INVALID>.
Hi!
About 2.4.0: that method is definitely there in 2.4.0; it was moved in
2.4.1, which is why we need a separate 2.4.0 compatibility module.
Try to remove all hbase 2.4.0 artifacts from your local maven cache, so
that fresh ones are downloaded.
Or just ignore that, as disabling that module has no effect on your 2.4.9
build.
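
For the cache cleanup, a one-liner like this should do it (assuming the
default local repository under ~/.m2):

# wipe every org.apache.hbase 2.4.0 artifact so Maven re-downloads fresh copies
rm -rf ~/.m2/repository/org/apache/hbase/*/2.4.0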

The failed test is a known flaky test.
The JVM crash is either a JVM bug or, more likely, some kind of resource
exhaustion issue, such as the forked JVM getting killed by the Linux OOM killer.
Try increasing RAM/swap or reducing the numForkIT parameter.
I've had JVM crashes while building on an Intel Mac with the latest
8u312-ish Java 8 releases from Homebrew and had to revert to 8u272, but
those were during compiling the protobuf stuff.
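
Something along these lines, for example (the fork-count property name is
from memory, so double-check it in the Phoenix root pom):

# see whether the OOM killer took the forked JVM down
sudo dmesg -T | grep -i -E 'killed process|out of memory'
# add 8 GB of swap
sudo fallocate -l 8G /swapfile && sudo chmod 600 /swapfile
sudo mkswap /swapfile && sudo swapon /swapfile
# or rerun the ITs with fewer parallel forks, keeping your other flags
mvn verify -Dhbase.profile=2.4 -Dhbase.version=2.4.9-SNAPSHOT -DnumForkedIT=2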

Istvan

On Tue, Nov 23, 2021 at 12:55 PM Mark Jens <ma...@gmail.com> wrote:

> Hi Istvan,
>
> I have some progress here!
>
> I was able to overcome the above problem by adding -Dhbase24.compat.version
> and  -pl "!phoenix-hbase-compat-2.4.0"
>
> mvn clean install -Dhbase.profile=2.4 -Dhbase.version=2.4.9-SNAPSHOT
> -Dhbase24.compat.version=2.4.9-SNAPSHOT -DskipTests -pl
> "!phoenix-hbase-compat-2.4.0"
>
> Without explicitly setting -Dhbase24.compat.version it was using 2.4.1 and
> I think this was the reason why it was still using HdfsFileStatus
> from Hadoop 2.x.
>
> Without -pl "!phoenix-hbase-compat-2.4.0" I've got this compilation error:
>
> ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-compiler-plugin:3.8.1:compile
> (default-compile) on project phoenix-hbase-compat-2.4.0: Compilation
> failure: Compilation failure:
> [ERROR]
>
> /home/ubuntu/git/apache/phoenix/phoenix-hbase-compat-2.4.0/src/main/java/org/apache/phoenix/compat/hbase/CompatUtil.java:[72,41]
> cannot find symbol
> [ERROR]   symbol:   method
> getBytesPerChecksum(org.apache.hadoop.conf.Configuration)
> [ERROR]   location: class org.apache.hadoop.hbase.regionserver.HStore
> [ERROR]
>
> /home/ubuntu/git/apache/phoenix/phoenix-hbase-compat-2.4.0/src/main/java/org/apache/phoenix/compat/hbase/CompatUtil.java:[71,37]
> cannot find symbol
> [ERROR]   symbol:   method
> getChecksumType(org.apache.hadoop.conf.Configuration)
> [ERROR]   location: class org.apache.hadoop.hbase.regionserver.HStore
> [ERROR]
>
> /home/ubuntu/git/apache/phoenix/phoenix-hbase-compat-2.4.0/src/main/java/org/apache/phoenix/compat/hbase/CompatUtil.java:[136,22]
> cannot find symbol
> [ERROR]   symbol:   method
> getChecksumType(org.apache.hadoop.conf.Configuration)
> [ERROR]   location: class org.apache.hadoop.hbase.regionserver.HStore
> [ERROR]
>
> /home/ubuntu/git/apache/phoenix/phoenix-hbase-compat-2.4.0/src/main/java/org/apache/phoenix/compat/hbase/CompatUtil.java:[140,22]
> cannot find symbol
> [ERROR]   symbol:   method
> getBytesPerChecksum(org.apache.hadoop.conf.Configuration)
> [ERROR]   location: class org.apache.hadoop.hbase.regionserver.HStore
> [ERROR] -> [Help 1]
>
> Then I ran
> mvn verify -Dhbase.profile=2.4 -Dhbase.version=2.4.9-SNAPSHOT
> -Dhbase24.compat.version=2.4.9-SNAPSHOT -pl "!phoenix-hbase-compat-2.4.0"
>
> After 2 hours I've got one test failure:
>
> [ERROR] Errors:
> [ERROR]
>
> PermissionNSEnabledIT>BasePermissionsIT.testSystemTablePermissions:1027->BasePermissionsIT.verifyAllowed:894->BasePermissionsIT.verifyAllowed:901
> » UndeclaredThrowable
> [INFO]
> [ERROR] Tests run: 1550, Failures: 0, Errors: 1, Skipped: 60
>
> and one IT test crashed the JVM:
>
> [ERROR] Process Exit Code: 137
> [ERROR] Crashed tests:
> [ERROR] org.apache.phoenix.end2end.index.LocalImmutableNonTxIndexIT
>
> HBase project has merged my PR to branch-2 and branch-2.4. That's why I use
> -2.4.9-SNAPSHOT now.
>
>
> On Mon, 22 Nov 2021 at 07:59, Istvan Toth <st...@apache.org> wrote:
>
> > The commands look good.
> >
> > However, for some reason the Phoenix build command doesn't pick up the
> > rebuilt artifacts.
> > You need to run your maven processes with debug enabled (-X), and check
> > your local maven repo to see what goes wrong.
> >
> > On Fri, Nov 19, 2021 at 12:22 PM Mark Jens <ma...@gmail.com> wrote:
> >
> > > Hi Istvan,
> > >
> > > I've patched HBase 2.4.8 with the same changes as in
> > > https://github.com/apache/hbase/pull/3860.
> > > hbase-2.4.8$ mvn install -Dhadoop.profile=3.0 -DskipTests  passed
> > > successfully!
> > >
> > > Following BUILDING.md I installed Phoenix with: mvn clean install
> > > -Dhbase.profile=2.4 -Dhbase.version=2.4.8 -DskipTests
> > > And finally tried to test it with: mvn verify -Dhbase.profile=2.4
> > > -Dhbase.version=2.4.8 but again it failed with
> > >
> > > java.lang.IncompatibleClassChangeError: Found interface
> > > org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was expected
> > > ...
> > >
> > > What I am doing wrong ?
> > >
> > > Thanks!
> > >
> > > Mark
> > >
> > >
> > >
> > > On Thu, 18 Nov 2021 at 11:16, Mark Jens <ma...@gmail.com> wrote:
> > >
> > > > Hi,
> > > >
> > > > On Thu, 18 Nov 2021 at 10:38, Istvan Toth <stoty@cloudera.com.invalid
> >
> > > > wrote:
> > > >
> > > >> Phoenix probably won't work with HBase 3.x, but looking at the
> linked
> > > >> commit, it should be fairly straightforward to apply that to HBase
> > 2.4.
> > > >>
> > > >> I'm not sure why that hasn't been backported to 2.x, perhaps HBase
> > > doesn't
> > > >> have test infra set up for ARM.
> > > >> You may want to discuss backporting that change to the active Hbase
> > 2.x
> > > >> branches with the HBase project.
> > > >>
> > > >
> > > > I've asked for a backport at
> > > >
> > >
> >
> https://issues.apache.org/jira/browse/HBASE-23612?focusedCommentId=17445761&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17445761
> > > >
> > > > Apologies for creating
> > > https://issues.apache.org/jira/browse/PHOENIX-6595.
> > > > Initially I thought that Phoenix can apply the patch since it already
> > > > requires manual build of HBase.
> > > > But you are right - it would be much better if the improvement is
> > > > backported to HBase 2.x !
> > > >
> > > >
> > > >>
> > > >>
> > > >> On Wed, Nov 17, 2021 at 12:35 PM Mark Jens <ma...@gmail.com>
> > > wrote:
> > > >>
> > > >> > Hi Istvan,
> > > >> >
> > > >> > It seems HBase support for ARM64 is available only in 3.x:
> > > >> >
> > > >> >
> > > >>
> > >
> >
> https://github.com/apache/hbase/commit/5480493f5f7b01b496f54215334543f2a82c6ba7
> > > >> > Would Phoenix work with HBase 3.x ?
> > > >> >
> > > >> > On Wed, 17 Nov 2021 at 13:28, Mark Jens <ma...@gmail.com>
> > > wrote:
> > > >> >
> > > >> > > Thanks for the hint!
> > > >> > >
> > > >> > > Unfortunately HBase-2.4.8 build fails with:
> > > >> > >
> > > >> > > INFO] BUILD FAILURE
> > > >> > > [INFO]
> > > >> > >
> > > >>
> > ------------------------------------------------------------------------
> > > >> > > [INFO] Total time:  01:42 min
> > > >> > > [INFO] Finished at: 2021-11-17T11:23:19Z
> > > >> > > [INFO]
> > > >> > >
> > > >>
> > ------------------------------------------------------------------------
> > > >> > > [ERROR] Failed to execute goal
> > > >> > > org.xolstice.maven.plugins:protobuf-maven-plugin:0.6.1:compile
> > > >> > > (compile-protoc) on project hbase-protocol: Unable to resolve
> > > >> artifact:
> > > >> > > Missing:
> > > >> > > [ERROR] ----------
> > > >> > > [ERROR] 1) com.google.protobuf:protoc:exe:linux-aarch_64:2.5.0
> > > >> > > [ERROR]
> > > >> > > [ERROR]   Try downloading the file manually from the project
> > > website.
> > > >> > > [ERROR]
> > > >> > > [ERROR]   Then, install it using the command:
> > > >> > > [ERROR]       mvn install:install-file
> > -DgroupId=com.google.protobuf
> > > >> > > -DartifactId=protoc -Dversion=2.5.0 -Dclassifier=linux-aarch_64
> > > >> > > -Dpackaging=exe -Dfile=/path/to/file
> > > >> > > [ERROR]
> > > >> > > [ERROR]   Alternatively, if you host your own repository you can
> > > >> deploy
> > > >> > > the file there:
> > > >> > > [ERROR]       mvn deploy:deploy-file
> -DgroupId=com.google.protobuf
> > > >> > > -DartifactId=protoc -Dversion=2.5.0 -Dclassifier=linux-aarch_64
> > > >> > > -Dpackaging=exe -Dfile=/path/to/file -Durl=[url]
> > -DrepositoryId=[id]
> > > >> > > [ERROR]
> > > >> > > [ERROR]   Path to dependency:
> > > >> > > [ERROR]   1) org.apache.hbase:hbase-protocol:jar:2.4.8
> > > >> > > [ERROR]   2) com.google.protobuf:protoc:exe:linux-aarch_64:2.5.0
> > > >> > >
> > > >> > >
> > > >> > >
> > > >> > >
> > > >> > > On Wed, 17 Nov 2021 at 12:27, Istvan Toth
> > > <stoty@cloudera.com.invalid
> > > >> >
> > > >> > > wrote:
> > > >> > >
> > > >> > >> You need to recompile HBase.
> > > >> > >> See BULIDING.md
> > > >> > >>
> > > >> > >> On Wed, Nov 17, 2021 at 10:33 AM Mark Jens <
> > mark.r.jens@gmail.com>
> > > >> > wrote:
> > > >> > >>
> > > >> > >> > Hello,
> > > >> > >> >
> > > >> > >> > I am trying to build Phoenix on Ubuntu 20.04.3 ARM64.
> > > >> > >> >
> > > >> > >> > Phoenix Core module fails with:
> > > >> > >> >
> > > >> > >> > [ERROR]
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache.testMultipleAddsForSingleRegion
> > > >> > >> >  Time elapsed: 0.025 s  <<< ERROR!
> > > >> > >> > java.lang.IncompatibleClassChangeError: Found interface
> > > >> > >> > org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was
> > > >> expected
> > > >> > >> > at
> > > >> > >> > org.apache.hadoop.hbase.io
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> .asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:536)
> > > >> > >> > at
> > > >> > >> > org.apache.hadoop.hbase.io
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> .asyncfs.FanOutOneBlockAsyncDFSOutputHelper.access$400(FanOutOneBlockAsyncDFSOutputHelper.java:112)
> > > >> > >> > at
> > > >> > >> > org.apache.hadoop.hbase.io
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> .asyncfs.FanOutOneBlockAsyncDFSOutputHelper$8.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:616)
> > > >> > >> > at
> > > >> > >> > org.apache.hadoop.hbase.io
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> .asyncfs.FanOutOneBlockAsyncDFSOutputHelper$8.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:611)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> > > >> > >> > at
> > > >> > >> > org.apache.hadoop.hbase.io
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> .asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:624)
> > > >> > >> > at
> > > >> > >> > org.apache.hadoop.hbase.io
> > > >> > >> >
> > > >>
> .asyncfs.AsyncFSOutputHelper.createOutput(AsyncFSOutputHelper.java:53)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.hbase.regionserver.wal.AsyncProtobufLogWriter.initOutput(AsyncProtobufLogWriter.java:180)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.hbase.regionserver.wal.AbstractProtobufLogWriter.init(AbstractProtobufLogWriter.java:166)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createAsyncWriter(AsyncFSWALProvider.java:113)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:669)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:130)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:841)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:548)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.init(AbstractFSWAL.java:489)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:160)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:62)
> > > >> > >> > at
> > > >> org.apache.hadoop.hbase.wal.WALFactory.getWAL(WALFactory.java:296)
> > > >> > >> > at
> > > >> > >> >
> > > >> > >> >
> > > >> > >>
> > > >> >
> > > >>
> > >
> >
> org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache.setUp(TestPerRegionIndexWriteCache.java:109)
> > > >> > >> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
> > > >> > >> > ....
> > > >> > >> > [INFO]
> > > >> > >> > [INFO] Results:
> > > >> > >> > [INFO]
> > > >> > >> > [ERROR] Errors:
> > > >> > >> > [ERROR]   TestPerRegionIndexWriteCache.setUp:109 »
> > > >> > >> IncompatibleClassChange
> > > >> > >> > Found interfa...
> > > >> > >> > [ERROR]   TestPerRegionIndexWriteCache.setUp:109 »
> > > >> > >> IncompatibleClassChange
> > > >> > >> > Found interfa...
> > > >> > >> > [ERROR]   TestPerRegionIndexWriteCache.setUp:109 »
> > > >> > >> IncompatibleClassChange
> > > >> > >> > Found interfa...
> > > >> > >> > [INFO]
> > > >> > >> > [ERROR] Tests run: 1909, Failures: 0, Errors: 3, Skipped: 6
> > > >> > >> > [INFO]
> > > >> > >> > [INFO]
> > > >> > >> >
> > > >> >
> > >
> ------------------------------------------------------------------------
> > > >> > >> > [INFO] Reactor Summary for Apache Phoenix 5.2.0-SNAPSHOT:
> > > >> > >> > [INFO]
> > > >> > >> > [INFO] Apache Phoenix .....................................
> > > >> SUCCESS [
> > > >> > >> >  2.034 s]
> > > >> > >> > [INFO] Phoenix Hbase 2.4.1 compatibility ..................
> > > >> SUCCESS [
> > > >> > >> >  5.358 s]
> > > >> > >> > [INFO] Phoenix Hbase 2.4.0 compatibility ..................
> > > >> SUCCESS [
> > > >> > >> >  3.946 s]
> > > >> > >> > [INFO] Phoenix Hbase 2.3.0 compatibility ..................
> > > >> SUCCESS [
> > > >> > >> >  4.437 s]
> > > >> > >> > [INFO] Phoenix Hbase 2.2.5 compatibility ..................
> > > >> SUCCESS [
> > > >> > >> >  4.004 s]
> > > >> > >> > [INFO] Phoenix Hbase 2.1.6 compatibility ..................
> > > >> SUCCESS [
> > > >> > >> >  3.966 s]
> > > >> > >> > [INFO] Phoenix Core .......................................
> > > FAILURE
> > > >> > >> [01:25
> > > >> > >> > min]
> > > >> > >> > [INFO] Phoenix - Pherf ....................................
> > > SKIPPED
> > > >> > >> > ...
> > > >> > >> >
> > > >> > >> > Any idea why this breaks ?
> > > >> > >> >
> > > >> > >> > It does not look ARM64 specific to me. I will try on x64 too.
> > > >> > >> >
> > > >> > >> > Thanks!
> > > >> > >> >
> > > >> > >> > Mark
> > > >> > >> >
> > > >> > >>
> > > >> > >>
> > > >> > >> --
> > > >> > >> *István Tóth* | Staff Software Engineer
> > > >> > >> stoty@cloudera.com <https://www.cloudera.com>
> > > >> > >> [image: Cloudera] <https://www.cloudera.com/>
> > > >> > >> [image: Cloudera on Twitter] <https://twitter.com/cloudera>
> > > [image:
> > > >> > >> Cloudera on Facebook] <https://www.facebook.com/cloudera>
> > [image:
> > > >> > >> Cloudera
> > > >> > >> on LinkedIn] <https://www.linkedin.com/company/cloudera>
> > > >> > >> <https://www.cloudera.com/>
> > > >> > >> ------------------------------
> > > >> > >>
> > > >> > >
> > > >> >
> > > >>
> > > >>
> > > >> --
> > > >> *István Tóth* | Staff Software Engineer
> > > >> stoty@cloudera.com <https://www.cloudera.com>
> > > >> [image: Cloudera] <https://www.cloudera.com/>
> > > >> [image: Cloudera on Twitter] <https://twitter.com/cloudera> [image:
> > > >> Cloudera on Facebook] <https://www.facebook.com/cloudera> [image:
> > > >> Cloudera
> > > >> on LinkedIn] <https://www.linkedin.com/company/cloudera>
> > > >> <https://www.cloudera.com/>
> > > >> ------------------------------
> > > >>
> > > >
> > >
> >
>


-- 
*István Tóth* | Staff Software Engineer
stoty@cloudera.com

Re: Test failure on Linux ARM64

Posted by Mark Jens <ma...@gmail.com>.
Hi Istvan,

I have some progress here!

I was able to overcome the above problem by adding -Dhbase24.compat.version
and  -pl "!phoenix-hbase-compat-2.4.0"

mvn clean install -Dhbase.profile=2.4 -Dhbase.version=2.4.9-SNAPSHOT
-Dhbase24.compat.version=2.4.9-SNAPSHOT -DskipTests -pl
"!phoenix-hbase-compat-2.4.0"

Without explicitly setting -Dhbase24.compat.version it was using 2.4.1, and
I think this was the reason why it was still using the HdfsFileStatus
from Hadoop 2.x.
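
A quick way to double-check which compat version the build actually resolves
(assuming a maven-help-plugin new enough for -DforceStdout):

mvn help:evaluate -Dhbase.profile=2.4 -Dhbase.version=2.4.9-SNAPSHOT -Dexpression=hbase24.compat.version -q -DforceStdout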

Without -pl "!phoenix-hbase-compat-2.4.0" I've got this compilation error:

[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-compiler-plugin:3.8.1:compile
(default-compile) on project phoenix-hbase-compat-2.4.0: Compilation
failure: Compilation failure:
[ERROR]
/home/ubuntu/git/apache/phoenix/phoenix-hbase-compat-2.4.0/src/main/java/org/apache/phoenix/compat/hbase/CompatUtil.java:[72,41]
cannot find symbol
[ERROR]   symbol:   method
getBytesPerChecksum(org.apache.hadoop.conf.Configuration)
[ERROR]   location: class org.apache.hadoop.hbase.regionserver.HStore
[ERROR]
/home/ubuntu/git/apache/phoenix/phoenix-hbase-compat-2.4.0/src/main/java/org/apache/phoenix/compat/hbase/CompatUtil.java:[71,37]
cannot find symbol
[ERROR]   symbol:   method
getChecksumType(org.apache.hadoop.conf.Configuration)
[ERROR]   location: class org.apache.hadoop.hbase.regionserver.HStore
[ERROR]
/home/ubuntu/git/apache/phoenix/phoenix-hbase-compat-2.4.0/src/main/java/org/apache/phoenix/compat/hbase/CompatUtil.java:[136,22]
cannot find symbol
[ERROR]   symbol:   method
getChecksumType(org.apache.hadoop.conf.Configuration)
[ERROR]   location: class org.apache.hadoop.hbase.regionserver.HStore
[ERROR]
/home/ubuntu/git/apache/phoenix/phoenix-hbase-compat-2.4.0/src/main/java/org/apache/phoenix/compat/hbase/CompatUtil.java:[140,22]
cannot find symbol
[ERROR]   symbol:   method
getBytesPerChecksum(org.apache.hadoop.conf.Configuration)
[ERROR]   location: class org.apache.hadoop.hbase.regionserver.HStore
[ERROR] -> [Help 1]

Then I ran
mvn verify -Dhbase.profile=2.4 -Dhbase.version=2.4.9-SNAPSHOT
-Dhbase24.compat.version=2.4.9-SNAPSHOT -pl "!phoenix-hbase-compat-2.4.0"

After 2 hours I've got one test failure:

[ERROR] Errors:
[ERROR]
PermissionNSEnabledIT>BasePermissionsIT.testSystemTablePermissions:1027->BasePermissionsIT.verifyAllowed:894->BasePermissionsIT.verifyAllowed:901
» UndeclaredThrowable
[INFO]
[ERROR] Tests run: 1550, Failures: 0, Errors: 1, Skipped: 60

and one IT test crashed the JVM:

[ERROR] org.apache.maven.surefire.booter.SurefireBooterForkException:
ExecutionException The forked VM terminated without properly saying
goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd
/home/ubuntu/git/apache/phoenix/phoenix-core &&
/usr/lib/jvm/java-8-openjdk-arm64/jre/bin/java
-javaagent:/home/ubuntu/.m2/repository/org/jacoco/org.jacoco.agent/0.8.7/org.jacoco.agent-0.8.7-runtime.jar=destfile=/home/ubuntu/git/apache/phoenix/phoenix-core/target/jacoco.exec
-Xmx3000m -Djava.security.egd=file:/dev/./urandom
'-Djava.library.path=${hadoop.library.path}:/usr/java/packages/lib/aarch64:/usr/lib/aarch64-linux-gnu/jni:/lib/aarch64-linux-gnu:/usr/lib/aarch64-linux-gnu:/usr/lib/jni:/lib:/usr/lib'
-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=./target/ -XX:NewRatio=4
-XX:SurvivorRatio=8 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC
-XX:+DisableExplicitGC -XX:+UseCMSInitiatingOccupancyOnly
-XX:+CMSClassUnloadingEnabled -XX:+CMSScavengeBeforeRemark
-XX:CMSInitiatingOccupancyFraction=68 -jar
/home/ubuntu/git/apache/phoenix/phoenix-core/target/surefire/surefirebooter3776996098139745421.jar
/home/ubuntu/git/apache/phoenix/phoenix-core/target/surefire
2021-11-23T09-45-47_283-jvmRun7 surefire8003544437266408760tmp
surefire_135681795925087748715tmp
[ERROR] Error occurred in starting fork, check output in log
[ERROR] Process Exit Code: 137
[ERROR] ExecutionException The forked VM terminated without properly saying
goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd
/home/ubuntu/git/apache/phoenix/phoenix-core &&
/usr/lib/jvm/java-8-openjdk-arm64/jre/bin/java
-javaagent:/home/ubuntu/.m2/repository/org/jacoco/org.jacoco.agent/0.8.7/org.jacoco.agent-0.8.7-runtime.jar=destfile=/home/ubuntu/git/apache/phoenix/phoenix-core/target/jacoco.exec
-Xmx3000m -Djava.security.egd=file:/dev/./urandom
'-Djava.library.path=${hadoop.library.path}:/usr/java/packages/lib/aarch64:/usr/lib/aarch64-linux-gnu/jni:/lib/aarch64-linux-gnu:/usr/lib/aarch64-linux-gnu:/usr/lib/jni:/lib:/usr/lib'
-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=./target/ -XX:NewRatio=4
-XX:SurvivorRatio=8 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC
-XX:+DisableExplicitGC -XX:+UseCMSInitiatingOccupancyOnly
-XX:+CMSClassUnloadingEnabled -XX:+CMSScavengeBeforeRemark
-XX:CMSInitiatingOccupancyFraction=68 -jar
/home/ubuntu/git/apache/phoenix/phoenix-core/target/surefire/surefirebooter8513242350930097363.jar
/home/ubuntu/git/apache/phoenix/phoenix-core/target/surefire
2021-11-23T09-45-47_283-jvmRun5 surefire3478876209177704794tmp
surefire_83451609179705425541tmp
[ERROR] Error occurred in starting fork, check output in log
[ERROR] Process Exit Code: 137
[ERROR] Crashed tests:
[ERROR] org.apache.phoenix.end2end.index.LocalImmutableNonTxIndexIT
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.awaitResultsDone(ForkStarter.java:532)
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.runSuitesForkOnceMultiple(ForkStarter.java:405)
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:321)
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:266)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeProvider(AbstractSurefireMojo.java:1314)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked(AbstractSurefireMojo.java:1159)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute(AbstractSurefireMojo.java:932)
[ERROR] at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:210)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:156)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:148)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
[ERROR] at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
[ERROR] at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
[ERROR] at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
[ERROR] at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
[ERROR] at org.apache.maven.cli.MavenCli.execute(MavenCli.java:972)
[ERROR] at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:293)
[ERROR] at org.apache.maven.cli.MavenCli.main(MavenCli.java:196)
[ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[ERROR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR] at java.lang.reflect.Method.invoke(Method.java:498)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:282)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:225)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:406)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:347)
[ERROR] Caused by:
org.apache.maven.surefire.booter.SurefireBooterForkException: The forked VM
terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd
/home/ubuntu/git/apache/phoenix/phoenix-core &&
/usr/lib/jvm/java-8-openjdk-arm64/jre/bin/java
-javaagent:/home/ubuntu/.m2/repository/org/jacoco/org.jacoco.agent/0.8.7/org.jacoco.agent-0.8.7-runtime.jar=destfile=/home/ubuntu/git/apache/phoenix/phoenix-core/target/jacoco.exec
-Xmx3000m -Djava.security.egd=file:/dev/./urandom
'-Djava.library.path=${hadoop.library.path}:/usr/java/packages/lib/aarch64:/usr/lib/aarch64-linux-gnu/jni:/lib/aarch64-linux-gnu:/usr/lib/aarch64-linux-gnu:/usr/lib/jni:/lib:/usr/lib'
-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=./target/ -XX:NewRatio=4
-XX:SurvivorRatio=8 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC
-XX:+DisableExplicitGC -XX:+UseCMSInitiatingOccupancyOnly
-XX:+CMSClassUnloadingEnabled -XX:+CMSScavengeBeforeRemark
-XX:CMSInitiatingOccupancyFraction=68 -jar
/home/ubuntu/git/apache/phoenix/phoenix-core/target/surefire/surefirebooter8513242350930097363.jar
/home/ubuntu/git/apache/phoenix/phoenix-core/target/surefire
2021-11-23T09-45-47_283-jvmRun5 surefire3478876209177704794tmp
surefire_83451609179705425541tmp
[ERROR] Error occurred in starting fork, check output in log

The HBase project has merged my PR to branch-2 and branch-2.4. That's why I
use 2.4.9-SNAPSHOT now.
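
For reference, the 2.4.9-SNAPSHOT artifacts come from building that branch
locally, roughly like this (same Hadoop 3 profile as before):

git clone https://github.com/apache/hbase.git && cd hbase
git checkout branch-2.4
mvn clean install -Dhadoop.profile=3.0 -DskipTests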


> > >> > >> > [INFO] Phoenix Hbase 2.4.1 compatibility ..................
> > >> SUCCESS [
> > >> > >> >  5.358 s]
> > >> > >> > [INFO] Phoenix Hbase 2.4.0 compatibility ..................
> > >> SUCCESS [
> > >> > >> >  3.946 s]
> > >> > >> > [INFO] Phoenix Hbase 2.3.0 compatibility ..................
> > >> SUCCESS [
> > >> > >> >  4.437 s]
> > >> > >> > [INFO] Phoenix Hbase 2.2.5 compatibility ..................
> > >> SUCCESS [
> > >> > >> >  4.004 s]
> > >> > >> > [INFO] Phoenix Hbase 2.1.6 compatibility ..................
> > >> SUCCESS [
> > >> > >> >  3.966 s]
> > >> > >> > [INFO] Phoenix Core .......................................
> > FAILURE
> > >> > >> [01:25
> > >> > >> > min]
> > >> > >> > [INFO] Phoenix - Pherf ....................................
> > SKIPPED
> > >> > >> > ...
> > >> > >> >
> > >> > >> > Any idea why this breaks ?
> > >> > >> >
> > >> > >> > It does not look ARM64 specific to me. I will try on x64 too.
> > >> > >> >
> > >> > >> > Thanks!
> > >> > >> >
> > >> > >> > Mark
> > >> > >> >
> > >> > >>
> > >> > >>
> > >> > >> --
> > >> > >> *István Tóth* | Staff Software Engineer
> > >> > >> stoty@cloudera.com <https://www.cloudera.com>
> > >> > >> [image: Cloudera] <https://www.cloudera.com/>
> > >> > >> [image: Cloudera on Twitter] <https://twitter.com/cloudera>
> > [image:
> > >> > >> Cloudera on Facebook] <https://www.facebook.com/cloudera>
> [image:
> > >> > >> Cloudera
> > >> > >> on LinkedIn] <https://www.linkedin.com/company/cloudera>
> > >> > >> <https://www.cloudera.com/>
> > >> > >> ------------------------------
> > >> > >>
> > >> > >
> > >> >
> > >>
> > >>
> > >> --
> > >> *István Tóth* | Staff Software Engineer
> > >> stoty@cloudera.com <https://www.cloudera.com>
> > >> [image: Cloudera] <https://www.cloudera.com/>
> > >> [image: Cloudera on Twitter] <https://twitter.com/cloudera> [image:
> > >> Cloudera on Facebook] <https://www.facebook.com/cloudera> [image:
> > >> Cloudera
> > >> on LinkedIn] <https://www.linkedin.com/company/cloudera>
> > >> <https://www.cloudera.com/>
> > >> ------------------------------
> > >>
> > >
> >
>

Re: Test failure on Linux ARM64

Posted by Istvan Toth <st...@apache.org>.
The commands look good.

However, for some reason the Phoenix build doesn't pick up the
rebuilt artifacts.
You need to run your Maven builds with debug output enabled (-X) and check
your local Maven repository to see what goes wrong.
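
For example, something along these lines should show whether the locally
installed HBase 2.4.8 jars are actually the ones being resolved (a minimal
sketch; it assumes the default ~/.m2/repository location, and the
phoenix-build.log file name is only an example):

mvn verify -Dhbase.profile=2.4 -Dhbase.version=2.4.8 -X > phoenix-build.log 2>&1
grep Downloading phoenix-build.log | grep hbase
ls -l ~/.m2/repository/org/apache/hbase/hbase-server/2.4.8/

If the jars in that directory are older than your patched build, or Maven
re-downloads the hbase artifacts from a remote repository, then the rebuilt
artifacts are not the ones being used.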

On Fri, Nov 19, 2021 at 12:22 PM Mark Jens <ma...@gmail.com> wrote:

> Hi Istvan,
>
> I've patched HBase 2.4.8 with the same changes as in
> https://github.com/apache/hbase/pull/3860.
> hbase-2.4.8$ mvn install -Dhadoop.profile=3.0 -DskipTests  passed
> successfully!
>
> Following BUILDING.md I installed Phoenix with: mvn clean install
> -Dhbase.profile=2.4 -Dhbase.version=2.4.8 -DskipTests
> And finally tried to test it with: mvn verify -Dhbase.profile=2.4
> -Dhbase.version=2.4.8 but again it failed with
>
> java.lang.IncompatibleClassChangeError: Found interface
> org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was expected
> ...

Re: Test failure on Linux ARM64

Posted by Mark Jens <ma...@gmail.com>.
Hi Istvan,

I've patched HBase 2.4.8 with the same changes as in
https://github.com/apache/hbase/pull/3860.
In the hbase-2.4.8 directory, mvn install -Dhadoop.profile=3.0 -DskipTests
passed successfully!

Following BUILDING.md, I installed Phoenix with: mvn clean install
-Dhbase.profile=2.4 -Dhbase.version=2.4.8 -DskipTests
And finally I tried to test it with: mvn verify -Dhbase.profile=2.4
-Dhbase.version=2.4.8, but again it failed with:

java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was expected
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:536)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.access$400(FanOutOneBlockAsyncDFSOutputHelper.java:112)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$8.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:616)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$8.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:611)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:624)
at org.apache.hadoop.hbase.io.asyncfs.AsyncFSOutputHelper.createOutput(AsyncFSOutputHelper.java:53)
at org.apache.hadoop.hbase.regionserver.wal.AsyncProtobufLogWriter.initOutput(AsyncProtobufLogWriter.java:180)
at org.apache.hadoop.hbase.regionserver.wal.AbstractProtobufLogWriter.init(AbstractProtobufLogWriter.java:166)
at org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createAsyncWriter(AsyncFSWALProvider.java:113)
at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:669)
at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:130)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:841)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:548)
at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.init(AbstractFSWAL.java:489)
at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:160)
at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:62)
at org.apache.hadoop.hbase.wal.WALFactory.getWAL(WALFactory.java:296)
at org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache.setUp(TestPerRegionIndexWriteCache.java:109)
...

What am I doing wrong?

Thanks!

Mark




Re: Test failure on Linux ARM64

Posted by Mark Jens <ma...@gmail.com>.
Hi,

On Thu, 18 Nov 2021 at 10:38, Istvan Toth <st...@cloudera.com.invalid>
wrote:

> Phoenix probably won't work with HBase 3.x, but looking at the linked
> commit, it should be fairly straightforward to apply that to HBase 2.4.
>
> I'm not sure why that hasn't been backported to 2.x, perhaps HBase doesn't
> have test infra set up for ARM.
> You may want to discuss backporting that change to the active Hbase 2.x
> branches with the HBase project.
>

I've asked for a backport at
https://issues.apache.org/jira/browse/HBASE-23612?focusedCommentId=17445761&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17445761

Apologies for creating https://issues.apache.org/jira/browse/PHOENIX-6595.
Initially I thought that Phoenix could apply the patch, since it already
requires a manual build of HBase.
But you are right - it would be much better if the improvement were
backported to HBase 2.x!



Re: Test failure on Linux ARM64

Posted by Istvan Toth <st...@cloudera.com.INVALID>.
Phoenix probably won't work with HBase 3.x, but looking at the linked
commit, it should be fairly straightforward to apply that to HBase 2.4.

I'm not sure why that hasn't been backported to 2.x; perhaps HBase doesn't
have test infra set up for ARM.
You may want to discuss backporting that change to the active HBase 2.x
branches with the HBase project.
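
Concretely, that would mean cherry-picking the commit onto a 2.4 checkout and
rebuilding, roughly like this (a sketch only: the rel/2.4.8 tag and the branch
name are assumptions, the cherry-pick may need manual conflict resolution, and
-Dhadoop.profile=3.0 assumes you want the Hadoop 3 build):

git clone https://github.com/apache/hbase.git && cd hbase
git checkout -b 2.4.8-arm64 rel/2.4.8
git cherry-pick 5480493f5f7b01b496f54215334543f2a82c6ba7
mvn clean install -Dhadoop.profile=3.0 -DskipTests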


On Wed, Nov 17, 2021 at 12:35 PM Mark Jens <ma...@gmail.com> wrote:

> Hi Istvan,
>
> It seems HBase support for ARM64 is available only in 3.x:
>
> https://github.com/apache/hbase/commit/5480493f5f7b01b496f54215334543f2a82c6ba7
> Would Phoenix work with HBase 3.x ?

Re: Test failure on Linux ARM64

Posted by Mark Jens <ma...@gmail.com>.
Hi Istvan,

It seems HBase support for ARM64 is available only in 3.x:
https://github.com/apache/hbase/commit/5480493f5f7b01b496f54215334543f2a82c6ba7
Would Phoenix work with HBase 3.x?


Re: Test failure on Linux ARM64

Posted by Mark Jens <ma...@gmail.com>.
Thanks for the hint!

Unfortunately HBase-2.4.8 build fails with:

[INFO] BUILD FAILURE
[INFO]
------------------------------------------------------------------------
[INFO] Total time:  01:42 min
[INFO] Finished at: 2021-11-17T11:23:19Z
[INFO]
------------------------------------------------------------------------
[ERROR] Failed to execute goal
org.xolstice.maven.plugins:protobuf-maven-plugin:0.6.1:compile
(compile-protoc) on project hbase-protocol: Unable to resolve artifact:
Missing:
[ERROR] ----------
[ERROR] 1) com.google.protobuf:protoc:exe:linux-aarch_64:2.5.0
[ERROR]
[ERROR]   Try downloading the file manually from the project website.
[ERROR]
[ERROR]   Then, install it using the command:
[ERROR]       mvn install:install-file -DgroupId=com.google.protobuf
-DartifactId=protoc -Dversion=2.5.0 -Dclassifier=linux-aarch_64
-Dpackaging=exe -Dfile=/path/to/file
[ERROR]
[ERROR]   Alternatively, if you host your own repository you can deploy the
file there:
[ERROR]       mvn deploy:deploy-file -DgroupId=com.google.protobuf
-DartifactId=protoc -Dversion=2.5.0 -Dclassifier=linux-aarch_64
-Dpackaging=exe -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
[ERROR]
[ERROR]   Path to dependency:
[ERROR]   1) org.apache.hbase:hbase-protocol:jar:2.4.8
[ERROR]   2) com.google.protobuf:protoc:exe:linux-aarch_64:2.5.0
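
The error message itself points at the workaround: build protoc 2.5.0 for
aarch64 and install the binary under the missing coordinates, for example
(the /usr/local/bin/protoc path below is only a hypothetical placeholder):

mvn install:install-file -DgroupId=com.google.protobuf -DartifactId=protoc \
  -Dversion=2.5.0 -Dclassifier=linux-aarch_64 -Dpackaging=exe \
  -Dfile=/usr/local/bin/protoc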




On Wed, 17 Nov 2021 at 12:27, Istvan Toth <st...@cloudera.com.invalid>
wrote:

> You need to recompile HBase.
> See BUILDING.md
>
> On Wed, Nov 17, 2021 at 10:33 AM Mark Jens <ma...@gmail.com> wrote:
>
> > [...]
>
>
> --
> *István Tóth* | Staff Software Engineer
> stoty@cloudera.com <https://www.cloudera.com>
> ------------------------------
>

Re: Test failure on Linux ARM64

Posted by Istvan Toth <st...@cloudera.com.INVALID>.
You need to recompile HBase.
See BUILDING.md
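
The IncompatibleClassChangeError usually means that the HBase jars on the
classpath were compiled against Hadoop 2, where HdfsFileStatus was a class,
while the tests run against a Hadoop 3 release where it is an interface.
Rebuilding HBase against the Hadoop 3 profile and installing it into the
local repository should resolve it. A rough sketch, assuming an HBase source
checkout (please check the HBase build docs for the exact profile name):

    mvn clean install -DskipTests -Dhadoop.profile=3.0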

On Wed, Nov 17, 2021 at 10:33 AM Mark Jens <ma...@gmail.com> wrote:

> [...]


-- 
*István Tóth* | Staff Software Engineer
stoty@cloudera.com <https://www.cloudera.com>
------------------------------