Posted to dev@bigtop.apache.org by Rui Gao <rg...@yahoo-corp.jp> on 2016/05/16 11:19:38 UTC
Proposal of adding HDFS Erasure Coding test cases, and resource unpack problem with HDFS smoke test.
Hi all
Nice to meet you all. I am Gao, an engineer working in Japan. I have two issues about Bigtop that I want to share.
First, I would like to propose that we add HDFS Erasure Coding (HDFS-7285, HDFS-8031) test cases to the Bigtop HDFS smoke tests. This could be very helpful for users who want to test HDFS EC (Erasure Coding) functionality before they use Erasure Coding in production clusters. Are there other engineers interested in developing Erasure Coding test cases, so that we could gather a team and work together? I have written a basic demo test case, “TestECPolicy.groovy”, which I am sending in the attached patch.
Second, I tried to reuse the existing HDFS test cases by simply copying them and setting an EC policy on TESTDIR; for example, I copied “TestPut.groovy” and made some minor changes to create a new EC test case, “TestPutEC.groovy”. This test case failed due to resource-unpacking errors. I then ran the original “TestPut.groovy”, and somehow it failed for the same reason. Could you give me some tips on resolving this problem? Thank you very much.
Looking forward to your reply.
Gao.
The following is the detailed failure log for the second issue.
TestPut.groovy:
After building with “pom.xml” and the “clean package -Pnative,dist -Dtar -DskipTests” command, I ran the “install-hadoop” and “bigtop-tests:smoke-tests:hdfs:test -Psmoke.tests --info” Gradle tasks, and got the following error, raised at line 53 (JarContent.unpackJarContainer(TestPut.class, ".", null);).
Gradle Test Executor 2 started executing tests.
Gradle Test Executor 2 finished executing tests.
org.apache.bigtop.itest.hadoop.hdfs.TestPut > classMethod FAILED
java.io.IOException: Class TestPut doesn't belong to any jar file in the classpath
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:77)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:102)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:54)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:182)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:190)
at org.apache.bigtop.itest.JarContent.unpackJarContainer(JarContent.groovy:181)
at org.apache.bigtop.itest.JarContent$unpackJarContainer.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:124)
at org.apache.bigtop.itest.hadoop.hdfs.TestPut.setUp(TestPut.groovy:53)
1 test completed, 1 failed
TestPutEC.groovy:
After building with “pom.xml” and the “clean package -Pnative,dist -Dtar -DskipTests” command, I ran the “install-hadoop” and “bigtop-tests:smoke-tests:hdfs-ec:test -Psmoke.tests --info” Gradle tasks, and got the following error, raised at line 53 (JarContent.unpackJarContainer(TestPutEC.class, ".", null);). “TestECPolicy.groovy”, however, passed without any problem.
Gradle Test Executor 2 started executing tests.
Gradle Test Executor 2 finished executing tests.
org.apache.bigtop.itest.hadoop.hdfs.TestPutEC > classMethod FAILED
java.io.IOException: Class TestPutEC doesn't belong to any jar file in the classpath
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:77)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:102)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:54)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:182)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:190)
at org.apache.bigtop.itest.JarContent.unpackJarContainer(JarContent.groovy:181)
at org.apache.bigtop.itest.JarContent$unpackJarContainer.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:124)
at org.apache.bigtop.itest.hadoop.hdfs.TestPutEC.setUp(TestPutEC.groovy:53)
2 tests completed, 1 failed
Re: Proposal of adding HDFS Erasure Coding test cases, and resource unpack problem with HDFS smoke test.
Posted by Konstantin Boudnik <co...@apache.org>.
Hi Gao.
Welcome to the community and thanks for looking into the ways to contribute to
the project!
I'm sure it would be great to have some erasure coding tests in our set, so I'm looking forward to having them. As for your other question:
TestPut (and all the other tests introduced in BIGTOP-2009) actually duplicate existing CLI tests and will be removed. The reason they won't work in the smoke-test environment is this line:
JarContent.unpackJarContainer(TestPut.class, "." , null);
As you might know, we have two ways of running integration tests:
- with Maven, where the test artifacts (jar files) need to be 'maven installed' first. Some of the tests relied on the assumption that a jar file would be present in the classpath, in order to extract test classes and other needed resources.
- with the Gradle smoke tests, which execute tests right from the source code (which is very convenient). However, no jar file is present in this case, and the line above produces the exception you see. I'd suggest removing this line from your tests completely. If you need any resources for your test to work, just put them in a resource directory (see the hdfs smoke test build file for an example).
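To make the distinction concrete, here is a minimal, self-contained Java sketch (not Bigtop code; the ClasspathProbe class and its method names are made up for illustration). It shows the check an unpack-from-jar helper effectively performs (where was this class loaded from?), which fails when Gradle runs tests from compiled source, and the jar-agnostic alternative of asking the classloader for a resource directly:

```java
import java.io.InputStream;
import java.security.CodeSource;

// Illustrative sketch only: why unpacking "the jar this class came from"
// breaks when tests run straight from a classes directory, and why
// classloader resource lookup works in both cases.
public class ClasspathProbe {

    // Where was this class loaded from? A jar URL ends with ".jar";
    // under Gradle's run-from-source mode it is a classes directory.
    static String codeSourceOf(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        return (src == null || src.getLocation() == null)
                ? null : src.getLocation().toString();
    }

    // True only when the class came out of a jar file; false when it was
    // compiled to a directory -- the case that makes jar unpacking fail.
    static boolean loadedFromJar(Class<?> c) {
        String loc = codeSourceOf(c);
        return loc != null && loc.endsWith(".jar");
    }

    // Jar-agnostic alternative: ask the classloader for the resource.
    // Works whether the classpath entry is a jar or a classes directory.
    static boolean resourceVisible(String name) {
        try (InputStream in = ClassLoader.getSystemResourceAsStream(name)) {
            return in != null;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("code source: " + codeSourceOf(ClasspathProbe.class));
        System.out.println("loaded from jar: " + loadedFromJar(ClasspathProbe.class));
        System.out.println("resource visible: "
                + resourceVisible("java/lang/String.class"));
    }
}
```

This is why putting test data in a resource directory (and loading it via the classloader) is preferable: it works identically under both the Maven and the Gradle execution modes.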
Also, TestPut relies very heavily on shell-outs, which might not be the best way of writing smoke tests.
Looking forward to your patch (and JIRA)! Thanks
Cos
On Mon, May 16, 2016 at 11:19AM, Rui Gao wrote:
> [...]