Posted to dev@lucene.apache.org by Policeman Jenkins Server <je...@thetaphi.de> on 2015/08/29 07:12:04 UTC

[JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 7 - Still Failing!

Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Solaris/7/
Java: multiarch/jdk1.7.0 -d64 -XX:+UseCompressedOops -XX:+UseG1GC

1 tests failed.
FAILED:  org.apache.lucene.codecs.TestCodecLoadingDeadlock.testDeadlock

Error Message:
Cannot run program "/usr/jdk/instances/jdk1.7.0/jre/bin/java": error=12, Not enough space

Stack Trace:
java.io.IOException: Cannot run program "/usr/jdk/instances/jdk1.7.0/jre/bin/java": error=12, Not enough space
	at __randomizedtesting.SeedInfo.seed([A506913E9D3F33AD:A86D702A9B659E7B]:0)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
	at org.apache.lucene.codecs.TestCodecLoadingDeadlock.testDeadlock(TestCodecLoadingDeadlock.java:67)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$2.evaluate(ThreadLeakControl.java:401)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSuite(RandomizedRunner.java:651)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.access$200(RandomizedRunner.java:138)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$1.run(RandomizedRunner.java:568)
Caused by: java.io.IOException: error=12, Not enough space
	at java.lang.UNIXProcess.forkAndExec(Native Method)
	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
	... 22 more




Build Log:
[...truncated 797 lines...]
   [junit4] Suite: org.apache.lucene.codecs.TestCodecLoadingDeadlock
   [junit4] ERROR   3.07s J0 | TestCodecLoadingDeadlock.testDeadlock <<<
   [junit4]    > Throwable #1: java.io.IOException: Cannot run program "/usr/jdk/instances/jdk1.7.0/jre/bin/java": error=12, Not enough space
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([A506913E9D3F33AD:A86D702A9B659E7B]:0)
   [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
   [junit4]    > 	at org.apache.lucene.codecs.TestCodecLoadingDeadlock.testDeadlock(TestCodecLoadingDeadlock.java:67)
   [junit4]    > Caused by: java.io.IOException: error=12, Not enough space
   [junit4]    > 	at java.lang.UNIXProcess.forkAndExec(Native Method)
   [junit4]    > 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
   [junit4]    > 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
   [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
   [junit4]    > 	... 22 more
   [junit4] Completed [141/411] on J0 in 3.07s, 1 test, 1 error <<< FAILURES!

[...truncated 888 lines...]
BUILD FAILED
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:785: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:729: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:59: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/build.xml:50: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/common-build.xml:1452: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/common-build.xml:1006: There were test failures: 411 suites, 3324 tests, 1 error, 48 ignored (44 assumptions)

Total time: 6 minutes 22 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



RE: [JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 9 - Still Failing!

Posted by Uwe Schindler <uw...@thetaphi.de>.
Nice article about the problem (now solved in Java 8):
http://kirkwylie.blogspot.de/2008/09/solaris-10-terrible-choice-for-java.html
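
For reference, the failing code path is nothing exotic: every subprocess launch goes through ProcessBuilder.start(), which on JDK 7 ends up in UNIXProcess.forkAndExec(), exactly as in the stack traces above. Here is a minimal sketch that exercises that path (the class name and the echoed string are made up for illustration; only the ProcessBuilder/forkAndExec path itself is from the trace):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

/**
 * Minimal sketch of the code path that fails in the builds above:
 * ProcessBuilder.start() -> UNIXProcess.forkAndExec(). On JDK 7/Solaris
 * this fork()s the whole JVM, so the OS must reserve swap for the entire
 * heap; with no free swap, fork fails with errno 12 ("Not enough space").
 */
public class SpawnDemo {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Any subprocess launch goes through the same native forkAndExec path.
        ProcessBuilder pb = new ProcessBuilder("sh", "-c", "echo hello");
        pb.redirectErrorStream(true);
        // This is the call that throws "error=12, Not enough space" when fork fails.
        Process p = pb.start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println("child: " + line);
            }
        }
        System.out.println("exit=" + p.waitFor());
    }
}
```

As far as I know, OpenJDK 8 also lets you pick the mechanism explicitly via -Djdk.lang.Process.launchMechanism (e.g. POSIX_SPAWN); with posix_spawn the child is started through a small helper binary instead of duplicating the JVM's address space, so no heap-sized swap reservation is needed.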

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: uwe@thetaphi.de


> -----Original Message-----
> From: Uwe Schindler [mailto:uwe@thetaphi.de]
> Sent: Sunday, August 30, 2015 1:18 AM
> To: dev@lucene.apache.org
> Subject: RE: [JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 9 - Still Failing!
> 
> Hi,
> 
> This is a problem of Java 7 on Solaris: it still uses fork() to spawn
> processes, which Java 8 has fixed. Unfortunately, fork is badly implemented
> on Solaris: it really reserves the same amount of memory again without ever
> using it, so with our large heaps this needs a lot of memory. The
> workaround is to allocate enough swap (which is never actually used):
> 
> https://developer.opencloud.com/forum/posts/list/620.page
> 
> For now I raised the swap space (which is really simple to do with ZFS... way cool):
> 
> root@solaris-vm:~# zfs set volsize=6g rpool/swap
> 
> Super cool.
> 
> In Java 8, it uses the new posix_spawn launch mechanism (on Linux, vfork).
> This was, by the way, the code changed in u40 that caused the Turkish
> locale to fail :-)
> 
> Uwe
> 
> -----
> Uwe Schindler
> H.-H.-Meier-Allee 63, D-28213 Bremen
> http://www.thetaphi.de
> eMail: uwe@thetaphi.de
> 
> 
> > -----Original Message-----
> > From: Uwe Schindler [mailto:uwe@thetaphi.de]
> > Sent: Saturday, August 29, 2015 11:52 PM
> > To: dev@lucene.apache.org
> > Subject: RE: [JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 9 - Still Failing!
> >
> > I am still digging... On Solaris there seems to be a general forking problem on 32 bit processes.
> >
> > Uwe
> >
> > -----
> > Uwe Schindler
> > H.-H.-Meier-Allee 63, D-28213 Bremen
> > http://www.thetaphi.de
> > eMail: uwe@thetaphi.de
> >
> > > -----Original Message-----
> > > From: Policeman Jenkins Server [mailto:jenkins@thetaphi.de]
> > > Sent: Saturday, August 29, 2015 11:02 PM
> > > To: shalin@apache.org; mikemccand@apache.org; dev@lucene.apache.org
> > > Subject: [JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 9 - Still Failing!
> > >
> > > Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Solaris/9/
> > > Java: multiarch/jdk1.7.0 -d32 -server -XX:+UseConcMarkSweepGC
> > >
> > > 4 tests failed.
> > > FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.hdfs.HdfsNNFailoverTest
> > >
> > > Error Message:
> > > Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space
> > > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
> > > 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
> > > 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
> > > 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> > > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
> > > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
> > > 	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
> > > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
> > > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
> > > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
> > > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> > > 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > > 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > 	at java.lang.reflect.Method.invoke(Method.java:606)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> > > 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> > > 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > > 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> > > 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> > > 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> > > 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> > > 	at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: error=12, Not enough space
> > > 	at java.lang.UNIXProcess.forkAndExec(Native Method)
> > > 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
> > > 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
> > > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
> > > 	... 44 more
> > >
> > > Stack Trace:
> > > java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space
> > > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
> > > 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
> > > 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
> > > 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> > > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
> > > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
> > > 	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
> > > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
> > > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
> > > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
> > > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> > > 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > > 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > 	at java.lang.reflect.Method.invoke(Method.java:606)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> > > 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> > > 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > > 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> > > 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> > > 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> > > 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> > > 	at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: error=12, Not enough space
> > > 	at java.lang.UNIXProcess.forkAndExec(Native Method)
> > > 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
> > > 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
> > > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
> > > 	... 44 more
> > >
> > > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> > > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:620)
> > > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
> > > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
> > > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> > > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> > > 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > > 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > 	at java.lang.reflect.Method.invoke(Method.java:606)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> > > 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> > > 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > > 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> > > 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> > > 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> > > 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> > > 	at java.lang.Thread.run(Thread.java:745)
> > >
> > >
> > > FAILED:  junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest
> > >
> > > Error Message:
> > > access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
> > >
> > > Stack Trace:
> > > java.security.AccessControlException: access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
> > > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> > > 	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:395)
> > > 	at java.security.AccessController.checkPermission(AccessController.java:559)
> > > 	at java.lang.SecurityManager.checkPermission(SecurityManager.java:549)
> > > 	at java.lang.SecurityManager.checkWrite(SecurityManager.java:979)
> > > 	at java.io.File.canWrite(File.java:785)
> > > 	at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:1002)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.createPermissionsDiagnosisString(MiniDFSCluster.java:856)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:812)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:60)
> > > 	at org.apache.solr.store.hdfs.HdfsDirectoryTest.beforeClass(HdfsDirectoryTest.java:62)
> > > 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > > 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > 	at java.lang.reflect.Method.invoke(Method.java:606)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> > > 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> > > 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> > > 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > > 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> > > 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> > > 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> > > 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> > > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > > 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> > > 	at java.lang.Thread.run(Thread.java:745)
> > >
> > >
> > > FAILED:  junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest
> > >
> > > Error Message:
> > > 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:
> > >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> > >         at java.lang.Object.wait(Native Method)
> > >         at java.lang.Object.wait(Object.java:503)
> > >         at java.util.TimerThread.mainLoop(Timer.java:526)
> > >         at java.util.TimerThread.run(Timer.java:505)
> > >
> > > Stack Trace:
> > > com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:
> > >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> > >         at java.lang.Object.wait(Native Method)
> > >         at java.lang.Object.wait(Object.java:503)
> > >         at java.util.TimerThread.mainLoop(Timer.java:526)
> > >         at java.util.TimerThread.run(Timer.java:505)
> > > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> > >
> > >
> > > FAILED:  junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest
> > >
> > > Error Message:
> > > There are still zombie threads that couldn't be terminated:
> > >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> > >         at java.lang.Object.wait(Native Method)
> > >         at java.lang.Object.wait(Object.java:503)
> > >         at java.util.TimerThread.mainLoop(Timer.java:526)
> > >         at java.util.TimerThread.run(Timer.java:505)
> > >
> > > Stack Trace:
> > > com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
> > >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> > >         at java.lang.Object.wait(Native Method)
> > >         at java.lang.Object.wait(Object.java:503)
> > >         at java.util.TimerThread.mainLoop(Timer.java:526)
> > >         at java.util.TimerThread.run(Timer.java:505)
> > > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> > >
> > >
> > >
> > >
> > > Build Log:
> > > [...truncated 10577 lines...]
> > >    [junit4] Suite: org.apache.solr.cloud.hdfs.HdfsNNFailoverTest
> > >    [junit4]   2> Creating dataDir: /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/init-core-data-001
> > >    [junit4]   2> 2599844 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
> > >    [junit4]   2> 2616331 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> > >    [junit4]   1> Formatting using clusterid: testClusterID
> > >    [junit4]   2> 2617524 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.i.MetricsConfig Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
> > >    [junit4]   2> 2617755 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
> > >    [junit4]   2> 2617771 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
> > >    [junit4]   2> 2617878 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
> > >    [junit4]   2> 2617942 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/hdfs to ./temp/Jetty_solaris.vm_35231_hdfs____thayv4/webapp
> > >    [junit4]   2> 2618129 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
> > >    [junit4]   2> 2619464 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:35231
> > >    [junit4]   2> 2637264 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.s.d.DataNode Invalid dfs.datanode.data.dir /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2 :
> > >    [junit4]   2> java.io.IOException: Cannot run program "chmod": error=12, Not enough space
> > >    [junit4]   2> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
> > >    [junit4]   2> 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
> > >    [junit4]   2> 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
> > >    [junit4]   2> 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> > >    [junit4]   2> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
> > >    [junit4]   2> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
> > >    [junit4]   2> 	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:656)
> > >    [junit4]   2> 	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:490)
> > >    [junit4]   2> 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:140)
> > >    [junit4]   2> 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > >    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > >    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> > >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > >    [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >    [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> > >    [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> > >    [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> > >    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> > >    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> > >    [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
> > >    [junit4]   2> Caused by: java.io.IOException: error=12, Not enough space
> > >    [junit4]   2> 	at java.lang.UNIXProcess.forkAndExec(Native Method)
> > >    [junit4]   2> 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
> > >    [junit4]   2> 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
> > >    [junit4]   2> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
> > >    [junit4]   2> 	... 43 more
> > >    [junit4]   2> 2637287 WARN  (org.apache.hadoop.util.JvmPauseMonitor$Monitor@be51b7) [    ] o.a.h.u.JvmPauseMonitor Detected pause in JVM or host machine (eg GC): pause of approximately 15969ms
> > >    [junit4]   2> No GCs detected
> > >    [junit4]   2> 2637368 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
> > >    [junit4]   2> 2637384 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
> > >    [junit4]   2> 2637422 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/datanode to ./temp/Jetty_solaris.vm_49465_datanode____96t731/webapp
> > >    [junit4]   2> 2637655 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
> > >    [junit4]   2> 2638756 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:49465
> > >    [junit4]   2> 2645079 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
> > >    [junit4]   2> 2645234 ERROR (DataNode: [[[DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data1/, [DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2/]]  heartbeating to solaris-vm/127.0.0.1:61051) [    ] o.a.h.h.s.d.DataNode Initialization failed for Block pool <registering> (Datanode Uuid unassigned) service to solaris-vm/127.0.0.1:61051. Exiting.
> > >    [junit4]   2> java.io.IOException: DN shut down before block pool connected
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.retrieveNamespaceInfo(BPServiceActor.java:185)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:215)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:828)
> > >    [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
> > >    [junit4]   2> 2645236 WARN  (DataNode: [[[DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data1/, [DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2/]]  heartbeating to solaris-vm/127.0.0.1:61051) [    ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool <registering> (Datanode Uuid unassigned) service to solaris-vm/127.0.0.1:61051
> > >    [junit4]   2> 2645259 WARN  (org.apache.hadoop.hdfs.server.blockmanagement.DecommissionManager$Monitor@7b7964) [    ] o.a.h.h.s.b.DecommissionManager Monitor interrupted: java.lang.InterruptedException: sleep interrupted
> > >    [junit4]   2> 2645314 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
> > >    [junit4]   2> 2645418 ERROR (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.l.MethodMetric Error invoking method getBlocksTotal
> > >    [junit4]   2> java.lang.reflect.InvocationTargetException
> > >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > >    [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >    [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MethodMetric$2.snapshot(MethodMetric.java:111)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MethodMetric.snapshot(MethodMetric.java:144)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MetricsRegistry.snapshot(MetricsRegistry.java:387)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder$1.getMetrics(MetricsSourceBuilder.java:79)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.getMetrics(MetricsSourceAdapter.java:195)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateJmxCache(MetricsSourceAdapter.java:172)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.getMBeanInfo(MetricsSourceAdapter.java:151)
> > >    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getClassName(DefaultMBeanServerInterceptor.java:1804)
> > >    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.safeGetClassName(DefaultMBeanServerInterceptor.java:1595)
> > >    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanPermission(DefaultMBeanServerInterceptor.java:1813)
> > >    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:430)
> > >    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:415)
> > >    [junit4]   2> 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:546)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:81)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.stopMBeans(MetricsSourceAdapter.java:227)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.stop(MetricsSourceAdapter.java:212)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.stopSources(MetricsSystemImpl.java:461)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.stop(MetricsSystemImpl.java:212)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.shutdown(MetricsSystemImpl.java:592)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.shutdownInstance(DefaultMetricsSystem.java:72)
> > >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.shutdown(DefaultMetricsSystem.java:68)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.metrics.NameNodeMetrics.shutdown(NameNodeMetrics.java:145)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:822)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1720)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1699)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:838)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > >    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > >    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> > >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > >    [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >    [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> > >    [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> > >    [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> > >    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> > >    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> > >    [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
> > >    [junit4]   2> Caused by: java.lang.NullPointerException
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.blockmanagement.BlocksMap.size(BlocksMap.java:198)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.getTotalBlocks(BlockManager.java:3291)
> > >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlocksTotal(FSNamesystem.java:6223)
> > >    [junit4]   2> 	... 58 more
> > >    [junit4]   2> 2645432 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
> > >    [junit4]   2> NOTE: test params are: codec=Asserting(Lucene53), sim=RandomSimilarityProvider(queryNorm=true,coord=yes): {}, locale=mk_MK, timezone=Asia/Shanghai
> > >    [junit4]   2> NOTE: SunOS 5.11 x86/Oracle Corporation 1.7.0_85 (32-bit)/cpus=3,threads=1,free=99794816,total=518979584
> > >    [junit4]   2> NOTE: All tests run in this JVM: [SolrCloudExampleTest, TestStressVersions, TestSerializedLuceneMatchVersion, TestSolrJ,
> > > DistanceUnitsTest, MultiThreadedOCPTest, TestDistribDocBasedVersion, BJQParserTest, ZkCLITest, QueryEqualityTest, PrimitiveFieldTypeTest,
> > > DistributedQueryComponentOptimizationTest, AliasIntegrationTest, TestInitQParser, TestAuthorizationFramework, TestLazyCores,
> > > SolrIndexConfigTest, TestFunctionQuery, TestXIncludeConfig, HardAutoCommitTest, DocValuesMultiTest, TestDefaultStatsCache,
> > > SolrRequestParserTest, RecoveryZkTest, UpdateParamsTest, TestSolrDeletionPolicy1, TestDFRSimilarityFactory, TestFastWriter,
> > > PathHierarchyTokenizerFactoryTest, TestDynamicLoading, TestElisionMultitermQuery, PolyFieldTest, UnloadDistributedZkTest,
> > > TestJsonRequest, TestRuleBasedAuthorizationPlugin, TestManagedStopFilterFactory, TestRawResponseWriter, IndexSchemaTest,
> > > TestEmbeddedSolrServerConstructors, InfoHandlerTest, AlternateDirectoryTest, LeaderElectionTest, JsonLoaderTest,
> > > TestCoreContainer, DirectSolrSpellCheckerTest, RequestLoggingTest, ZkNodePropsTest, TermsComponentTest, TestConfig,
> > > TestFieldTypeCollectionResource, XsltUpdateRequestHandlerTest, TestManagedSchemaFieldResource, TestSchemaResource,
> > > DataDrivenBlockJoinTest, TestExactStatsCache, TestConfigSetProperties, DeleteLastCustomShardedReplicaTest, TestAnalyzedSuggestions,
> > > DirectUpdateHandlerTest, ExternalFileFieldSortTest, TestIBSimilarityFactory, TestMissingGroups, ClusterStateUpdateTest, ActionThrottleTest,
> > > QueryElevationComponentTest, DocValuesTest, QueryResultKeyTest, TestLRUCache, TestPhraseSuggestions, SimplePostToolTest,
> > > TriLevelCompositeIdRoutingTest, DistributedMLTComponentTest, CloudExitableDirectoryReaderTest, TestSolrCloudWithKerberosAlt,
> > > TestCodecSupport, TestConfigSets, PeerSyncTest, XmlUpdateRequestHandlerTest, SpatialHeatmapFacetsTest, SoftAutoCommitTest,
> > > TestSchemaNameResource, PreAnalyzedUpdateProcessorTest, TestJmxMonitoredMap, TestDistributedStatsComponentCardinality,
> > > TestManagedSynonymFilterFactory, JSONWriterTest, TestNRTOpen, ReplicationFactorTest, DOMUtilTest, SolrCoreTest,
> > > DocExpirationUpdateProcessorFactoryTest, FastVectorHighlighterTest, SuggesterFSTTest, TestExtendedDismaxParser, TestSolrConfigHandler,
> > > DocumentAnalysisRequestHandlerTest, DistributedFacetPivotSmallAdvancedTest, BlockDirectoryTest, TestQuerySenderNoQuery,
> > > TestHashPartitioner, DateFieldTest, SegmentsInfoRequestHandlerTest, TestFieldCollectionResource, RecoveryAfterSoftCommitTest,
> > > TestMergePolicyConfig, TestFieldSortValues, SecurityConfHandlerTest, TestStressReorder, BufferStoreTest, TestRandomRequestDistribution,
> > > HdfsBasicDistributedZkTest, TestCloudManagedSchemaConcurrent, TestReplicaProperties, DisMaxRequestHandlerTest, TestMacros,
> > > TestStressLucene, TestReloadAndDeleteDocs, BasicAuthIntegrationTest, TestDocSet, BasicDistributedZkTest,
> > > DistributedQueryElevationComponentTest, TestGroupingSearch, TestObjectReleaseTracker, MoreLikeThisHandlerTest, OverseerTest,
> > > TestFaceting, TestUpdate, TestClassNameShortening, TestRestManager, SyncSliceTest, ShardRoutingTest, ZkSolrClientTest,
> > > TestZkChroot, TestRandomDVFaceting, ShardRoutingCustomTest, TestDistributedGrouping, DistributedSpellCheckComponentTest,
> > > ZkControllerTest, TestRealTimeGet, TestReload, DistributedTermsComponentTest, TestRangeQuery, SimpleFacetsTest,
> > > TestSolr4Spatial, StatsComponentTest, SolrCmdDistributorTest, TestSort, CurrencyFieldXmlFileTest, AnalysisAfterCoreReloadTest,
> > > TestFoldingMultitermQuery, SuggesterTSTTest, TestCSVLoader, SchemaVersionSpecificBehaviorTest, SolrCoreCheckLockOnStartupTest,
> > > DirectUpdateHandlerOptimizeTest, StatelessScriptUpdateProcessorFactoryTest, DistanceFunctionTest, IndexBasedSpellCheckerTest,
> > > StandardRequestHandlerTest, TestOmitPositions, DocumentBuilderTest, RequiredFieldsTest, TestArbitraryIndexDir, LoggingHandlerTest,
> > > ReturnFieldsTest, MBeansHandlerTest, UniqFieldsUpdateProcessorFactoryTest, PingRequestHandlerTest, TestComponentsName, TestLFUCache,
> > > PreAnalyzedFieldTest, TestSystemIdResolver, SpellingQueryConverterTest, TestUtils, TestDocumentBuilder, SliceStateTest,
> > > SystemInfoHandlerTest, UUIDFieldTest, FileUtilsTest, CircularListTest, TestRTGBase, CursorPagingTest, DistributedIntervalFacetingTest,
> > > TestDistributedMissingSort, TestSimpleTrackingShardHandler, AsyncMigrateRouteKeyTest, DeleteInactiveReplicaTest,
> > > DistribDocExpirationUpdateProcessorTest, LeaderFailoverAfterPartitionTest, OverriddenZkACLAndCredentialsProvidersTest,
> > > OverseerCollectionConfigSetProcessorTest, OverseerRolesTest, OverseerTaskQueueTest, SSLMigrationTest, SaslZkACLProviderTest,
> > > SimpleCollectionCreateDeleteTest, TestAuthenticationFramework, TestCloudInspectUtil, TestCollectionAPI, TestMiniSolrCloudClusterSSL,
> > > TestRebalanceLeaders, TestRequestStatusCollectionAPI, HdfsBasicDistributedZk2Test, HdfsChaosMonkeySafeLeaderTest,
> > > HdfsCollectionsAPIDistributedZkTest, HdfsNNFailoverTest]
> > >    [junit4]   2> NOTE: reproduce with: ant test -Dtestcase=HdfsNNFailoverTest -Dtests.seed=5D8F351977870E3F -Dtests.slow=true -Dtests.locale=mk_MK -Dtests.timezone=Asia/Shanghai -Dtests.asserts=true -Dtests.file.encoding=UTF-8
> > >    [junit4] ERROR   0.00s J0 | HdfsNNFailoverTest (suite) <<<
> > >    [junit4]    > Throwable #1: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space
> > >    [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
> > >    [junit4]    > 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
> > >    [junit4]    > 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
> > >    [junit4]    > 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> > >    [junit4]    > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
> > >    [junit4]    > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
> > >    [junit4]    > 	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
> > >    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
> > >    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
> > >    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
> > >    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> > >    [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
> > >    [junit4]    > Caused by: java.io.IOException: error=12, Not enough space
> > >    [junit4]    > 	at java.lang.UNIXProcess.forkAndExec(Native Method)
> > >    [junit4]    > 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
> > >    [junit4]    > 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
> > >    [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
> > >    [junit4]    > 	... 44 more
> > >    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> > >    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:620)
> > >    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
> > >    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
> > >    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> > >    [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
> > >    [junit4] Completed [426/536] on J0 in 45.66s, 0 tests, 1 error <<< FAILURES!
> > >
> > > [...truncated 300 lines...]
> > >    [junit4] Suite: org.apache.solr.store.hdfs.HdfsDirectoryTest
> > >    [junit4]   2> Creating dataDir: /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1/temp/solr.store.hdfs.HdfsDirectoryTest_5D8F351977870E3F-001/init-core-data-001
> > >    [junit4]   2> 3147821 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false)
> > >    [junit4]   1> Formatting using clusterid: testClusterID
> > >    [junit4]   2> 3147964 WARN  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.i.MetricsConfig Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
> > >    [junit4]   2> 3147974 WARN  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
> > >    [junit4]   2> 3147976 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
> > >    [junit4]   2> 3147994 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/hdfs to ./temp/Jetty_solaris.vm_46547_hdfs____.vwfmpk/webapp
> > >    [junit4]   2> 3148170 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
> > >    [junit4]   2> 3148982 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:46547
> > >    [junit4]   2> 3157264 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
> > >    [junit4]   2> 3157403 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
> > >    [junit4]   2> Aug 29, 2015 9:01:07 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
> > >    [junit4]   2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
> > >    [junit4]   2> Aug 29, 2015 9:01:27 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
> > >    [junit4]   2> SEVERE: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:
> > >    [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> > >    [junit4]   2>         at java.lang.Object.wait(Native Method)
> > >    [junit4]   2>         at java.lang.Object.wait(Object.java:503)
> > >    [junit4]   2>         at java.util.TimerThread.mainLoop(Timer.java:526)
> > >    [junit4]   2>         at java.util.TimerThread.run(Timer.java:505)
> > >    [junit4]   2> Aug 29, 2015 9:01:27 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
> > >    [junit4]   2> INFO: Starting to interrupt leaked threads:
> > >    [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> > >    [junit4]   2> Aug 29, 2015 9:01:30 PM
> > > com.carrotsearch.randomizedtesting.ThreadLeakControl
> tryToInterruptAll
> > >    [junit4]   2> SEVERE: There are still zombie threads that couldn't be
> > > terminated:
> > >    [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection
> > scanner
> > > for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> > >    [junit4]   2>         at java.lang.Object.wait(Native Method)
> > >    [junit4]   2>         at java.lang.Object.wait(Object.java:503)
> > >    [junit4]   2>         at java.util.TimerThread.mainLoop(Timer.java:526)
> > >    [junit4]   2>         at java.util.TimerThread.run(Timer.java:505)
> > >    [junit4]   2> NOTE: test params are: codec=Asserting(Lucene53): {},
> > > docValues:{}, sim=DefaultSimilarity, locale=es_BO,
> > > timezone=Antarctica/South_Pole
> > >    [junit4]   2> NOTE: SunOS 5.11 x86/Oracle Corporation 1.7.0_85 (32-
> > > bit)/cpus=3,threads=2,free=136627544,total=518979584
> > >    [junit4]   2> NOTE: All tests run in this JVM: [TestIndexingPerformance,
> > > TestCSVResponseWriter, DistributedQueryComponentCustomSortTest,
> > > DirectSolrConnectionTest, FullSolrCloudDistribCmdsTest,
> > > TestShardHandlerFactory, CacheHeaderTest, BasicZkTest, TestTrie,
> > > FieldAnalysisRequestHandlerTest, PKIAuthenticationIntegrationTest,
> > > OpenCloseCoreStressTest, TestSuggestSpellingConverter,
> StressHdfsTest,
> > > CleanupOldIndexTest, DistributedExpandComponentTest,
> > > TestHdfsUpdateLog, TestSolrXml, TestAddFieldRealTimeGet,
> > TestJsonFacets,
> > > DistributedSuggestComponentTest,
> > > OutOfBoxZkACLAndCredentialsProvidersTest,
> AnalyticsMergeStrategyTest,
> > > HLLUtilTest, ResponseHeaderTest, SearchHandlerTest,
> > > BinaryUpdateRequestHandlerTest, DistributedFacetPivotWhiteBoxTest,
> > > ConnectionManagerTest, SpellCheckComponentTest,
> > > TestScoreJoinQPNoScore, SolrTestCaseJ4Test, SolrIndexSplitterTest,
> > > TestConfigSetsAPI, TestDefaultSearchFieldResource, TestCryptoKeys,
> > > TestNonDefinedSimilarityFactory, TestCoreDiscovery, RollingRestartTest,
> > > SolrInfoMBeanTest, CustomCollectionTest, DistributedVersionInfoTest,
> > > ClusterStateTest, TestReversedWildcardFilterFactory, SolrXmlInZkTest,
> > > DistributedFacetPivotLongTailTest, URLClassifyProcessorTest,
> > > TestLMJelinekMercerSimilarityFactory, RequestHandlersTest,
> > > RemoteQueryErrorTest, LeaderElectionIntegrationTest,
> > > SharedFSAutoReplicaFailoverTest, TestBadConfig,
> > > SignatureUpdateProcessorFactoryTest,
> > TestCursorMarkWithoutUniqueKey,
> > > TestCrossCoreJoin, SparseHLLTest, DistributedQueueTest,
> > > BigEndianAscendingWordSerializerTest, TestBM25SimilarityFactory,
> > > AutoCommitTest, DateMathParserTest, BasicFunctionalityTest,
> > > SuggesterWFSTTest, TestCollapseQParserPlugin, TestManagedResource,
> > > TestSha256AuthenticationProvider, CollectionTooManyReplicasTest,
> > > BadCopyFieldTest, TestDownShardTolerantSearch,
> CloudMLTQParserTest,
> > > NotRequiredUniqueKeyTest, TestAnalyzeInfixSuggestions,
> > > ExitableDirectoryReaderTest, TestScoreJoinQPScore, DeleteShardTest,
> > > RankQueryTest, TestSchemaManager,
> > UpdateRequestProcessorFactoryTest,
> > > CursorMarkTest, DistributedDebugComponentTest, DeleteReplicaTest,
> > > RAMDirectoryFactoryTest, ConcurrentDeleteAndCreateCollectionTest,
> > > TestQueryTypes, OutputWriterTest, TestSchemaSimilarityResource,
> > > HighlighterMaxOffsetTest, ResponseLogComponentTest,
> > > TestCloudPivotFacet, DocValuesMissingTest,
> > > FieldMutatingUpdateProcessorTest, HttpPartitionTest, TestCollationField,
> > > ZkStateWriterTest, TestQuerySenderListener, AtomicUpdatesTest,
> > > TestStressRecovery, TestRandomFaceting,
> > > SharedFSAutoReplicaFailoverUtilsTest, CoreAdminHandlerTest,
> > > HighlighterConfigTest, TestCustomSort, MultiTermTest,
> > > VMParamsZkACLAndCredentialsProvidersTest,
> > > IgnoreCommitOptimizeUpdateProcessorFactoryTest,
> CollectionReloadTest,
> > > PrimUtilsTest, TestRecovery, TestWriterPerf,
> > > AddSchemaFieldsUpdateProcessorFactoryTest, TimeZoneUtilsTest,
> > > CurrencyFieldOpenExchangeTest, TestSolrCLIRunExample,
> > > TestPHPSerializedResponseWriter, ChaosMonkeySafeLeaderTest,
> > > TestIndexSearcher, EnumFieldTest, TestSolrIndexConfig,
> > > TermVectorComponentDistributedTest, TestJoin,
> TestExpandComponent,
> > > TestManagedResourceStorage, SortByFunctionTest,
> > > TestDefaultSimilarityFactory, SuggesterTest, TestValueSourceCache,
> > > SolrPluginUtilsTest, TermVectorComponentTest, TestFiltering,
> > > TestQueryUtils, FileBasedSpellCheckerTest, BasicDistributedZk2Test,
> > > CollectionsAPIDistributedZkTest, TestReplicationHandler,
> > > TestDistributedSearch, BadIndexSchemaTest, ConvertedLegacyTest,
> > > HighlighterTest, ShowFileRequestHandlerTest, SpellCheckCollatorTest,
> > > SpatialFilterTest, NoCacheHeaderTest, WordBreakSolrSpellCheckerTest,
> > > TestPseudoReturnFields, TestAtomicUpdateErrorCases,
> > > TestWordDelimiterFilterFactory, DefaultValueUpdateProcessorTest,
> > > TestRemoteStreaming, DebugComponentTest,
> TestSurroundQueryParser,
> > > LukeRequestHandlerTest, TestSolrQueryParser,
> > > IndexSchemaRuntimeFieldTest, RegexBoostProcessorTest,
> > > TestJmxIntegration, QueryParsingTest, TestPartialUpdateDeduplication,
> > > CSVRequestHandlerTest, TestBinaryResponseWriter, SOLR749Test,
> > > CopyFieldTest, BadComponentTest, TestSolrDeletionPolicy2, SampleTest,
> > > TestBinaryField, TestSearchPerf, NumericFieldsTest, MinimalSchemaTest,
> > > TestFuzzyAnalyzedSuggestions, TestSolrCoreProperties,
> > > TestPostingsSolrHighlighter, TestLuceneMatchVersion,
> > > SpellPossibilityIteratorTest, TestCharFilters, SynonymTokenizerTest,
> > > EchoParamsTest, TestSweetSpotSimilarityFactory, TestPerFieldSimilarity,
> > > TestLMDirichletSimilarityFactory, ResourceLoaderTest,
> > > TestFastOutputStream, ScriptEngineTest,
> > > OpenExchangeRatesOrgProviderTest, PluginInfoTest, TestFastLRUCache,
> > > ChaosMonkeyNothingIsSafeTest, TestHighlightDedupGrouping,
> > > TestTolerantSearch, TestJettySolrRunner, AssignTest,
> > > AsyncCallRequestStatusResponseTest, CollectionStateFormat2Test,
> > > CollectionsAPIAsyncDistributedZkTest, DistribCursorPagingTest,
> > > DistribJoinFromCollectionTest, LeaderInitiatedRecoveryOnCommitTest,
> > > MigrateRouteKeyTest, OverseerStatusTest, ShardSplitTest,
> > > TestConfigSetsAPIExclusivity, TestConfigSetsAPIZkFailure,
> > > TestLeaderElectionZkExpiry, TestMiniSolrCloudCluster,
> > > TestShortCircuitedRequests, HdfsRecoverLeaseTest,
> > > CachingDirectoryFactoryTest, HdfsDirectoryFactoryTest,
> TestConfigOverlay,
> > > TestConfigSetImmutable, TestImplicitCoreProperties,
> > > TestInfoStreamLogging, TestInitParams, TestSolrDynamicMBean,
> > > TestBlobHandler, TestConfigReload, TestReplicationHandlerBackup,
> > > TestSolrConfigHandlerConcurrent, CoreAdminCreateDiscoverTest,
> > > CoreAdminRequestStatusTest, CoreMergeIndexesAdminHandlerTest,
> > > DistributedFacetPivotLargeTest, DistributedFacetPivotSmallTest,
> > > FacetPivotSmallTest, SuggestComponentTest, JavabinLoaderTest,
> > > SmileWriterTest, TestIntervalFaceting, TestChildDocTransformer,
> > > TestCustomDocTransformer, TestSortingResponseWriter,
> > > TestBulkSchemaAPI, TestFieldResource,
> > > TestManagedSchemaDynamicFieldResource,
> TestBulkSchemaConcurrent,
> > > TestCloudSchemaless, TestReloadDeadlock, TestSearcherReuse,
> > > TestSimpleQParserPlugin, TestSmileRequest, TestSolr4Spatial2,
> > > TestStandardQParsers, TestStressUserVersions, TestTrieFacet,
> > > TestMinMaxOnMultiValuedField, TestOrdValues,
> > > TestSortByMinMaxFunction, SimpleMLTQParserTest, TestDistribIDF,
> > > TestExactSharedStatsCache, TestPKIAuthenticationPlugin,
> > > TestBlendedInfixSuggestions, TestFileDictionaryLookup,
> > > TestFreeTextSuggestions, TestHighFrequencyDictionaryFactory,
> > > BlockCacheTest, HdfsDirectoryTest]
> > >    [junit4]   2> NOTE: reproduce with: ant test -Dtestcase=HdfsDirectoryTest -Dtests.seed=5D8F351977870E3F -Dtests.slow=true -Dtests.locale=es_BO -Dtests.timezone=Antarctica/South_Pole -Dtests.asserts=true -Dtests.file.encoding=UTF-8
> > >    [junit4] ERROR   0.00s J1 | HdfsDirectoryTest (suite) <<<
> > >    [junit4]    > Throwable #1: java.security.AccessControlException: access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
> > >    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> > >    [junit4]    > 	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:395)
> > >    [junit4]    > 	at java.security.AccessController.checkPermission(AccessController.java:559)
> > >    [junit4]    > 	at java.lang.SecurityManager.checkPermission(SecurityManager.java:549)
> > >    [junit4]    > 	at java.lang.SecurityManager.checkWrite(SecurityManager.java:979)
> > >    [junit4]    > 	at java.io.File.canWrite(File.java:785)
> > >    [junit4]    > 	at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:1002)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.createPermissionsDiagnosisString(MiniDFSCluster.java:856)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:812)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:60)
> > >    [junit4]    > 	at org.apache.solr.store.hdfs.HdfsDirectoryTest.beforeClass(HdfsDirectoryTest.java:62)
> > >    [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
> > >    [junit4]    > Throwable #2: com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:
> > >    [junit4]    >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> > >    [junit4]    >         at java.lang.Object.wait(Native Method)
> > >    [junit4]    >         at java.lang.Object.wait(Object.java:503)
> > >    [junit4]    >         at java.util.TimerThread.mainLoop(Timer.java:526)
> > >    [junit4]    >         at java.util.TimerThread.run(Timer.java:505)
> > >    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> > >    [junit4]    > Throwable #3: com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
> > >    [junit4]    >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> > >    [junit4]    >         at java.lang.Object.wait(Native Method)
> > >    [junit4]    >         at java.lang.Object.wait(Object.java:503)
> > >    [junit4]    >         at java.util.TimerThread.mainLoop(Timer.java:526)
> > >    [junit4]    >         at java.util.TimerThread.run(Timer.java:505)
> > >    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> > >    [junit4] Completed [521/536] on J1 in 33.05s, 0 tests, 3 errors <<<
> > FAILURES!
> > >
> > > [...truncated 64 lines...]
> > > BUILD FAILED
> > > /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:785:
> > > The following error occurred while executing this line:
> > > /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:729:
> > > The following error occurred while executing this line:
> > > /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:59:
> > The
> > > following error occurred while executing this line:
> > > /export/home/jenkins/workspace/Lucene-Solr-5.x-
> > > Solaris/solr/build.xml:233: The following error occurred while executing
> this
> > > line:
> > > /export/home/jenkins/workspace/Lucene-Solr-5.x-
> Solaris/solr/common-
> > > build.xml:524: The following error occurred while executing this line:
> > > /export/home/jenkins/workspace/Lucene-Solr-5.x-
> > Solaris/lucene/common-
> > > build.xml:1452: The following error occurred while executing this line:
> > > /export/home/jenkins/workspace/Lucene-Solr-5.x-
> > Solaris/lucene/common-
> > > build.xml:1006: There were test failures: 536 suites, 2123 tests, 4 suite-
> level
> > > errors, 108 ignored (34 assumptions)
> > >
> > > Total time: 77 minutes 51 seconds
> > > Build step 'Invoke Ant' marked build as failure
> > > Archiving artifacts
> > > [WARNINGS] Skipping publisher since build result is FAILURE
> > > Recording test results
> > > Email was triggered for: Failure - Any
> > > Sending email for trigger: Failure - Any
> > >
> >
> >
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
> > For additional commands, e-mail: dev-help@lucene.apache.org




RE: [JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 9 - Still Failing!

Posted by Uwe Schindler <uw...@thetaphi.de>.
Hi,

This is a problem of Java 7 on Solaris: Java 7 still uses fork() to spawn child processes (Java 8 fixed this). Unfortunately, fork() is badly implemented on Solaris: it reserves as much memory again as the parent process already has, even though the child never touches it, so with our large test heaps a single fork needs a huge amount of memory. The workaround is to allocate enough swap space (which is never actually used):

https://developer.opencloud.com/forum/posts/list/620.page

For now I raised the swap space (which is really simple to do with ZFS... way cool):

root@solaris-vm:~# zfs set volsize=6g rpool/swap

Super cool.

In Java 8, the JVM uses the new posix_spawn launch mechanism instead (vfork on Linux). This was, by the way, the code that was buggy until U40 - the bug that caused the Turkish Locale failures :-)
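For anyone poking at this, the failing call boils down to ProcessBuilder.start(). Here is a minimal, self-contained sketch of that spawning path. The -Djdk.lang.Process.launchMechanism=POSIX_SPAWN switch mentioned in the comment is an internal, undocumented JDK property; treat its availability and accepted values as an assumption that varies by JDK version and platform:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

// Sketch of the spawning path that fails with error=12 above.
// On JDKs that support it (assumption: JDK 8+), launching this JVM with
//   -Djdk.lang.Process.launchMechanism=POSIX_SPAWN
// makes ProcessBuilder.start() use posix_spawn instead of fork(), so the
// child no longer requires a reservation the size of the parent's heap.
public class SpawnDemo {
    public static void main(String[] args) throws Exception {
        Process child = new ProcessBuilder("echo", "hello")
                .redirectErrorStream(true)  // merge stderr into stdout
                .start();                   // the call that threw IOException error=12
        try (BufferedReader out = new BufferedReader(
                new InputStreamReader(child.getInputStream()))) {
            System.out.println(out.readLine()); // child's first output line
        }
        System.exit(child.waitFor());           // propagate the child's exit code
    }
}
```

With too little swap on Solaris/Java 7, the start() call above is exactly where the "Cannot run program ... error=12, Not enough space" IOException surfaces.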

Uwe

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: uwe@thetaphi.de


> -----Original Message-----
> From: Uwe Schindler [mailto:uwe@thetaphi.de]
> Sent: Saturday, August 29, 2015 11:52 PM
> To: dev@lucene.apache.org
> Subject: RE: [JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 9
> - Still Failing!
> 
> I am still digging... On Solaris there seems to be a general problem with
> forking from 32-bit processes.
> 
> Uwe
> 
> -----
> Uwe Schindler
> H.-H.-Meier-Allee 63, D-28213 Bremen
> http://www.thetaphi.de
> eMail: uwe@thetaphi.de
> 
> > -----Original Message-----
> > From: Policeman Jenkins Server [mailto:jenkins@thetaphi.de]
> > Sent: Saturday, August 29, 2015 11:02 PM
> > To: shalin@apache.org; mikemccand@apache.org; dev@lucene.apache.org
> > Subject: [JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 9 -
> > Still Failing!
> >
> > Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Solaris/9/
> > Java: multiarch/jdk1.7.0 -d32 -server -XX:+UseConcMarkSweepGC
> >
> > 4 tests failed.
> > FAILED:
> > junit.framework.TestSuite.org.apache.solr.cloud.hdfs.HdfsNNFailoverTest
> >
> > Error Message:
> > Error while running command to get file permissions : java.io.IOException:
> > Cannot run program "/bin/ls": error=12, Not enough space  at
> > java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)  at
> > org.apache.hadoop.util.Shell.runCommand(Shell.java:485)  at
> > org.apache.hadoop.util.Shell.run(Shell.java:455)  at
> >
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715
> > )  at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)  at
> > org.apache.hadoop.util.Shell.execCommand(Shell.java:791)  at
> > org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)  at
> >
> org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.lo
> > adPermissionInfo(RawLocalFileSystem.java:582)  at
> >
> org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.g
> > etPermission(RawLocalFileSystem.java:557)  at
> >
> org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(
> > DiskChecker.java:139)  at
> > org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)  at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.
> > checkDir(DataNode.java:2239)  at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(
> > DataNode.java:2281)  at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNo
> > de.java:2263)  at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(D
> > ataNode.java:2155)  at
> >
> org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.jav
> > a:1443)  at
> >
> org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.j
> > ava:828)  at
> > org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> at
> > org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> at
> > org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> at
> >
> org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverT
> > est.java:44)  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> > Method)  at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.j
> > ava:57)  at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces
> > sorImpl.java:43)  at java.lang.reflect.Method.invoke(Method.java:606)  at
> >
> com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(Randomize
> > dRunner.java:1627)  at
> >
> com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(Rando
> > mizedRunner.java:776)  at
> >
> com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(Rando
> > mizedRunner.java:792)  at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)  at
> >
> com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.
> > evaluate(SystemPropertiesRestoreRule.java:57)  at
> >
> org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeA
> > fterRule.java:46)  at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)  at
> >
> org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreCl
> > assName.java:42)  at
> >
> com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMet
> > hodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> at
> >
> com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMet
> > hodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)  at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)  at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)  at
> >
> org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAss
> > ertionsRequired.java:54)  at
> >
> org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure
> > .java:48)  at
> >
> org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRule
> > IgnoreAfterMaxFailures.java:65)  at
> >
> org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnore
> > TestSuites.java:55)  at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)  at
> >
> com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.
> > run(ThreadLeakControl.java:365)  at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.io.IOException: error=12, Not enough space  at
> > java.lang.UNIXProcess.forkAndExec(Native Method)  at
> > java.lang.UNIXProcess.<init>(UNIXProcess.java:137)  at
> > java.lang.ProcessImpl.start(ProcessImpl.java:130)  at
> > java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)  ... 44 more
> >
> > Stack Trace:
> > java.lang.RuntimeException: Error while running command to get file
> > permissions : java.io.IOException: Cannot run program "/bin/ls": error=12,
> > Not enough space
> > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
> > 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
> > 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
> > 	at
> >
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715
> > )
> > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
> > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
> > 	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
> > 	at
> >
> org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.lo
> > adPermissionInfo(RawLocalFileSystem.java:582)
> > 	at
> >
> org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.g
> > etPermission(RawLocalFileSystem.java:557)
> > 	at
> >
> org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(
> > DiskChecker.java:139)
> > 	at
> > org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> > 	at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.
> > checkDir(DataNode.java:2239)
> > 	at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(
> > DataNode.java:2281)
> > 	at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNo
> > de.java:2263)
> > 	at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(D
> > ataNode.java:2155)
> > 	at
> >
> org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.jav
> > a:1443)
> > 	at
> >
> org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.j
> > ava:828)
> > 	at
> > org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > 	at
> > org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > 	at
> > org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > 	at
> >
> org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverT
> > est.java:44)
> > 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > 	at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.j
> > ava:57)
> > 	at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces
> > sorImpl.java:43)
> > 	at java.lang.reflect.Method.invoke(Method.java:606)
> > 	at
> >
> com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(Randomize
> > dRunner.java:1627)
> > 	at
> >
> com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(Rando
> > mizedRunner.java:776)
> > 	at
> >
> com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(Rando
> > mizedRunner.java:792)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.
> > evaluate(SystemPropertiesRestoreRule.java:57)
> > 	at
> >
> org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeA
> > fterRule.java:46)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreCl
> > assName.java:42)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMet
> > hodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMet
> > hodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAss
> > ertionsRequired.java:54)
> > 	at
> >
> org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure
> > .java:48)
> > 	at
> >
> org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRule
> > IgnoreAfterMaxFailures.java:65)
> > 	at
> >
> org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnore
> > TestSuites.java:55)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.
> > run(ThreadLeakControl.java:365)
> > 	at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.io.IOException: error=12, Not enough space
> > 	at java.lang.UNIXProcess.forkAndExec(Native Method)
> > 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
> > 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
> > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
> > 	... 44 more
> >
> > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> > 	at
> >
> org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.lo
> > adPermissionInfo(RawLocalFileSystem.java:620)
> > 	at
> >
> org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.g
> > etPermission(RawLocalFileSystem.java:557)
> > 	at
> >
> org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(
> > DiskChecker.java:139)
> > 	at
> > org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> > 	at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.
> > checkDir(DataNode.java:2239)
> > 	at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(
> > DataNode.java:2281)
> > 	at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNo
> > de.java:2263)
> > 	at
> >
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(D
> > ataNode.java:2155)
> > 	at
> >
> org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.jav
> > a:1443)
> > 	at
> >
> org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.j
> > ava:828)
> > 	at
> > org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > 	at
> > org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > 	at
> > org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > 	at
> >
> org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverT
> > est.java:44)
> > 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > 	at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.j
> > ava:57)
> > 	at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces
> > sorImpl.java:43)
> > 	at java.lang.reflect.Method.invoke(Method.java:606)
> > 	at
> >
> com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(Randomize
> > dRunner.java:1627)
> > 	at
> >
> com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(Rando
> > mizedRunner.java:776)
> > 	at
> >
> com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(Rando
> > mizedRunner.java:792)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.
> > evaluate(SystemPropertiesRestoreRule.java:57)
> > 	at
> >
> org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeA
> > fterRule.java:46)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreCl
> > assName.java:42)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMet
> > hodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMet
> > hodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAss
> > ertionsRequired.java:54)
> > 	at
> >
> org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure
> > .java:48)
> > 	at
> >
> org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRule
> > IgnoreAfterMaxFailures.java:65)
> > 	at
> >
> org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnore
> > TestSuites.java:55)
> > 	at
> >
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> > ementAdapter.java:36)
> > 	at
> >
> com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.
> > run(ThreadLeakControl.java:365)
> > 	at java.lang.Thread.run(Thread.java:745)
> >
> >
> > FAILED:
> > junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest
> >
> > Error Message:
> > access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
> >
> > Stack Trace:
> > java.security.AccessControlException: access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
> > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> > 	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:395)
> > 	at java.security.AccessController.checkPermission(AccessController.java:559)
> > 	at java.lang.SecurityManager.checkPermission(SecurityManager.java:549)
> > 	at java.lang.SecurityManager.checkWrite(SecurityManager.java:979)
> > 	at java.io.File.canWrite(File.java:785)
> > 	at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:1002)
> > 	at org.apache.hadoop.hdfs.MiniDFSCluster.createPermissionsDiagnosisString(MiniDFSCluster.java:856)
> > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:812)
> > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:60)
> > 	at org.apache.solr.store.hdfs.HdfsDirectoryTest.beforeClass(HdfsDirectoryTest.java:62)
> > 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > 	at java.lang.reflect.Method.invoke(Method.java:606)
> > 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> > 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> > 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> > 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> > 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> > 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> > 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> > 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> > 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> > 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> > 	at java.lang.Thread.run(Thread.java:745)
> >
> >
> > FAILED:
> > junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest
> >
> > Error Message:
> > 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:     1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]         at java.lang.Object.wait(Native Method)         at java.lang.Object.wait(Object.java:503)         at java.util.TimerThread.mainLoop(Timer.java:526)         at java.util.TimerThread.run(Timer.java:505)
> >
> > Stack Trace:
> > com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:
> >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> >         at java.lang.Object.wait(Native Method)
> >         at java.lang.Object.wait(Object.java:503)
> >         at java.util.TimerThread.mainLoop(Timer.java:526)
> >         at java.util.TimerThread.run(Timer.java:505)
> > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> >
> >
> > FAILED:
> > junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest
> >
> > Error Message:
> > There are still zombie threads that couldn't be terminated:    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]         at java.lang.Object.wait(Native Method)         at java.lang.Object.wait(Object.java:503)         at java.util.TimerThread.mainLoop(Timer.java:526)         at java.util.TimerThread.run(Timer.java:505)
> >
> > Stack Trace:
> > com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
> >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> >         at java.lang.Object.wait(Native Method)
> >         at java.lang.Object.wait(Object.java:503)
> >         at java.util.TimerThread.mainLoop(Timer.java:526)
> >         at java.util.TimerThread.run(Timer.java:505)
> > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> >
> >
> >
> >
> > Build Log:
> > [...truncated 10577 lines...]
> >    [junit4] Suite: org.apache.solr.cloud.hdfs.HdfsNNFailoverTest
> >    [junit4]   2> Creating dataDir: /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/init-core-data-001
> >    [junit4]   2> 2599844 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
> >    [junit4]   2> 2616331 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> >    [junit4]   1> Formatting using clusterid: testClusterID
> >    [junit4]   2> 2617524 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.i.MetricsConfig Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
> >    [junit4]   2> 2617755 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
> >    [junit4]   2> 2617771 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
> >    [junit4]   2> 2617878 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
> >    [junit4]   2> 2617942 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/hdfs to ./temp/Jetty_solaris.vm_35231_hdfs____thayv4/webapp
> >    [junit4]   2> 2618129 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
> >    [junit4]   2> 2619464 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:35231
> >    [junit4]   2> 2637264 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.s.d.DataNode Invalid dfs.datanode.data.dir /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2 :
> >    [junit4]   2> java.io.IOException: Cannot run program "chmod": error=12, Not enough space
> >    [junit4]   2> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
> >    [junit4]   2> 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
> >    [junit4]   2> 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
> >    [junit4]   2> 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> >    [junit4]   2> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
> >    [junit4]   2> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
> >    [junit4]   2> 	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:656)
> >    [junit4]   2> 	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:490)
> >    [junit4]   2> 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:140)
> >    [junit4]   2> 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> >    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> >    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >    [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >    [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> >    [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> >    [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> >    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> >    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> >    [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
> >    [junit4]   2> Caused by: java.io.IOException: error=12, Not enough space
> >    [junit4]   2> 	at java.lang.UNIXProcess.forkAndExec(Native Method)
> >    [junit4]   2> 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
> >    [junit4]   2> 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
> >    [junit4]   2> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
> >    [junit4]   2> 	... 43 more
> >    [junit4]   2> 2637287 WARN  (org.apache.hadoop.util.JvmPauseMonitor$Monitor@be51b7) [    ] o.a.h.u.JvmPauseMonitor Detected pause in JVM or host machine (eg GC): pause of approximately 15969ms
> >    [junit4]   2> No GCs detected
> >    [junit4]   2> 2637368 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
> >    [junit4]   2> 2637384 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
> >    [junit4]   2> 2637422 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/datanode to ./temp/Jetty_solaris.vm_49465_datanode____96t731/webapp
> >    [junit4]   2> 2637655 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
> >    [junit4]   2> 2638756 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:49465
> >    [junit4]   2> 2645079 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
> >    [junit4]   2> 2645234 ERROR (DataNode: [[[DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data1/, [DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2/]]  heartbeating to solaris-vm/127.0.0.1:61051) [    ] o.a.h.h.s.d.DataNode Initialization failed for Block pool <registering> (Datanode Uuid unassigned) service to solaris-vm/127.0.0.1:61051. Exiting.
> >    [junit4]   2> java.io.IOException: DN shut down before block pool connected
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.retrieveNamespaceInfo(BPServiceActor.java:185)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:215)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:828)
> >    [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
> >    [junit4]   2> 2645236 WARN  (DataNode: [[[DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data1/, [DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2/]]  heartbeating to solaris-vm/127.0.0.1:61051) [    ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool <registering> (Datanode Uuid unassigned) service to solaris-vm/127.0.0.1:61051
> >    [junit4]   2> 2645259 WARN  (org.apache.hadoop.hdfs.server.blockmanagement.DecommissionManager$Monitor@7b7964) [    ] o.a.h.h.s.b.DecommissionManager Monitor interrupted: java.lang.InterruptedException: sleep interrupted
> >    [junit4]   2> 2645314 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
> >    [junit4]   2> 2645418 ERROR (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.l.MethodMetric Error invoking method getBlocksTotal
> >    [junit4]   2> java.lang.reflect.InvocationTargetException
> >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >    [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >    [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MethodMetric$2.snapshot(MethodMetric.java:111)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MethodMetric.snapshot(MethodMetric.java:144)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MetricsRegistry.snapshot(MetricsRegistry.java:387)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder$1.getMetrics(MetricsSourceBuilder.java:79)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.getMetrics(MetricsSourceAdapter.java:195)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateJmxCache(MetricsSourceAdapter.java:172)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.getMBeanInfo(MetricsSourceAdapter.java:151)
> >    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getClassName(DefaultMBeanServerInterceptor.java:1804)
> >    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.safeGetClassName(DefaultMBeanServerInterceptor.java:1595)
> >    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanPermission(DefaultMBeanServerInterceptor.java:1813)
> >    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:430)
> >    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:415)
> >    [junit4]   2> 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:546)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:81)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.stopMBeans(MetricsSourceAdapter.java:227)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.stop(MetricsSourceAdapter.java:212)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.stopSources(MetricsSystemImpl.java:461)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.stop(MetricsSystemImpl.java:212)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.shutdown(MetricsSystemImpl.java:592)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.shutdownInstance(DefaultMetricsSystem.java:72)
> >    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.shutdown(DefaultMetricsSystem.java:68)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.metrics.NameNodeMetrics.shutdown(NameNodeMetrics.java:145)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:822)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1720)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1699)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:838)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> >    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> >    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >    [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >    [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> >    [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> >    [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> >    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> >    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> >    [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> >    [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
> >    [junit4]   2> Caused by: java.lang.NullPointerException
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.blockmanagement.BlocksMap.size(BlocksMap.java:198)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.getTotalBlocks(BlockManager.java:3291)
> >    [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlocksTotal(FSNamesystem.java:6223)
> >    [junit4]   2> 	... 58 more
> >    [junit4]   2> 2645432 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
> >    [junit4]   2> NOTE: test params are: codec=Asserting(Lucene53), sim=RandomSimilarityProvider(queryNorm=true,coord=yes): {}, locale=mk_MK, timezone=Asia/Shanghai
> >    [junit4]   2> NOTE: SunOS 5.11 x86/Oracle Corporation 1.7.0_85 (32-bit)/cpus=3,threads=1,free=99794816,total=518979584
> >    [junit4]   2> NOTE: All tests run in this JVM: [SolrCloudExampleTest, TestStressVersions, TestSerializedLuceneMatchVersion, TestSolrJ, DistanceUnitsTest, MultiThreadedOCPTest, TestDistribDocBasedVersion, BJQParserTest, ZkCLITest, QueryEqualityTest, PrimitiveFieldTypeTest, DistributedQueryComponentOptimizationTest, AliasIntegrationTest, TestInitQParser, TestAuthorizationFramework, TestLazyCores, SolrIndexConfigTest, TestFunctionQuery, TestXIncludeConfig, HardAutoCommitTest, DocValuesMultiTest, TestDefaultStatsCache, SolrRequestParserTest, RecoveryZkTest, UpdateParamsTest, TestSolrDeletionPolicy1, TestDFRSimilarityFactory, TestFastWriter, PathHierarchyTokenizerFactoryTest, TestDynamicLoading, TestElisionMultitermQuery, PolyFieldTest, UnloadDistributedZkTest, TestJsonRequest, TestRuleBasedAuthorizationPlugin, TestManagedStopFilterFactory, TestRawResponseWriter, IndexSchemaTest, TestEmbeddedSolrServerConstructors, InfoHandlerTest, AlternateDirectoryTest, LeaderElectionTest, JsonLoaderTest, TestCoreContainer, DirectSolrSpellCheckerTest, RequestLoggingTest, ZkNodePropsTest, TermsComponentTest, TestConfig, TestFieldTypeCollectionResource, XsltUpdateRequestHandlerTest, TestManagedSchemaFieldResource, TestSchemaResource, DataDrivenBlockJoinTest, TestExactStatsCache, TestConfigSetProperties, DeleteLastCustomShardedReplicaTest, TestAnalyzedSuggestions, DirectUpdateHandlerTest, ExternalFileFieldSortTest, TestIBSimilarityFactory, TestMissingGroups, ClusterStateUpdateTest, ActionThrottleTest, QueryElevationComponentTest, DocValuesTest, QueryResultKeyTest, TestLRUCache, TestPhraseSuggestions, SimplePostToolTest, TriLevelCompositeIdRoutingTest, DistributedMLTComponentTest, CloudExitableDirectoryReaderTest, TestSolrCloudWithKerberosAlt, TestCodecSupport, TestConfigSets, PeerSyncTest, XmlUpdateRequestHandlerTest, SpatialHeatmapFacetsTest, SoftAutoCommitTest, TestSchemaNameResource, PreAnalyzedUpdateProcessorTest, TestJmxMonitoredMap, TestDistributedStatsComponentCardinality, TestManagedSynonymFilterFactory, JSONWriterTest, TestNRTOpen, ReplicationFactorTest, DOMUtilTest, SolrCoreTest, DocExpirationUpdateProcessorFactoryTest, FastVectorHighlighterTest, SuggesterFSTTest, TestExtendedDismaxParser, TestSolrConfigHandler, DocumentAnalysisRequestHandlerTest, DistributedFacetPivotSmallAdvancedTest, BlockDirectoryTest, TestQuerySenderNoQuery, TestHashPartitioner, DateFieldTest, SegmentsInfoRequestHandlerTest, TestFieldCollectionResource, RecoveryAfterSoftCommitTest, TestMergePolicyConfig, TestFieldSortValues, SecurityConfHandlerTest, TestStressReorder, BufferStoreTest, TestRandomRequestDistribution, HdfsBasicDistributedZkTest, TestCloudManagedSchemaConcurrent, TestReplicaProperties, DisMaxRequestHandlerTest, TestMacros, TestStressLucene, TestReloadAndDeleteDocs, BasicAuthIntegrationTest, TestDocSet, BasicDistributedZkTest, DistributedQueryElevationComponentTest, TestGroupingSearch, TestObjectReleaseTracker, MoreLikeThisHandlerTest, OverseerTest, TestFaceting, TestUpdate, TestClassNameShortening, TestRestManager, SyncSliceTest, ShardRoutingTest, ZkSolrClientTest, TestZkChroot, TestRandomDVFaceting, ShardRoutingCustomTest, TestDistributedGrouping, DistributedSpellCheckComponentTest, ZkControllerTest, TestRealTimeGet, TestReload, DistributedTermsComponentTest, TestRangeQuery, SimpleFacetsTest, TestSolr4Spatial, StatsComponentTest, SolrCmdDistributorTest, TestSort, CurrencyFieldXmlFileTest, AnalysisAfterCoreReloadTest, TestFoldingMultitermQuery, SuggesterTSTTest, TestCSVLoader, SchemaVersionSpecificBehaviorTest, SolrCoreCheckLockOnStartupTest, DirectUpdateHandlerOptimizeTest, StatelessScriptUpdateProcessorFactoryTest, DistanceFunctionTest, IndexBasedSpellCheckerTest, StandardRequestHandlerTest, TestOmitPositions, DocumentBuilderTest, RequiredFieldsTest, TestArbitraryIndexDir, LoggingHandlerTest, ReturnFieldsTest, MBeansHandlerTest, UniqFieldsUpdateProcessorFactoryTest, PingRequestHandlerTest, TestComponentsName, TestLFUCache, PreAnalyzedFieldTest, TestSystemIdResolver, SpellingQueryConverterTest, TestUtils, TestDocumentBuilder, SliceStateTest, SystemInfoHandlerTest, UUIDFieldTest, FileUtilsTest, CircularListTest, TestRTGBase, CursorPagingTest, DistributedIntervalFacetingTest, TestDistributedMissingSort, TestSimpleTrackingShardHandler, AsyncMigrateRouteKeyTest, DeleteInactiveReplicaTest, DistribDocExpirationUpdateProcessorTest, LeaderFailoverAfterPartitionTest, OverriddenZkACLAndCredentialsProvidersTest, OverseerCollectionConfigSetProcessorTest, OverseerRolesTest, OverseerTaskQueueTest, SSLMigrationTest, SaslZkACLProviderTest, SimpleCollectionCreateDeleteTest, TestAuthenticationFramework, TestCloudInspectUtil, TestCollectionAPI, TestMiniSolrCloudClusterSSL, TestRebalanceLeaders, TestRequestStatusCollectionAPI, HdfsBasicDistributedZk2Test, HdfsChaosMonkeySafeLeaderTest, HdfsCollectionsAPIDistributedZkTest, HdfsNNFailoverTest]
> >    [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=HdfsNNFailoverTest -Dtests.seed=5D8F351977870E3F -Dtests.slow=true -Dtests.locale=mk_MK -Dtests.timezone=Asia/Shanghai -Dtests.asserts=true -Dtests.file.encoding=UTF-8
> >    [junit4] ERROR   0.00s J0 | HdfsNNFailoverTest (suite) <<<
> >    [junit4]    > Throwable #1: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space
> >    [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
> >    [junit4]    > 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
> >    [junit4]    > 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
> >    [junit4]    > 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> >    [junit4]    > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
> >    [junit4]    > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
> >    [junit4]    > 	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
> >    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
> >    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
> >    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
> >    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> >    [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
> >    [junit4]    > Caused by: java.io.IOException: error=12, Not enough space
> >    [junit4]    > 	at java.lang.UNIXProcess.forkAndExec(Native Method)
> >    [junit4]    > 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
> >    [junit4]    > 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
> >    [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
> >    [junit4]    > 	... 44 more
> >    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> >    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:620)
> >    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
> >    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
> >    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> >    [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
> >    [junit4] Completed [426/536] on J0 in 45.66s, 0 tests, 1 error <<< FAILURES!
> >
> > [...truncated 300 lines...]
> >    [junit4] Suite: org.apache.solr.store.hdfs.HdfsDirectoryTest
> >    [junit4]   2> Creating dataDir: /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1/temp/solr.store.hdfs.HdfsDirectoryTest_5D8F351977870E3F-001/init-core-data-001
> >    [junit4]   2> 3147821 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false)
> >    [junit4]   1> Formatting using clusterid: testClusterID
> >    [junit4]   2> 3147964 WARN  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.i.MetricsConfig Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
> >    [junit4]   2> 3147974 WARN  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
> >    [junit4]   2> 3147976 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
> >    [junit4]   2> 3147994 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/hdfs to ./temp/Jetty_solaris.vm_46547_hdfs____.vwfmpk/webapp
> >    [junit4]   2> 3148170 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
> >    [junit4]   2> 3148982 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:46547
> >    [junit4]   2> 3157264 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
> >    [junit4]   2> 3157403 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
> >    [junit4]   2> Aug 29, 2015 9:01:07 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
> >    [junit4]   2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
> >    [junit4]   2> Aug 29, 2015 9:01:27 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
> >    [junit4]   2> SEVERE: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:
> >    [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> >    [junit4]   2>         at java.lang.Object.wait(Native Method)
> >    [junit4]   2>         at java.lang.Object.wait(Object.java:503)
> >    [junit4]   2>         at java.util.TimerThread.mainLoop(Timer.java:526)
> >    [junit4]   2>         at java.util.TimerThread.run(Timer.java:505)
> >    [junit4]   2> Aug 29, 2015 9:01:27 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
> >    [junit4]   2> INFO: Starting to interrupt leaked threads:
> >    [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> >    [junit4]   2> Aug 29, 2015 9:01:30 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
> >    [junit4]   2> SEVERE: There are still zombie threads that couldn't be terminated:
> >    [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> >    [junit4]   2>         at java.lang.Object.wait(Native Method)
> >    [junit4]   2>         at java.lang.Object.wait(Object.java:503)
> >    [junit4]   2>         at java.util.TimerThread.mainLoop(Timer.java:526)
> >    [junit4]   2>         at java.util.TimerThread.run(Timer.java:505)
> >    [junit4]   2> NOTE: test params are: codec=Asserting(Lucene53): {}, docValues:{}, sim=DefaultSimilarity, locale=es_BO, timezone=Antarctica/South_Pole
> >    [junit4]   2> NOTE: SunOS 5.11 x86/Oracle Corporation 1.7.0_85 (32-bit)/cpus=3,threads=2,free=136627544,total=518979584
> >    [junit4]   2> NOTE: All tests run in this JVM: [TestIndexingPerformance,
> > TestCSVResponseWriter, DistributedQueryComponentCustomSortTest,
> > DirectSolrConnectionTest, FullSolrCloudDistribCmdsTest,
> > TestShardHandlerFactory, CacheHeaderTest, BasicZkTest, TestTrie,
> > FieldAnalysisRequestHandlerTest, PKIAuthenticationIntegrationTest,
> > OpenCloseCoreStressTest, TestSuggestSpellingConverter, StressHdfsTest,
> > CleanupOldIndexTest, DistributedExpandComponentTest,
> > TestHdfsUpdateLog, TestSolrXml, TestAddFieldRealTimeGet,
> TestJsonFacets,
> > DistributedSuggestComponentTest,
> > OutOfBoxZkACLAndCredentialsProvidersTest, AnalyticsMergeStrategyTest,
> > HLLUtilTest, ResponseHeaderTest, SearchHandlerTest,
> > BinaryUpdateRequestHandlerTest, DistributedFacetPivotWhiteBoxTest,
> > ConnectionManagerTest, SpellCheckComponentTest,
> > TestScoreJoinQPNoScore, SolrTestCaseJ4Test, SolrIndexSplitterTest,
> > TestConfigSetsAPI, TestDefaultSearchFieldResource, TestCryptoKeys,
> > TestNonDefinedSimilarityFactory, TestCoreDiscovery, RollingRestartTest,
> > SolrInfoMBeanTest, CustomCollectionTest, DistributedVersionInfoTest,
> > ClusterStateTest, TestReversedWildcardFilterFactory, SolrXmlInZkTest,
> > DistributedFacetPivotLongTailTest, URLClassifyProcessorTest,
> > TestLMJelinekMercerSimilarityFactory, RequestHandlersTest,
> > RemoteQueryErrorTest, LeaderElectionIntegrationTest,
> > SharedFSAutoReplicaFailoverTest, TestBadConfig,
> > SignatureUpdateProcessorFactoryTest,
> TestCursorMarkWithoutUniqueKey,
> > TestCrossCoreJoin, SparseHLLTest, DistributedQueueTest,
> > BigEndianAscendingWordSerializerTest, TestBM25SimilarityFactory,
> > AutoCommitTest, DateMathParserTest, BasicFunctionalityTest,
> > SuggesterWFSTTest, TestCollapseQParserPlugin, TestManagedResource,
> > TestSha256AuthenticationProvider, CollectionTooManyReplicasTest,
> > BadCopyFieldTest, TestDownShardTolerantSearch, CloudMLTQParserTest,
> > NotRequiredUniqueKeyTest, TestAnalyzeInfixSuggestions,
> > ExitableDirectoryReaderTest, TestScoreJoinQPScore, DeleteShardTest,
> > RankQueryTest, TestSchemaManager,
> UpdateRequestProcessorFactoryTest,
> > CursorMarkTest, DistributedDebugComponentTest, DeleteReplicaTest,
> > RAMDirectoryFactoryTest, ConcurrentDeleteAndCreateCollectionTest,
> > TestQueryTypes, OutputWriterTest, TestSchemaSimilarityResource,
> > HighlighterMaxOffsetTest, ResponseLogComponentTest,
> > TestCloudPivotFacet, DocValuesMissingTest,
> > FieldMutatingUpdateProcessorTest, HttpPartitionTest, TestCollationField,
> > ZkStateWriterTest, TestQuerySenderListener, AtomicUpdatesTest,
> > TestStressRecovery, TestRandomFaceting,
> > SharedFSAutoReplicaFailoverUtilsTest, CoreAdminHandlerTest,
> > HighlighterConfigTest, TestCustomSort, MultiTermTest,
> > VMParamsZkACLAndCredentialsProvidersTest,
> > IgnoreCommitOptimizeUpdateProcessorFactoryTest, CollectionReloadTest,
> > PrimUtilsTest, TestRecovery, TestWriterPerf,
> > AddSchemaFieldsUpdateProcessorFactoryTest, TimeZoneUtilsTest,
> > CurrencyFieldOpenExchangeTest, TestSolrCLIRunExample,
> > TestPHPSerializedResponseWriter, ChaosMonkeySafeLeaderTest,
> > TestIndexSearcher, EnumFieldTest, TestSolrIndexConfig,
> > TermVectorComponentDistributedTest, TestJoin, TestExpandComponent,
> > TestManagedResourceStorage, SortByFunctionTest,
> > TestDefaultSimilarityFactory, SuggesterTest, TestValueSourceCache,
> > SolrPluginUtilsTest, TermVectorComponentTest, TestFiltering,
> > TestQueryUtils, FileBasedSpellCheckerTest, BasicDistributedZk2Test,
> > CollectionsAPIDistributedZkTest, TestReplicationHandler,
> > TestDistributedSearch, BadIndexSchemaTest, ConvertedLegacyTest,
> > HighlighterTest, ShowFileRequestHandlerTest, SpellCheckCollatorTest,
> > SpatialFilterTest, NoCacheHeaderTest, WordBreakSolrSpellCheckerTest,
> > TestPseudoReturnFields, TestAtomicUpdateErrorCases,
> > TestWordDelimiterFilterFactory, DefaultValueUpdateProcessorTest,
> > TestRemoteStreaming, DebugComponentTest, TestSurroundQueryParser,
> > LukeRequestHandlerTest, TestSolrQueryParser,
> > IndexSchemaRuntimeFieldTest, RegexBoostProcessorTest,
> > TestJmxIntegration, QueryParsingTest, TestPartialUpdateDeduplication,
> > CSVRequestHandlerTest, TestBinaryResponseWriter, SOLR749Test,
> > CopyFieldTest, BadComponentTest, TestSolrDeletionPolicy2, SampleTest,
> > TestBinaryField, TestSearchPerf, NumericFieldsTest, MinimalSchemaTest,
> > TestFuzzyAnalyzedSuggestions, TestSolrCoreProperties,
> > TestPostingsSolrHighlighter, TestLuceneMatchVersion,
> > SpellPossibilityIteratorTest, TestCharFilters, SynonymTokenizerTest,
> > EchoParamsTest, TestSweetSpotSimilarityFactory, TestPerFieldSimilarity,
> > TestLMDirichletSimilarityFactory, ResourceLoaderTest,
> > TestFastOutputStream, ScriptEngineTest,
> > OpenExchangeRatesOrgProviderTest, PluginInfoTest, TestFastLRUCache,
> > ChaosMonkeyNothingIsSafeTest, TestHighlightDedupGrouping,
> > TestTolerantSearch, TestJettySolrRunner, AssignTest,
> > AsyncCallRequestStatusResponseTest, CollectionStateFormat2Test,
> > CollectionsAPIAsyncDistributedZkTest, DistribCursorPagingTest,
> > DistribJoinFromCollectionTest, LeaderInitiatedRecoveryOnCommitTest,
> > MigrateRouteKeyTest, OverseerStatusTest, ShardSplitTest,
> > TestConfigSetsAPIExclusivity, TestConfigSetsAPIZkFailure,
> > TestLeaderElectionZkExpiry, TestMiniSolrCloudCluster,
> > TestShortCircuitedRequests, HdfsRecoverLeaseTest,
> > CachingDirectoryFactoryTest, HdfsDirectoryFactoryTest, TestConfigOverlay,
> > TestConfigSetImmutable, TestImplicitCoreProperties,
> > TestInfoStreamLogging, TestInitParams, TestSolrDynamicMBean,
> > TestBlobHandler, TestConfigReload, TestReplicationHandlerBackup,
> > TestSolrConfigHandlerConcurrent, CoreAdminCreateDiscoverTest,
> > CoreAdminRequestStatusTest, CoreMergeIndexesAdminHandlerTest,
> > DistributedFacetPivotLargeTest, DistributedFacetPivotSmallTest,
> > FacetPivotSmallTest, SuggestComponentTest, JavabinLoaderTest,
> > SmileWriterTest, TestIntervalFaceting, TestChildDocTransformer,
> > TestCustomDocTransformer, TestSortingResponseWriter,
> > TestBulkSchemaAPI, TestFieldResource,
> > TestManagedSchemaDynamicFieldResource, TestBulkSchemaConcurrent,
> > TestCloudSchemaless, TestReloadDeadlock, TestSearcherReuse,
> > TestSimpleQParserPlugin, TestSmileRequest, TestSolr4Spatial2,
> > TestStandardQParsers, TestStressUserVersions, TestTrieFacet,
> > TestMinMaxOnMultiValuedField, TestOrdValues,
> > TestSortByMinMaxFunction, SimpleMLTQParserTest, TestDistribIDF,
> > TestExactSharedStatsCache, TestPKIAuthenticationPlugin,
> > TestBlendedInfixSuggestions, TestFileDictionaryLookup,
> > TestFreeTextSuggestions, TestHighFrequencyDictionaryFactory,
> > BlockCacheTest, HdfsDirectoryTest]
> >    [junit4]   2> NOTE: reproduce with: ant test -Dtestcase=HdfsDirectoryTest -Dtests.seed=5D8F351977870E3F -Dtests.slow=true -Dtests.locale=es_BO -Dtests.timezone=Antarctica/South_Pole -Dtests.asserts=true -Dtests.file.encoding=UTF-8
> >    [junit4] ERROR   0.00s J1 | HdfsDirectoryTest (suite) <<<
> >    [junit4]    > Throwable #1: java.security.AccessControlException: access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
> >    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> >    [junit4]    > 	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:395)
> >    [junit4]    > 	at java.security.AccessController.checkPermission(AccessController.java:559)
> >    [junit4]    > 	at java.lang.SecurityManager.checkPermission(SecurityManager.java:549)
> >    [junit4]    > 	at java.lang.SecurityManager.checkWrite(SecurityManager.java:979)
> >    [junit4]    > 	at java.io.File.canWrite(File.java:785)
> >    [junit4]    > 	at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:1002)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.createPermissionsDiagnosisString(MiniDFSCluster.java:856)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:812)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> >    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> >    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:60)
> >    [junit4]    > 	at org.apache.solr.store.hdfs.HdfsDirectoryTest.beforeClass(HdfsDirectoryTest.java:62)
> >    [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
> >    [junit4]    > Throwable #2: com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:
> >    [junit4]    >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> >    [junit4]    >         at java.lang.Object.wait(Native Method)
> >    [junit4]    >         at java.lang.Object.wait(Object.java:503)
> >    [junit4]    >         at java.util.TimerThread.mainLoop(Timer.java:526)
> >    [junit4]    >         at java.util.TimerThread.run(Timer.java:505)
> >    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> >    [junit4]    > Throwable #3: com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
> >    [junit4]    >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
> >    [junit4]    >         at java.lang.Object.wait(Native Method)
> >    [junit4]    >         at java.lang.Object.wait(Object.java:503)
> >    [junit4]    >         at java.util.TimerThread.mainLoop(Timer.java:526)
> >    [junit4]    >         at java.util.TimerThread.run(Timer.java:505)
> >    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> >    [junit4] Completed [521/536] on J1 in 33.05s, 0 tests, 3 errors <<< FAILURES!
> >
> > [...truncated 64 lines...]
> > BUILD FAILED
> > /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:785: The following error occurred while executing this line:
> > /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:729: The following error occurred while executing this line:
> > /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:59: The following error occurred while executing this line:
> > /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build.xml:233: The following error occurred while executing this line:
> > /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/common-build.xml:524: The following error occurred while executing this line:
> > /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/common-build.xml:1452: The following error occurred while executing this line:
> > /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/common-build.xml:1006: There were test failures: 536 suites, 2123 tests, 4 suite-level errors, 108 ignored (34 assumptions)
> >
> > Total time: 77 minutes 51 seconds
> > Build step 'Invoke Ant' marked build as failure
> > Archiving artifacts
> > [WARNINGS] Skipping publisher since build result is FAILURE
> > Recording test results
> > Email was triggered for: Failure - Any
> > Sending email for trigger: Failure - Any
> >
> 
> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
> For additional commands, e-mail: dev-help@lucene.apache.org




RE: [JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 9 - Still Failing!

Posted by Uwe Schindler <uw...@thetaphi.de>.
I am still digging... On Solaris there seems to be a general problem forking 32-bit processes.
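For context, every failure above is thrown by ProcessBuilder.start(): on Solaris, spawning a child via fork() requires reserving swap for a full copy of the parent's committed address space, so a test JVM with a large heap can fail with errno=12 ("Not enough space") even when launching a tiny child like /bin/ls. Below is a minimal stand-alone sketch of that failure mode, not the tests' actual code (the class name and printed messages are mine; the java binary path is taken from the running JVM):

```java
import java.io.IOException;

public class ForkProbe {
    public static void main(String[] args) throws InterruptedException {
        // Launch a second java the same way the failing tests do.
        // ProcessBuilder.start() goes through fork()/exec() on Solaris,
        // and fork() must reserve swap for the parent's whole footprint.
        ProcessBuilder pb = new ProcessBuilder(
                System.getProperty("java.home") + "/bin/java", "-version");
        pb.inheritIO(); // forward the child's output to our console
        try {
            Process child = pb.start(); // the call that throws on the Solaris VM
            System.out.println("child exited with " + child.waitFor());
        } catch (IOException e) {
            // On the failing machine this prints something like:
            // "Cannot run program ...: error=12, Not enough space"
            System.err.println("fork failed: " + e.getMessage());
        }
    }
}
```

On a machine with enough swap this just reports the child's exit code; on an overcommitted box the start() call itself throws the IOException seen in the logs, which is why adding swap (or shrinking the parent JVM) is the usual fix.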

Uwe

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: uwe@thetaphi.de

> -----Original Message-----
> From: Policeman Jenkins Server [mailto:jenkins@thetaphi.de]
> Sent: Saturday, August 29, 2015 11:02 PM
> To: shalin@apache.org; mikemccand@apache.org; dev@lucene.apache.org
> Subject: [JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 9 -
> Still Failing!
> 
> Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Solaris/9/
> Java: multiarch/jdk1.7.0 -d32 -server -XX:+UseConcMarkSweepGC
> 
> 4 tests failed.
> FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.hdfs.HdfsNNFailoverTest
> 
> Error Message:
> Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space
> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
> 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
> 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
> 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
> 	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
> 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
> 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
> 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
> 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.io.IOException: error=12, Not enough space
> 	at java.lang.UNIXProcess.forkAndExec(Native Method)
> 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
> 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
> 	... 44 more
> 
> Stack Trace:
> java.lang.RuntimeException: Error while running command to get file
> permissions : java.io.IOException: Cannot run program "/bin/ls": error=12,
> Not enough space
> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
> 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
> 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
> 	at
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715
> )
> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
> 	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
> 	at
> org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.lo
> adPermissionInfo(RawLocalFileSystem.java:582)
> 	at
> org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.g
> etPermission(RawLocalFileSystem.java:557)
> 	at
> org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(
> DiskChecker.java:139)
> 	at
> org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> 	at
> org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.
> checkDir(DataNode.java:2239)
> 	at
> org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(
> DataNode.java:2281)
> 	at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNo
> de.java:2263)
> 	at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(D
> ataNode.java:2155)
> 	at
> org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.jav
> a:1443)
> 	at
> org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.j
> ava:828)
> 	at
> org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> 	at
> org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> 	at
> org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> 	at
> org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverT
> est.java:44)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.j
> ava:57)
> 	at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces
> sorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at
> com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(Randomize
> dRunner.java:1627)
> 	at
> com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(Rando
> mizedRunner.java:776)
> 	at
> com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(Rando
> mizedRunner.java:792)
> 	at
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> ementAdapter.java:36)
> 	at
> com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.
> evaluate(SystemPropertiesRestoreRule.java:57)
> 	at
> org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeA
> fterRule.java:46)
> 	at
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> ementAdapter.java:36)
> 	at
> org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreCl
> assName.java:42)
> 	at
> com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMet
> hodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> 	at
> com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMet
> hodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> 	at
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> ementAdapter.java:36)
> 	at
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> ementAdapter.java:36)
> 	at
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> ementAdapter.java:36)
> 	at
> org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAss
> ertionsRequired.java:54)
> 	at
> org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure
> .java:48)
> 	at
> org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRule
> IgnoreAfterMaxFailures.java:65)
> 	at
> org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnore
> TestSuites.java:55)
> 	at
> com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(Stat
> ementAdapter.java:36)
> 	at
> com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.
> run(ThreadLeakControl.java:365)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.io.IOException: error=12, Not enough space
> 	at java.lang.UNIXProcess.forkAndExec(Native Method)
> 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
> 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
> 	... 44 more
> 
> 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:620)
> 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
> 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
> 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
> 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> 	at java.lang.Thread.run(Thread.java:745)
> 
> 
> FAILED:  junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest
> 
> Error Message:
> access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
> 
> Stack Trace:
> java.security.AccessControlException: access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
> 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> 	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:395)
> 	at java.security.AccessController.checkPermission(AccessController.java:559)
> 	at java.lang.SecurityManager.checkPermission(SecurityManager.java:549)
> 	at java.lang.SecurityManager.checkWrite(SecurityManager.java:979)
> 	at java.io.File.canWrite(File.java:785)
> 	at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:1002)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.createPermissionsDiagnosisString(MiniDFSCluster.java:856)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:812)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:60)
> 	at org.apache.solr.store.hdfs.HdfsDirectoryTest.beforeClass(HdfsDirectoryTest.java:62)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
> 	at java.lang.Thread.run(Thread.java:745)
> 
> 
> FAILED:  junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest
> 
> Error Message:
> 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:     1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]         at java.lang.Object.wait(Native Method)         at java.lang.Object.wait(Object.java:503)         at java.util.TimerThread.mainLoop(Timer.java:526)         at java.util.TimerThread.run(Timer.java:505)
> 
> Stack Trace:
> com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:
>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
>         at java.lang.Object.wait(Native Method)
>         at java.lang.Object.wait(Object.java:503)
>         at java.util.TimerThread.mainLoop(Timer.java:526)
>         at java.util.TimerThread.run(Timer.java:505)
> 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> 
> 
> FAILED:  junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest
> 
> Error Message:
> There are still zombie threads that couldn't be terminated:    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]         at java.lang.Object.wait(Native Method)         at java.lang.Object.wait(Object.java:503)         at java.util.TimerThread.mainLoop(Timer.java:526)         at java.util.TimerThread.run(Timer.java:505)
> 
> Stack Trace:
> com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
>         at java.lang.Object.wait(Native Method)
>         at java.lang.Object.wait(Object.java:503)
>         at java.util.TimerThread.mainLoop(Timer.java:526)
>         at java.util.TimerThread.run(Timer.java:505)
> 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
> 
> 
> 
> 
> Build Log:
> [...truncated 10577 lines...]
>    [junit4] Suite: org.apache.solr.cloud.hdfs.HdfsNNFailoverTest
>    [junit4]   2> Creating dataDir: /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/init-core-data-001
>    [junit4]   2> 2599844 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
>    [junit4]   2> 2616331 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>    [junit4]   1> Formatting using clusterid: testClusterID
>    [junit4]   2> 2617524 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.i.MetricsConfig Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
>    [junit4]   2> 2617755 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
>    [junit4]   2> 2617771 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
>    [junit4]   2> 2617878 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
>    [junit4]   2> 2617942 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/hdfs to ./temp/Jetty_solaris.vm_35231_hdfs____thayv4/webapp
>    [junit4]   2> 2618129 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
>    [junit4]   2> 2619464 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:35231
>    [junit4]   2> 2637264 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.s.d.DataNode Invalid dfs.datanode.data.dir /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2 :
>    [junit4]   2> java.io.IOException: Cannot run program "chmod": error=12, Not enough space
>    [junit4]   2> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
>    [junit4]   2> 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
>    [junit4]   2> 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
>    [junit4]   2> 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
>    [junit4]   2> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
>    [junit4]   2> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
>    [junit4]   2> 	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:656)
>    [junit4]   2> 	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:490)
>    [junit4]   2> 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:140)
>    [junit4]   2> 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
>    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
>    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
>    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>    [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
>    [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
>    [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
>    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
>    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
>    [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
>    [junit4]   2> Caused by: java.io.IOException: error=12, Not enough space
>    [junit4]   2> 	at java.lang.UNIXProcess.forkAndExec(Native Method)
>    [junit4]   2> 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
>    [junit4]   2> 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
>    [junit4]   2> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
>    [junit4]   2> 	... 43 more
>    [junit4]   2> 2637287 WARN  (org.apache.hadoop.util.JvmPauseMonitor$Monitor@be51b7) [    ] o.a.h.u.JvmPauseMonitor Detected pause in JVM or host machine (eg GC): pause of approximately 15969ms
>    [junit4]   2> No GCs detected
>    [junit4]   2> 2637368 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
>    [junit4]   2> 2637384 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
>    [junit4]   2> 2637422 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/datanode to ./temp/Jetty_solaris.vm_49465_datanode____96t731/webapp
>    [junit4]   2> 2637655 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
>    [junit4]   2> 2638756 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:49465
>    [junit4]   2> 2645079 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
>    [junit4]   2> 2645234 ERROR (DataNode: [[[DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data1/, [DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2/]]  heartbeating to solaris-vm/127.0.0.1:61051) [    ] o.a.h.h.s.d.DataNode Initialization failed for Block pool <registering> (Datanode Uuid unassigned) service to solaris-vm/127.0.0.1:61051. Exiting.
>    [junit4]   2> java.io.IOException: DN shut down before block pool connected
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.retrieveNamespaceInfo(BPServiceActor.java:185)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:215)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:828)
>    [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
>    [junit4]   2> 2645236 WARN  (DataNode: [[[DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data1/, [DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2/]]  heartbeating to solaris-vm/127.0.0.1:61051) [    ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool <registering> (Datanode Uuid unassigned) service to solaris-vm/127.0.0.1:61051
>    [junit4]   2> 2645259 WARN  (org.apache.hadoop.hdfs.server.blockmanagement.DecommissionManager$Monitor@7b7964) [    ] o.a.h.h.s.b.DecommissionManager Monitor interrupted: java.lang.InterruptedException: sleep interrupted
>    [junit4]   2> 2645314 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
>    [junit4]   2> 2645418 ERROR (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.l.MethodMetric Error invoking method getBlocksTotal
>    [junit4]   2> java.lang.reflect.InvocationTargetException
>    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>    [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MethodMetric$2.snapshot(MethodMetric.java:111)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MethodMetric.snapshot(MethodMetric.java:144)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MetricsRegistry.snapshot(MetricsRegistry.java:387)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder$1.getMetrics(MetricsSourceBuilder.java:79)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.getMetrics(MetricsSourceAdapter.java:195)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateJmxCache(MetricsSourceAdapter.java:172)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.getMBeanInfo(MetricsSourceAdapter.java:151)
>    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getClassName(DefaultMBeanServerInterceptor.java:1804)
>    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.safeGetClassName(DefaultMBeanServerInterceptor.java:1595)
>    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanPermission(DefaultMBeanServerInterceptor.java:1813)
>    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:430)
>    [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:415)
>    [junit4]   2> 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:546)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:81)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.stopMBeans(MetricsSourceAdapter.java:227)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.stop(MetricsSourceAdapter.java:212)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.stopSources(MetricsSystemImpl.java:461)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.stop(MetricsSystemImpl.java:212)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.shutdown(MetricsSystemImpl.java:592)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.shutdownInstance(DefaultMetricsSystem.java:72)
>    [junit4]   2> 	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.shutdown(DefaultMetricsSystem.java:68)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.metrics.NameNodeMetrics.shutdown(NameNodeMetrics.java:145)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:822)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1720)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1699)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:838)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
>    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
>    [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
>    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>    [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
>    [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
>    [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
>    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
>    [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>    [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
>    [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
>    [junit4]   2> Caused by: java.lang.NullPointerException
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.blockmanagement.BlocksMap.size(BlocksMap.java:198)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.getTotalBlocks(BlockManager.java:3291)
>    [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlocksTotal(FSNamesystem.java:6223)
>    [junit4]   2> 	... 58 more
>    [junit4]   2> 2645432 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
>    [junit4]   2> NOTE: test params are: codec=Asserting(Lucene53), sim=RandomSimilarityProvider(queryNorm=true,coord=yes): {}, locale=mk_MK, timezone=Asia/Shanghai
>    [junit4]   2> NOTE: SunOS 5.11 x86/Oracle Corporation 1.7.0_85 (32-bit)/cpus=3,threads=1,free=99794816,total=518979584
>    [junit4]   2> NOTE: All tests run in this JVM: [SolrCloudExampleTest,
> TestStressVersions, TestSerializedLuceneMatchVersion, TestSolrJ,
> DistanceUnitsTest, MultiThreadedOCPTest, TestDistribDocBasedVersion,
> BJQParserTest, ZkCLITest, QueryEqualityTest, PrimitiveFieldTypeTest,
> DistributedQueryComponentOptimizationTest, AliasIntegrationTest,
> TestInitQParser, TestAuthorizationFramework, TestLazyCores,
> SolrIndexConfigTest, TestFunctionQuery, TestXIncludeConfig,
> HardAutoCommitTest, DocValuesMultiTest, TestDefaultStatsCache,
> SolrRequestParserTest, RecoveryZkTest, UpdateParamsTest,
> TestSolrDeletionPolicy1, TestDFRSimilarityFactory, TestFastWriter,
> PathHierarchyTokenizerFactoryTest, TestDynamicLoading,
> TestElisionMultitermQuery, PolyFieldTest, UnloadDistributedZkTest,
> TestJsonRequest, TestRuleBasedAuthorizationPlugin,
> TestManagedStopFilterFactory, TestRawResponseWriter, IndexSchemaTest,
> TestEmbeddedSolrServerConstructors, InfoHandlerTest,
> AlternateDirectoryTest, LeaderElectionTest, JsonLoaderTest,
> TestCoreContainer, DirectSolrSpellCheckerTest, RequestLoggingTest,
> ZkNodePropsTest, TermsComponentTest, TestConfig,
> TestFieldTypeCollectionResource, XsltUpdateRequestHandlerTest,
> TestManagedSchemaFieldResource, TestSchemaResource,
> DataDrivenBlockJoinTest, TestExactStatsCache, TestConfigSetProperties,
> DeleteLastCustomShardedReplicaTest, TestAnalyzedSuggestions,
> DirectUpdateHandlerTest, ExternalFileFieldSortTest, TestIBSimilarityFactory,
> TestMissingGroups, ClusterStateUpdateTest, ActionThrottleTest,
> QueryElevationComponentTest, DocValuesTest, QueryResultKeyTest,
> TestLRUCache, TestPhraseSuggestions, SimplePostToolTest,
> TriLevelCompositeIdRoutingTest, DistributedMLTComponentTest,
> CloudExitableDirectoryReaderTest, TestSolrCloudWithKerberosAlt,
> TestCodecSupport, TestConfigSets, PeerSyncTest,
> XmlUpdateRequestHandlerTest, SpatialHeatmapFacetsTest,
> SoftAutoCommitTest, TestSchemaNameResource,
> PreAnalyzedUpdateProcessorTest, TestJmxMonitoredMap,
> TestDistributedStatsComponentCardinality,
> TestManagedSynonymFilterFactory, JSONWriterTest, TestNRTOpen,
> ReplicationFactorTest, DOMUtilTest, SolrCoreTest,
> DocExpirationUpdateProcessorFactoryTest, FastVectorHighlighterTest,
> SuggesterFSTTest, TestExtendedDismaxParser, TestSolrConfigHandler,
> DocumentAnalysisRequestHandlerTest,
> DistributedFacetPivotSmallAdvancedTest, BlockDirectoryTest,
> TestQuerySenderNoQuery, TestHashPartitioner, DateFieldTest,
> SegmentsInfoRequestHandlerTest, TestFieldCollectionResource,
> RecoveryAfterSoftCommitTest, TestMergePolicyConfig, TestFieldSortValues,
> SecurityConfHandlerTest, TestStressReorder, BufferStoreTest,
> TestRandomRequestDistribution, HdfsBasicDistributedZkTest,
> TestCloudManagedSchemaConcurrent, TestReplicaProperties,
> DisMaxRequestHandlerTest, TestMacros, TestStressLucene,
> TestReloadAndDeleteDocs, BasicAuthIntegrationTest, TestDocSet,
> BasicDistributedZkTest, DistributedQueryElevationComponentTest,
> TestGroupingSearch, TestObjectReleaseTracker, MoreLikeThisHandlerTest,
> OverseerTest, TestFaceting, TestUpdate, TestClassNameShortening,
> TestRestManager, SyncSliceTest, ShardRoutingTest, ZkSolrClientTest,
> TestZkChroot, TestRandomDVFaceting, ShardRoutingCustomTest,
> TestDistributedGrouping, DistributedSpellCheckComponentTest,
> ZkControllerTest, TestRealTimeGet, TestReload,
> DistributedTermsComponentTest, TestRangeQuery, SimpleFacetsTest,
> TestSolr4Spatial, StatsComponentTest, SolrCmdDistributorTest, TestSort,
> CurrencyFieldXmlFileTest, AnalysisAfterCoreReloadTest,
> TestFoldingMultitermQuery, SuggesterTSTTest, TestCSVLoader,
> SchemaVersionSpecificBehaviorTest, SolrCoreCheckLockOnStartupTest,
> DirectUpdateHandlerOptimizeTest,
> StatelessScriptUpdateProcessorFactoryTest, DistanceFunctionTest,
> IndexBasedSpellCheckerTest, StandardRequestHandlerTest,
> TestOmitPositions, DocumentBuilderTest, RequiredFieldsTest,
> TestArbitraryIndexDir, LoggingHandlerTest, ReturnFieldsTest,
> MBeansHandlerTest, UniqFieldsUpdateProcessorFactoryTest,
> PingRequestHandlerTest, TestComponentsName, TestLFUCache,
> PreAnalyzedFieldTest, TestSystemIdResolver, SpellingQueryConverterTest,
> TestUtils, TestDocumentBuilder, SliceStateTest, SystemInfoHandlerTest,
> UUIDFieldTest, FileUtilsTest, CircularListTest, TestRTGBase,
> CursorPagingTest, DistributedIntervalFacetingTest,
> TestDistributedMissingSort, TestSimpleTrackingShardHandler,
> AsyncMigrateRouteKeyTest, DeleteInactiveReplicaTest,
> DistribDocExpirationUpdateProcessorTest, LeaderFailoverAfterPartitionTest,
> OverriddenZkACLAndCredentialsProvidersTest,
> OverseerCollectionConfigSetProcessorTest, OverseerRolesTest,
> OverseerTaskQueueTest, SSLMigrationTest, SaslZkACLProviderTest,
> SimpleCollectionCreateDeleteTest, TestAuthenticationFramework,
> TestCloudInspectUtil, TestCollectionAPI, TestMiniSolrCloudClusterSSL,
> TestRebalanceLeaders, TestRequestStatusCollectionAPI,
> HdfsBasicDistributedZk2Test, HdfsChaosMonkeySafeLeaderTest,
> HdfsCollectionsAPIDistributedZkTest, HdfsNNFailoverTest]
>    [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=HdfsNNFailoverTest -Dtests.seed=5D8F351977870E3F -Dtests.slow=true -Dtests.locale=mk_MK -Dtests.timezone=Asia/Shanghai -Dtests.asserts=true -Dtests.file.encoding=UTF-8
>    [junit4] ERROR   0.00s J0 | HdfsNNFailoverTest (suite) <<<
>    [junit4]    > Throwable #1: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space
>    [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
>    [junit4]    > 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
>    [junit4]    > 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
>    [junit4]    > 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
>    [junit4]    > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
>    [junit4]    > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
>    [junit4]    > 	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
>    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
>    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
>    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
>    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
>    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
>    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
>    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
>    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
>    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
>    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
>    [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
>    [junit4]    > Caused by: java.io.IOException: error=12, Not enough space
>    [junit4]    > 	at java.lang.UNIXProcess.forkAndExec(Native Method)
>    [junit4]    > 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
>    [junit4]    > 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
>    [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
>    [junit4]    > 	... 44 more
>    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
>    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:620)
>    [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
>    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
>    [junit4]    > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
>    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
>    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
>    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
>    [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
>    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
>    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
>    [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
>    [junit4] Completed [426/536] on J0 in 45.66s, 0 tests, 1 error <<< FAILURES!
> 
> [...truncated 300 lines...]
>    [junit4] Suite: org.apache.solr.store.hdfs.HdfsDirectoryTest
>    [junit4]   2> Creating dataDir: /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1/temp/solr.store.hdfs.HdfsDirectoryTest_5D8F351977870E3F-001/init-core-data-001
>    [junit4]   2> 3147821 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false)
>    [junit4]   1> Formatting using clusterid: testClusterID
>    [junit4]   2> 3147964 WARN  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.i.MetricsConfig Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
>    [junit4]   2> 3147974 WARN  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
>    [junit4]   2> 3147976 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
>    [junit4]   2> 3147994 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/hdfs to ./temp/Jetty_solaris.vm_46547_hdfs____.vwfmpk/webapp
>    [junit4]   2> 3148170 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
>    [junit4]   2> 3148982 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:46547
>    [junit4]   2> 3157264 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
>    [junit4]   2> 3157403 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
>    [junit4]   2> Aug 29, 2015 9:01:07 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
>    [junit4]   2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
>    [junit4]   2> Aug 29, 2015 9:01:27 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
>    [junit4]   2> SEVERE: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:
>    [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
>    [junit4]   2>         at java.lang.Object.wait(Native Method)
>    [junit4]   2>         at java.lang.Object.wait(Object.java:503)
>    [junit4]   2>         at java.util.TimerThread.mainLoop(Timer.java:526)
>    [junit4]   2>         at java.util.TimerThread.run(Timer.java:505)
>    [junit4]   2> Aug 29, 2015 9:01:27 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
>    [junit4]   2> INFO: Starting to interrupt leaked threads:
>    [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
>    [junit4]   2> Aug 29, 2015 9:01:30 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
>    [junit4]   2> SEVERE: There are still zombie threads that couldn't be terminated:
>    [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
>    [junit4]   2>         at java.lang.Object.wait(Native Method)
>    [junit4]   2>         at java.lang.Object.wait(Object.java:503)
>    [junit4]   2>         at java.util.TimerThread.mainLoop(Timer.java:526)
>    [junit4]   2>         at java.util.TimerThread.run(Timer.java:505)
>    [junit4]   2> NOTE: test params are: codec=Asserting(Lucene53): {}, docValues:{}, sim=DefaultSimilarity, locale=es_BO, timezone=Antarctica/South_Pole
>    [junit4]   2> NOTE: SunOS 5.11 x86/Oracle Corporation 1.7.0_85 (32-bit)/cpus=3,threads=2,free=136627544,total=518979584
>    [junit4]   2> NOTE: All tests run in this JVM: [TestIndexingPerformance,
> TestCSVResponseWriter, DistributedQueryComponentCustomSortTest,
> DirectSolrConnectionTest, FullSolrCloudDistribCmdsTest,
> TestShardHandlerFactory, CacheHeaderTest, BasicZkTest, TestTrie,
> FieldAnalysisRequestHandlerTest, PKIAuthenticationIntegrationTest,
> OpenCloseCoreStressTest, TestSuggestSpellingConverter, StressHdfsTest,
> CleanupOldIndexTest, DistributedExpandComponentTest,
> TestHdfsUpdateLog, TestSolrXml, TestAddFieldRealTimeGet, TestJsonFacets,
> DistributedSuggestComponentTest,
> OutOfBoxZkACLAndCredentialsProvidersTest, AnalyticsMergeStrategyTest,
> HLLUtilTest, ResponseHeaderTest, SearchHandlerTest,
> BinaryUpdateRequestHandlerTest, DistributedFacetPivotWhiteBoxTest,
> ConnectionManagerTest, SpellCheckComponentTest,
> TestScoreJoinQPNoScore, SolrTestCaseJ4Test, SolrIndexSplitterTest,
> TestConfigSetsAPI, TestDefaultSearchFieldResource, TestCryptoKeys,
> TestNonDefinedSimilarityFactory, TestCoreDiscovery, RollingRestartTest,
> SolrInfoMBeanTest, CustomCollectionTest, DistributedVersionInfoTest,
> ClusterStateTest, TestReversedWildcardFilterFactory, SolrXmlInZkTest,
> DistributedFacetPivotLongTailTest, URLClassifyProcessorTest,
> TestLMJelinekMercerSimilarityFactory, RequestHandlersTest,
> RemoteQueryErrorTest, LeaderElectionIntegrationTest,
> SharedFSAutoReplicaFailoverTest, TestBadConfig,
> SignatureUpdateProcessorFactoryTest, TestCursorMarkWithoutUniqueKey,
> TestCrossCoreJoin, SparseHLLTest, DistributedQueueTest,
> BigEndianAscendingWordSerializerTest, TestBM25SimilarityFactory,
> AutoCommitTest, DateMathParserTest, BasicFunctionalityTest,
> SuggesterWFSTTest, TestCollapseQParserPlugin, TestManagedResource,
> TestSha256AuthenticationProvider, CollectionTooManyReplicasTest,
> BadCopyFieldTest, TestDownShardTolerantSearch, CloudMLTQParserTest,
> NotRequiredUniqueKeyTest, TestAnalyzeInfixSuggestions,
> ExitableDirectoryReaderTest, TestScoreJoinQPScore, DeleteShardTest,
> RankQueryTest, TestSchemaManager, UpdateRequestProcessorFactoryTest,
> CursorMarkTest, DistributedDebugComponentTest, DeleteReplicaTest,
> RAMDirectoryFactoryTest, ConcurrentDeleteAndCreateCollectionTest,
> TestQueryTypes, OutputWriterTest, TestSchemaSimilarityResource,
> HighlighterMaxOffsetTest, ResponseLogComponentTest,
> TestCloudPivotFacet, DocValuesMissingTest,
> FieldMutatingUpdateProcessorTest, HttpPartitionTest, TestCollationField,
> ZkStateWriterTest, TestQuerySenderListener, AtomicUpdatesTest,
> TestStressRecovery, TestRandomFaceting,
> SharedFSAutoReplicaFailoverUtilsTest, CoreAdminHandlerTest,
> HighlighterConfigTest, TestCustomSort, MultiTermTest,
> VMParamsZkACLAndCredentialsProvidersTest,
> IgnoreCommitOptimizeUpdateProcessorFactoryTest, CollectionReloadTest,
> PrimUtilsTest, TestRecovery, TestWriterPerf,
> AddSchemaFieldsUpdateProcessorFactoryTest, TimeZoneUtilsTest,
> CurrencyFieldOpenExchangeTest, TestSolrCLIRunExample,
> TestPHPSerializedResponseWriter, ChaosMonkeySafeLeaderTest,
> TestIndexSearcher, EnumFieldTest, TestSolrIndexConfig,
> TermVectorComponentDistributedTest, TestJoin, TestExpandComponent,
> TestManagedResourceStorage, SortByFunctionTest,
> TestDefaultSimilarityFactory, SuggesterTest, TestValueSourceCache,
> SolrPluginUtilsTest, TermVectorComponentTest, TestFiltering,
> TestQueryUtils, FileBasedSpellCheckerTest, BasicDistributedZk2Test,
> CollectionsAPIDistributedZkTest, TestReplicationHandler,
> TestDistributedSearch, BadIndexSchemaTest, ConvertedLegacyTest,
> HighlighterTest, ShowFileRequestHandlerTest, SpellCheckCollatorTest,
> SpatialFilterTest, NoCacheHeaderTest, WordBreakSolrSpellCheckerTest,
> TestPseudoReturnFields, TestAtomicUpdateErrorCases,
> TestWordDelimiterFilterFactory, DefaultValueUpdateProcessorTest,
> TestRemoteStreaming, DebugComponentTest, TestSurroundQueryParser,
> LukeRequestHandlerTest, TestSolrQueryParser,
> IndexSchemaRuntimeFieldTest, RegexBoostProcessorTest,
> TestJmxIntegration, QueryParsingTest, TestPartialUpdateDeduplication,
> CSVRequestHandlerTest, TestBinaryResponseWriter, SOLR749Test,
> CopyFieldTest, BadComponentTest, TestSolrDeletionPolicy2, SampleTest,
> TestBinaryField, TestSearchPerf, NumericFieldsTest, MinimalSchemaTest,
> TestFuzzyAnalyzedSuggestions, TestSolrCoreProperties,
> TestPostingsSolrHighlighter, TestLuceneMatchVersion,
> SpellPossibilityIteratorTest, TestCharFilters, SynonymTokenizerTest,
> EchoParamsTest, TestSweetSpotSimilarityFactory, TestPerFieldSimilarity,
> TestLMDirichletSimilarityFactory, ResourceLoaderTest,
> TestFastOutputStream, ScriptEngineTest,
> OpenExchangeRatesOrgProviderTest, PluginInfoTest, TestFastLRUCache,
> ChaosMonkeyNothingIsSafeTest, TestHighlightDedupGrouping,
> TestTolerantSearch, TestJettySolrRunner, AssignTest,
> AsyncCallRequestStatusResponseTest, CollectionStateFormat2Test,
> CollectionsAPIAsyncDistributedZkTest, DistribCursorPagingTest,
> DistribJoinFromCollectionTest, LeaderInitiatedRecoveryOnCommitTest,
> MigrateRouteKeyTest, OverseerStatusTest, ShardSplitTest,
> TestConfigSetsAPIExclusivity, TestConfigSetsAPIZkFailure,
> TestLeaderElectionZkExpiry, TestMiniSolrCloudCluster,
> TestShortCircuitedRequests, HdfsRecoverLeaseTest,
> CachingDirectoryFactoryTest, HdfsDirectoryFactoryTest, TestConfigOverlay,
> TestConfigSetImmutable, TestImplicitCoreProperties,
> TestInfoStreamLogging, TestInitParams, TestSolrDynamicMBean,
> TestBlobHandler, TestConfigReload, TestReplicationHandlerBackup,
> TestSolrConfigHandlerConcurrent, CoreAdminCreateDiscoverTest,
> CoreAdminRequestStatusTest, CoreMergeIndexesAdminHandlerTest,
> DistributedFacetPivotLargeTest, DistributedFacetPivotSmallTest,
> FacetPivotSmallTest, SuggestComponentTest, JavabinLoaderTest,
> SmileWriterTest, TestIntervalFaceting, TestChildDocTransformer,
> TestCustomDocTransformer, TestSortingResponseWriter,
> TestBulkSchemaAPI, TestFieldResource,
> TestManagedSchemaDynamicFieldResource, TestBulkSchemaConcurrent,
> TestCloudSchemaless, TestReloadDeadlock, TestSearcherReuse,
> TestSimpleQParserPlugin, TestSmileRequest, TestSolr4Spatial2,
> TestStandardQParsers, TestStressUserVersions, TestTrieFacet,
> TestMinMaxOnMultiValuedField, TestOrdValues,
> TestSortByMinMaxFunction, SimpleMLTQParserTest, TestDistribIDF,
> TestExactSharedStatsCache, TestPKIAuthenticationPlugin,
> TestBlendedInfixSuggestions, TestFileDictionaryLookup,
> TestFreeTextSuggestions, TestHighFrequencyDictionaryFactory,
> BlockCacheTest, HdfsDirectoryTest]
>    [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=HdfsDirectoryTest -Dtests.seed=5D8F351977870E3F -Dtests.slow=true -Dtests.locale=es_BO -Dtests.timezone=Antarctica/South_Pole -Dtests.asserts=true -Dtests.file.encoding=UTF-8
>    [junit4] ERROR   0.00s J1 | HdfsDirectoryTest (suite) <<<
>    [junit4]    > Throwable #1: java.security.AccessControlException: access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
>    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
>    [junit4]    > 	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:395)
>    [junit4]    > 	at java.security.AccessController.checkPermission(AccessController.java:559)
>    [junit4]    > 	at java.lang.SecurityManager.checkPermission(SecurityManager.java:549)
>    [junit4]    > 	at java.lang.SecurityManager.checkWrite(SecurityManager.java:979)
>    [junit4]    > 	at java.io.File.canWrite(File.java:785)
>    [junit4]    > 	at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:1002)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.createPermissionsDiagnosisString(MiniDFSCluster.java:856)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:812)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
>    [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
>    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
>    [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:60)
>    [junit4]    > 	at org.apache.solr.store.hdfs.HdfsDirectoryTest.beforeClass(HdfsDirectoryTest.java:62)
>    [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
>    [junit4]    > Throwable #2: com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:
>    [junit4]    >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
>    [junit4]    >         at java.lang.Object.wait(Native Method)
>    [junit4]    >         at java.lang.Object.wait(Object.java:503)
>    [junit4]    >         at java.util.TimerThread.mainLoop(Timer.java:526)
>    [junit4]    >         at java.util.TimerThread.run(Timer.java:505)
>    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
>    [junit4]    > Throwable #3: com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
>    [junit4]    >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
>    [junit4]    >         at java.lang.Object.wait(Native Method)
>    [junit4]    >         at java.lang.Object.wait(Object.java:503)
>    [junit4]    >         at java.util.TimerThread.mainLoop(Timer.java:526)
>    [junit4]    >         at java.util.TimerThread.run(Timer.java:505)
>    [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
>    [junit4] Completed [521/536] on J1 in 33.05s, 0 tests, 3 errors <<< FAILURES!
> 
> [...truncated 64 lines...]
> BUILD FAILED
> /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:785: The following error occurred while executing this line:
> /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:729: The following error occurred while executing this line:
> /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:59: The following error occurred while executing this line:
> /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build.xml:233: The following error occurred while executing this line:
> /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/common-build.xml:524: The following error occurred while executing this line:
> /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/common-build.xml:1452: The following error occurred while executing this line:
> /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/common-build.xml:1006: There were test failures: 536 suites, 2123 tests, 4 suite-level errors, 108 ignored (34 assumptions)
> 
> Total time: 77 minutes 51 seconds
> Build step 'Invoke Ant' marked build as failure
> Archiving artifacts
> [WARNINGS] Skipping publisher since build result is FAILURE
> Recording test results
> Email was triggered for: Failure - Any
> Sending email for trigger: Failure - Any
> 



---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
For additional commands, e-mail: dev-help@lucene.apache.org


[JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 9 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Solaris/9/
Java: multiarch/jdk1.7.0 -d32 -server -XX:+UseConcMarkSweepGC

4 tests failed.
FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.hdfs.HdfsNNFailoverTest

Error Message:
Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space  at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)  at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)  at org.apache.hadoop.util.Shell.run(Shell.java:455)  at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)  at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)  at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)  at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)  at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)  at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)  at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)  at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)  at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)  at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)  at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)  at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)  at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)  at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)  at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)  at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)  at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)  at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)  at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)  at java.lang.reflect.Method.invoke(Method.java:606)  at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)  at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)  at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)  at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)  at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)  at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)  at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)  at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)  at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)  at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)  at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)  at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)  at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)  at java.lang.Thread.run(Thread.java:745) Caused by: java.io.IOException: error=12, Not enough space  at java.lang.UNIXProcess.forkAndExec(Native Method)  at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)  at java.lang.ProcessImpl.start(ProcessImpl.java:130)  at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)  ... 44 more 

Stack Trace:
java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
	at org.apache.hadoop.util.Shell.run(Shell.java:455)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: error=12, Not enough space
	at java.lang.UNIXProcess.forkAndExec(Native Method)
	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
	... 44 more

	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:620)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:745)


FAILED:  junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest

Error Message:
access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")

Stack Trace:
java.security.AccessControlException: access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:395)
	at java.security.AccessController.checkPermission(AccessController.java:559)
	at java.lang.SecurityManager.checkPermission(SecurityManager.java:549)
	at java.lang.SecurityManager.checkWrite(SecurityManager.java:979)
	at java.io.File.canWrite(File.java:785)
	at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:1002)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createPermissionsDiagnosisString(MiniDFSCluster.java:856)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:812)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:60)
	at org.apache.solr.store.hdfs.HdfsDirectoryTest.beforeClass(HdfsDirectoryTest.java:62)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:745)


FAILED:  junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest

Error Message:
1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest:     1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]         at java.lang.Object.wait(Native Method)         at java.lang.Object.wait(Object.java:503)         at java.util.TimerThread.mainLoop(Timer.java:526)         at java.util.TimerThread.run(Timer.java:505)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest: 
   1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
        at java.lang.Object.wait(Native Method)
        at java.lang.Object.wait(Object.java:503)
        at java.util.TimerThread.mainLoop(Timer.java:526)
        at java.util.TimerThread.run(Timer.java:505)
	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)


FAILED:  junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest

Error Message:
There are still zombie threads that couldn't be terminated:    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]         at java.lang.Object.wait(Native Method)         at java.lang.Object.wait(Object.java:503)         at java.util.TimerThread.mainLoop(Timer.java:526)         at java.util.TimerThread.run(Timer.java:505)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
   1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
        at java.lang.Object.wait(Native Method)
        at java.lang.Object.wait(Object.java:503)
        at java.util.TimerThread.mainLoop(Timer.java:526)
        at java.util.TimerThread.run(Timer.java:505)
	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)




Build Log:
[...truncated 10577 lines...]
   [junit4] Suite: org.apache.solr.cloud.hdfs.HdfsNNFailoverTest
   [junit4]   2> Creating dataDir: /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/init-core-data-001
   [junit4]   2> 2599844 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   2> 2616331 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 2617524 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.i.MetricsConfig Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
   [junit4]   2> 2617755 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
   [junit4]   2> 2617771 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 2617878 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
   [junit4]   2> 2617942 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/hdfs to ./temp/Jetty_solaris.vm_35231_hdfs____thayv4/webapp
   [junit4]   2> 2618129 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
   [junit4]   2> 2619464 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:35231
   [junit4]   2> 2637264 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.s.d.DataNode Invalid dfs.datanode.data.dir /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2 : 
   [junit4]   2> java.io.IOException: Cannot run program "chmod": error=12, Not enough space
   [junit4]   2> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
   [junit4]   2> 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
   [junit4]   2> 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
   [junit4]   2> 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
   [junit4]   2> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
   [junit4]   2> 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
   [junit4]   2> 	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:656)
   [junit4]   2> 	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:490)
   [junit4]   2> 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:140)
   [junit4]   2> 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
   [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
   [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
   [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
   [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
   [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
   [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
   [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
   [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
   [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
   [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
   [junit4]   2> Caused by: java.io.IOException: error=12, Not enough space
   [junit4]   2> 	at java.lang.UNIXProcess.forkAndExec(Native Method)
   [junit4]   2> 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
   [junit4]   2> 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
   [junit4]   2> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
   [junit4]   2> 	... 43 more
   [junit4]   2> 2637287 WARN  (org.apache.hadoop.util.JvmPauseMonitor$Monitor@be51b7) [    ] o.a.h.u.JvmPauseMonitor Detected pause in JVM or host machine (eg GC): pause of approximately 15969ms
   [junit4]   2> No GCs detected
   [junit4]   2> 2637368 WARN  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 2637384 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
   [junit4]   2> 2637422 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/datanode to ./temp/Jetty_solaris.vm_49465_datanode____96t731/webapp
   [junit4]   2> 2637655 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
   [junit4]   2> 2638756 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:49465
   [junit4]   2> 2645079 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
   [junit4]   2> 2645234 ERROR (DataNode: [[[DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data1/, [DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2/]]  heartbeating to solaris-vm/127.0.0.1:61051) [    ] o.a.h.h.s.d.DataNode Initialization failed for Block pool <registering> (Datanode Uuid unassigned) service to solaris-vm/127.0.0.1:61051. Exiting. 
   [junit4]   2> java.io.IOException: DN shut down before block pool connected
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.retrieveNamespaceInfo(BPServiceActor.java:185)
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:215)
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:828)
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
   [junit4]   2> 2645236 WARN  (DataNode: [[[DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data1/, [DISK]file:/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J0/temp/solr.cloud.hdfs.HdfsNNFailoverTest_5D8F351977870E3F-001/tempDir-001/hdfsBaseDir/data/data2/]]  heartbeating to solaris-vm/127.0.0.1:61051) [    ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool <registering> (Datanode Uuid unassigned) service to solaris-vm/127.0.0.1:61051
   [junit4]   2> 2645259 WARN  (org.apache.hadoop.hdfs.server.blockmanagement.DecommissionManager$Monitor@7b7964) [    ] o.a.h.h.s.b.DecommissionManager Monitor interrupted: java.lang.InterruptedException: sleep interrupted
   [junit4]   2> 2645314 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
   [junit4]   2> 2645418 ERROR (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.l.MethodMetric Error invoking method getBlocksTotal
   [junit4]   2> java.lang.reflect.InvocationTargetException
   [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
   [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
   [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MethodMetric$2.snapshot(MethodMetric.java:111)
   [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MethodMetric.snapshot(MethodMetric.java:144)
   [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MetricsRegistry.snapshot(MetricsRegistry.java:387)
   [junit4]   2> 	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder$1.getMetrics(MetricsSourceBuilder.java:79)
   [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.getMetrics(MetricsSourceAdapter.java:195)
   [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateJmxCache(MetricsSourceAdapter.java:172)
   [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.getMBeanInfo(MetricsSourceAdapter.java:151)
   [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getClassName(DefaultMBeanServerInterceptor.java:1804)
   [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.safeGetClassName(DefaultMBeanServerInterceptor.java:1595)
   [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanPermission(DefaultMBeanServerInterceptor.java:1813)
   [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:430)
   [junit4]   2> 	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:415)
   [junit4]   2> 	at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:546)
   [junit4]   2> 	at org.apache.hadoop.metrics2.util.MBeans.unregister(MBeans.java:81)
   [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.stopMBeans(MetricsSourceAdapter.java:227)
   [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.stop(MetricsSourceAdapter.java:212)
   [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.stopSources(MetricsSystemImpl.java:461)
   [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.stop(MetricsSystemImpl.java:212)
   [junit4]   2> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.shutdown(MetricsSystemImpl.java:592)
   [junit4]   2> 	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.shutdownInstance(DefaultMetricsSystem.java:72)
   [junit4]   2> 	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.shutdown(DefaultMetricsSystem.java:68)
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.metrics.NameNodeMetrics.shutdown(NameNodeMetrics.java:145)
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:822)
   [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1720)
   [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1699)
   [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:838)
   [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
   [junit4]   2> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
   [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
   [junit4]   2> 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
   [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   [junit4]   2> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
   [junit4]   2> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:606)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
   [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
   [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
   [junit4]   2> Caused by: java.lang.NullPointerException
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.blockmanagement.BlocksMap.size(BlocksMap.java:198)
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.getTotalBlocks(BlockManager.java:3291)
   [junit4]   2> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlocksTotal(FSNamesystem.java:6223)
   [junit4]   2> 	... 58 more
   [junit4]   2> 2645432 INFO  (SUITE-HdfsNNFailoverTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene53), sim=RandomSimilarityProvider(queryNorm=true,coord=yes): {}, locale=mk_MK, timezone=Asia/Shanghai
   [junit4]   2> NOTE: SunOS 5.11 x86/Oracle Corporation 1.7.0_85 (32-bit)/cpus=3,threads=1,free=99794816,total=518979584
   [junit4]   2> NOTE: All tests run in this JVM: [SolrCloudExampleTest, TestStressVersions, TestSerializedLuceneMatchVersion, TestSolrJ, DistanceUnitsTest, MultiThreadedOCPTest, TestDistribDocBasedVersion, BJQParserTest, ZkCLITest, QueryEqualityTest, PrimitiveFieldTypeTest, DistributedQueryComponentOptimizationTest, AliasIntegrationTest, TestInitQParser, TestAuthorizationFramework, TestLazyCores, SolrIndexConfigTest, TestFunctionQuery, TestXIncludeConfig, HardAutoCommitTest, DocValuesMultiTest, TestDefaultStatsCache, SolrRequestParserTest, RecoveryZkTest, UpdateParamsTest, TestSolrDeletionPolicy1, TestDFRSimilarityFactory, TestFastWriter, PathHierarchyTokenizerFactoryTest, TestDynamicLoading, TestElisionMultitermQuery, PolyFieldTest, UnloadDistributedZkTest, TestJsonRequest, TestRuleBasedAuthorizationPlugin, TestManagedStopFilterFactory, TestRawResponseWriter, IndexSchemaTest, TestEmbeddedSolrServerConstructors, InfoHandlerTest, AlternateDirectoryTest, LeaderElectionTest, JsonLoaderTest, TestCoreContainer, DirectSolrSpellCheckerTest, RequestLoggingTest, ZkNodePropsTest, TermsComponentTest, TestConfig, TestFieldTypeCollectionResource, XsltUpdateRequestHandlerTest, TestManagedSchemaFieldResource, TestSchemaResource, DataDrivenBlockJoinTest, TestExactStatsCache, TestConfigSetProperties, DeleteLastCustomShardedReplicaTest, TestAnalyzedSuggestions, DirectUpdateHandlerTest, ExternalFileFieldSortTest, TestIBSimilarityFactory, TestMissingGroups, ClusterStateUpdateTest, ActionThrottleTest, QueryElevationComponentTest, DocValuesTest, QueryResultKeyTest, TestLRUCache, TestPhraseSuggestions, SimplePostToolTest, TriLevelCompositeIdRoutingTest, DistributedMLTComponentTest, CloudExitableDirectoryReaderTest, TestSolrCloudWithKerberosAlt, TestCodecSupport, TestConfigSets, PeerSyncTest, XmlUpdateRequestHandlerTest, SpatialHeatmapFacetsTest, SoftAutoCommitTest, TestSchemaNameResource, PreAnalyzedUpdateProcessorTest, TestJmxMonitoredMap, TestDistributedStatsComponentCardinality, 
TestManagedSynonymFilterFactory, JSONWriterTest, TestNRTOpen, ReplicationFactorTest, DOMUtilTest, SolrCoreTest, DocExpirationUpdateProcessorFactoryTest, FastVectorHighlighterTest, SuggesterFSTTest, TestExtendedDismaxParser, TestSolrConfigHandler, DocumentAnalysisRequestHandlerTest, DistributedFacetPivotSmallAdvancedTest, BlockDirectoryTest, TestQuerySenderNoQuery, TestHashPartitioner, DateFieldTest, SegmentsInfoRequestHandlerTest, TestFieldCollectionResource, RecoveryAfterSoftCommitTest, TestMergePolicyConfig, TestFieldSortValues, SecurityConfHandlerTest, TestStressReorder, BufferStoreTest, TestRandomRequestDistribution, HdfsBasicDistributedZkTest, TestCloudManagedSchemaConcurrent, TestReplicaProperties, DisMaxRequestHandlerTest, TestMacros, TestStressLucene, TestReloadAndDeleteDocs, BasicAuthIntegrationTest, TestDocSet, BasicDistributedZkTest, DistributedQueryElevationComponentTest, TestGroupingSearch, TestObjectReleaseTracker, MoreLikeThisHandlerTest, OverseerTest, TestFaceting, TestUpdate, TestClassNameShortening, TestRestManager, SyncSliceTest, ShardRoutingTest, ZkSolrClientTest, TestZkChroot, TestRandomDVFaceting, ShardRoutingCustomTest, TestDistributedGrouping, DistributedSpellCheckComponentTest, ZkControllerTest, TestRealTimeGet, TestReload, DistributedTermsComponentTest, TestRangeQuery, SimpleFacetsTest, TestSolr4Spatial, StatsComponentTest, SolrCmdDistributorTest, TestSort, CurrencyFieldXmlFileTest, AnalysisAfterCoreReloadTest, TestFoldingMultitermQuery, SuggesterTSTTest, TestCSVLoader, SchemaVersionSpecificBehaviorTest, SolrCoreCheckLockOnStartupTest, DirectUpdateHandlerOptimizeTest, StatelessScriptUpdateProcessorFactoryTest, DistanceFunctionTest, IndexBasedSpellCheckerTest, StandardRequestHandlerTest, TestOmitPositions, DocumentBuilderTest, RequiredFieldsTest, TestArbitraryIndexDir, LoggingHandlerTest, ReturnFieldsTest, MBeansHandlerTest, UniqFieldsUpdateProcessorFactoryTest, PingRequestHandlerTest, TestComponentsName, TestLFUCache, PreAnalyzedFieldTest, 
TestSystemIdResolver, SpellingQueryConverterTest, TestUtils, TestDocumentBuilder, SliceStateTest, SystemInfoHandlerTest, UUIDFieldTest, FileUtilsTest, CircularListTest, TestRTGBase, CursorPagingTest, DistributedIntervalFacetingTest, TestDistributedMissingSort, TestSimpleTrackingShardHandler, AsyncMigrateRouteKeyTest, DeleteInactiveReplicaTest, DistribDocExpirationUpdateProcessorTest, LeaderFailoverAfterPartitionTest, OverriddenZkACLAndCredentialsProvidersTest, OverseerCollectionConfigSetProcessorTest, OverseerRolesTest, OverseerTaskQueueTest, SSLMigrationTest, SaslZkACLProviderTest, SimpleCollectionCreateDeleteTest, TestAuthenticationFramework, TestCloudInspectUtil, TestCollectionAPI, TestMiniSolrCloudClusterSSL, TestRebalanceLeaders, TestRequestStatusCollectionAPI, HdfsBasicDistributedZk2Test, HdfsChaosMonkeySafeLeaderTest, HdfsCollectionsAPIDistributedZkTest, HdfsNNFailoverTest]
   [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=HdfsNNFailoverTest -Dtests.seed=5D8F351977870E3F -Dtests.slow=true -Dtests.locale=mk_MK -Dtests.timezone=Asia/Shanghai -Dtests.asserts=true -Dtests.file.encoding=UTF-8
   [junit4] ERROR   0.00s J0 | HdfsNNFailoverTest (suite) <<<
   [junit4]    > Throwable #1: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space
   [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
   [junit4]    > 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
   [junit4]    > 	at org.apache.hadoop.util.Shell.run(Shell.java:455)
   [junit4]    > 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
   [junit4]    > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
   [junit4]    > 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
   [junit4]    > 	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
   [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
   [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
   [junit4]    > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
   [junit4]    > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
   [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
   [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
   [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
   [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
   [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
   [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
   [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
   [junit4]    > Caused by: java.io.IOException: error=12, Not enough space
   [junit4]    > 	at java.lang.UNIXProcess.forkAndExec(Native Method)
   [junit4]    > 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
   [junit4]    > 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
   [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
   [junit4]    > 	... 44 more
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
   [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:620)
   [junit4]    > 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:557)
   [junit4]    > 	at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:139)
   [junit4]    > 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
   [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
   [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
   [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
   [junit4]    > 	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1443)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:828)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
   [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
   [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsNNFailoverTest.setupClass(HdfsNNFailoverTest.java:44)
   [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
   [junit4] Completed [426/536] on J0 in 45.66s, 0 tests, 1 error <<< FAILURES!

[...truncated 300 lines...]
   [junit4] Suite: org.apache.solr.store.hdfs.HdfsDirectoryTest
   [junit4]   2> Creating dataDir: /export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1/temp/solr.store.hdfs.HdfsDirectoryTest_5D8F351977870E3F-001/init-core-data-001
   [junit4]   2> 3147821 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false)
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 3147964 WARN  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.m.i.MetricsConfig Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
   [junit4]   2> 3147974 WARN  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 3147976 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log jetty-6.1.26
   [junit4]   2> 3147994 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Extract jar:file:/export/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-2.6.0-tests.jar!/webapps/hdfs to ./temp/Jetty_solaris.vm_46547_hdfs____.vwfmpk/webapp
   [junit4]   2> 3148170 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
   [junit4]   2> 3148982 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Started HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:46547
   [junit4]   2> 3157264 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.m.log Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@solaris-vm:0
   [junit4]   2> 3157403 INFO  (SUITE-HdfsDirectoryTest-seed#[5D8F351977870E3F]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
   [junit4]   2> Aug 29, 2015 9:01:07 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
   [junit4]   2> Aug 29, 2015 9:01:27 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> SEVERE: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest: 
   [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
   [junit4]   2>         at java.lang.Object.wait(Native Method)
   [junit4]   2>         at java.lang.Object.wait(Object.java:503)
   [junit4]   2>         at java.util.TimerThread.mainLoop(Timer.java:526)
   [junit4]   2>         at java.util.TimerThread.run(Timer.java:505)
   [junit4]   2> Aug 29, 2015 9:01:27 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
   [junit4]   2> INFO: Starting to interrupt leaked threads:
   [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
   [junit4]   2> Aug 29, 2015 9:01:30 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
   [junit4]   2> SEVERE: There are still zombie threads that couldn't be terminated:
   [junit4]   2>    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
   [junit4]   2>         at java.lang.Object.wait(Native Method)
   [junit4]   2>         at java.lang.Object.wait(Object.java:503)
   [junit4]   2>         at java.util.TimerThread.mainLoop(Timer.java:526)
   [junit4]   2>         at java.util.TimerThread.run(Timer.java:505)
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene53): {}, docValues:{}, sim=DefaultSimilarity, locale=es_BO, timezone=Antarctica/South_Pole
   [junit4]   2> NOTE: SunOS 5.11 x86/Oracle Corporation 1.7.0_85 (32-bit)/cpus=3,threads=2,free=136627544,total=518979584
   [junit4]   2> NOTE: All tests run in this JVM: [TestIndexingPerformance, TestCSVResponseWriter, DistributedQueryComponentCustomSortTest, DirectSolrConnectionTest, FullSolrCloudDistribCmdsTest, TestShardHandlerFactory, CacheHeaderTest, BasicZkTest, TestTrie, FieldAnalysisRequestHandlerTest, PKIAuthenticationIntegrationTest, OpenCloseCoreStressTest, TestSuggestSpellingConverter, StressHdfsTest, CleanupOldIndexTest, DistributedExpandComponentTest, TestHdfsUpdateLog, TestSolrXml, TestAddFieldRealTimeGet, TestJsonFacets, DistributedSuggestComponentTest, OutOfBoxZkACLAndCredentialsProvidersTest, AnalyticsMergeStrategyTest, HLLUtilTest, ResponseHeaderTest, SearchHandlerTest, BinaryUpdateRequestHandlerTest, DistributedFacetPivotWhiteBoxTest, ConnectionManagerTest, SpellCheckComponentTest, TestScoreJoinQPNoScore, SolrTestCaseJ4Test, SolrIndexSplitterTest, TestConfigSetsAPI, TestDefaultSearchFieldResource, TestCryptoKeys, TestNonDefinedSimilarityFactory, TestCoreDiscovery, RollingRestartTest, SolrInfoMBeanTest, CustomCollectionTest, DistributedVersionInfoTest, ClusterStateTest, TestReversedWildcardFilterFactory, SolrXmlInZkTest, DistributedFacetPivotLongTailTest, URLClassifyProcessorTest, TestLMJelinekMercerSimilarityFactory, RequestHandlersTest, RemoteQueryErrorTest, LeaderElectionIntegrationTest, SharedFSAutoReplicaFailoverTest, TestBadConfig, SignatureUpdateProcessorFactoryTest, TestCursorMarkWithoutUniqueKey, TestCrossCoreJoin, SparseHLLTest, DistributedQueueTest, BigEndianAscendingWordSerializerTest, TestBM25SimilarityFactory, AutoCommitTest, DateMathParserTest, BasicFunctionalityTest, SuggesterWFSTTest, TestCollapseQParserPlugin, TestManagedResource, TestSha256AuthenticationProvider, CollectionTooManyReplicasTest, BadCopyFieldTest, TestDownShardTolerantSearch, CloudMLTQParserTest, NotRequiredUniqueKeyTest, TestAnalyzeInfixSuggestions, ExitableDirectoryReaderTest, TestScoreJoinQPScore, DeleteShardTest, RankQueryTest, TestSchemaManager, 
UpdateRequestProcessorFactoryTest, CursorMarkTest, DistributedDebugComponentTest, DeleteReplicaTest, RAMDirectoryFactoryTest, ConcurrentDeleteAndCreateCollectionTest, TestQueryTypes, OutputWriterTest, TestSchemaSimilarityResource, HighlighterMaxOffsetTest, ResponseLogComponentTest, TestCloudPivotFacet, DocValuesMissingTest, FieldMutatingUpdateProcessorTest, HttpPartitionTest, TestCollationField, ZkStateWriterTest, TestQuerySenderListener, AtomicUpdatesTest, TestStressRecovery, TestRandomFaceting, SharedFSAutoReplicaFailoverUtilsTest, CoreAdminHandlerTest, HighlighterConfigTest, TestCustomSort, MultiTermTest, VMParamsZkACLAndCredentialsProvidersTest, IgnoreCommitOptimizeUpdateProcessorFactoryTest, CollectionReloadTest, PrimUtilsTest, TestRecovery, TestWriterPerf, AddSchemaFieldsUpdateProcessorFactoryTest, TimeZoneUtilsTest, CurrencyFieldOpenExchangeTest, TestSolrCLIRunExample, TestPHPSerializedResponseWriter, ChaosMonkeySafeLeaderTest, TestIndexSearcher, EnumFieldTest, TestSolrIndexConfig, TermVectorComponentDistributedTest, TestJoin, TestExpandComponent, TestManagedResourceStorage, SortByFunctionTest, TestDefaultSimilarityFactory, SuggesterTest, TestValueSourceCache, SolrPluginUtilsTest, TermVectorComponentTest, TestFiltering, TestQueryUtils, FileBasedSpellCheckerTest, BasicDistributedZk2Test, CollectionsAPIDistributedZkTest, TestReplicationHandler, TestDistributedSearch, BadIndexSchemaTest, ConvertedLegacyTest, HighlighterTest, ShowFileRequestHandlerTest, SpellCheckCollatorTest, SpatialFilterTest, NoCacheHeaderTest, WordBreakSolrSpellCheckerTest, TestPseudoReturnFields, TestAtomicUpdateErrorCases, TestWordDelimiterFilterFactory, DefaultValueUpdateProcessorTest, TestRemoteStreaming, DebugComponentTest, TestSurroundQueryParser, LukeRequestHandlerTest, TestSolrQueryParser, IndexSchemaRuntimeFieldTest, RegexBoostProcessorTest, TestJmxIntegration, QueryParsingTest, TestPartialUpdateDeduplication, CSVRequestHandlerTest, TestBinaryResponseWriter, SOLR749Test, 
CopyFieldTest, BadComponentTest, TestSolrDeletionPolicy2, SampleTest, TestBinaryField, TestSearchPerf, NumericFieldsTest, MinimalSchemaTest, TestFuzzyAnalyzedSuggestions, TestSolrCoreProperties, TestPostingsSolrHighlighter, TestLuceneMatchVersion, SpellPossibilityIteratorTest, TestCharFilters, SynonymTokenizerTest, EchoParamsTest, TestSweetSpotSimilarityFactory, TestPerFieldSimilarity, TestLMDirichletSimilarityFactory, ResourceLoaderTest, TestFastOutputStream, ScriptEngineTest, OpenExchangeRatesOrgProviderTest, PluginInfoTest, TestFastLRUCache, ChaosMonkeyNothingIsSafeTest, TestHighlightDedupGrouping, TestTolerantSearch, TestJettySolrRunner, AssignTest, AsyncCallRequestStatusResponseTest, CollectionStateFormat2Test, CollectionsAPIAsyncDistributedZkTest, DistribCursorPagingTest, DistribJoinFromCollectionTest, LeaderInitiatedRecoveryOnCommitTest, MigrateRouteKeyTest, OverseerStatusTest, ShardSplitTest, TestConfigSetsAPIExclusivity, TestConfigSetsAPIZkFailure, TestLeaderElectionZkExpiry, TestMiniSolrCloudCluster, TestShortCircuitedRequests, HdfsRecoverLeaseTest, CachingDirectoryFactoryTest, HdfsDirectoryFactoryTest, TestConfigOverlay, TestConfigSetImmutable, TestImplicitCoreProperties, TestInfoStreamLogging, TestInitParams, TestSolrDynamicMBean, TestBlobHandler, TestConfigReload, TestReplicationHandlerBackup, TestSolrConfigHandlerConcurrent, CoreAdminCreateDiscoverTest, CoreAdminRequestStatusTest, CoreMergeIndexesAdminHandlerTest, DistributedFacetPivotLargeTest, DistributedFacetPivotSmallTest, FacetPivotSmallTest, SuggestComponentTest, JavabinLoaderTest, SmileWriterTest, TestIntervalFaceting, TestChildDocTransformer, TestCustomDocTransformer, TestSortingResponseWriter, TestBulkSchemaAPI, TestFieldResource, TestManagedSchemaDynamicFieldResource, TestBulkSchemaConcurrent, TestCloudSchemaless, TestReloadDeadlock, TestSearcherReuse, TestSimpleQParserPlugin, TestSmileRequest, TestSolr4Spatial2, TestStandardQParsers, TestStressUserVersions, TestTrieFacet, 
TestMinMaxOnMultiValuedField, TestOrdValues, TestSortByMinMaxFunction, SimpleMLTQParserTest, TestDistribIDF, TestExactSharedStatsCache, TestPKIAuthenticationPlugin, TestBlendedInfixSuggestions, TestFileDictionaryLookup, TestFreeTextSuggestions, TestHighFrequencyDictionaryFactory, BlockCacheTest, HdfsDirectoryTest]
   [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=HdfsDirectoryTest -Dtests.seed=5D8F351977870E3F -Dtests.slow=true -Dtests.locale=es_BO -Dtests.timezone=Antarctica/South_Pole -Dtests.asserts=true -Dtests.file.encoding=UTF-8
   [junit4] ERROR   0.00s J1 | HdfsDirectoryTest (suite) <<<
   [junit4]    > Throwable #1: java.security.AccessControlException: access denied ("java.io.FilePermission" "/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build/solr-core/test/J1" "write")
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
   [junit4]    > 	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:395)
   [junit4]    > 	at java.security.AccessController.checkPermission(AccessController.java:559)
   [junit4]    > 	at java.lang.SecurityManager.checkPermission(SecurityManager.java:549)
   [junit4]    > 	at java.lang.SecurityManager.checkWrite(SecurityManager.java:979)
   [junit4]    > 	at java.io.File.canWrite(File.java:785)
   [junit4]    > 	at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:1002)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.createPermissionsDiagnosisString(MiniDFSCluster.java:856)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:812)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:738)
   [junit4]    > 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:608)
   [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:98)
   [junit4]    > 	at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:60)
   [junit4]    > 	at org.apache.solr.store.hdfs.HdfsDirectoryTest.beforeClass(HdfsDirectoryTest.java:62)
   [junit4]    > 	at java.lang.Thread.run(Thread.java:745)Throwable #2: com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest: 
   [junit4]    >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
   [junit4]    >         at java.lang.Object.wait(Native Method)
   [junit4]    >         at java.lang.Object.wait(Object.java:503)
   [junit4]    >         at java.util.TimerThread.mainLoop(Timer.java:526)
   [junit4]    >         at java.util.TimerThread.run(Timer.java:505)
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)Throwable #3: com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
   [junit4]    >    1) Thread[id=20389, name=IPC Server idle connection scanner for port 41610, state=WAITING, group=TGRP-HdfsDirectoryTest]
   [junit4]    >         at java.lang.Object.wait(Native Method)
   [junit4]    >         at java.lang.Object.wait(Object.java:503)
   [junit4]    >         at java.util.TimerThread.mainLoop(Timer.java:526)
   [junit4]    >         at java.util.TimerThread.run(Timer.java:505)
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([5D8F351977870E3F]:0)
   [junit4] Completed [521/536] on J1 in 33.05s, 0 tests, 3 errors <<< FAILURES!

[...truncated 64 lines...]
BUILD FAILED
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:785: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:729: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:59: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/build.xml:233: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/solr/common-build.xml:524: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/common-build.xml:1452: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/common-build.xml:1006: There were test failures: 536 suites, 2123 tests, 4 suite-level errors, 108 ignored (34 assumptions)

Total time: 77 minutes 51 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



[JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 8 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-5.x-Solaris/8/
Java: multiarch/jdk1.7.0 -d32 -server -XX:+UseG1GC

1 tests failed.
FAILED:  org.apache.lucene.codecs.TestCodecLoadingDeadlock.testDeadlock

Error Message:
Cannot run program "/usr/jdk/instances/jdk1.7.0/jre/bin/java": error=12, Not enough space

Stack Trace:
java.io.IOException: Cannot run program "/usr/jdk/instances/jdk1.7.0/jre/bin/java": error=12, Not enough space
	at __randomizedtesting.SeedInfo.seed([BA40896CF620C7CB:B72B6878F07A6A1D]:0)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
	at org.apache.lucene.codecs.TestCodecLoadingDeadlock.testDeadlock(TestCodecLoadingDeadlock.java:67)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$2.evaluate(ThreadLeakControl.java:401)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSuite(RandomizedRunner.java:651)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.access$200(RandomizedRunner.java:138)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$1.run(RandomizedRunner.java:568)
Caused by: java.io.IOException: error=12, Not enough space
	at java.lang.UNIXProcess.forkAndExec(Native Method)
	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
	... 22 more




Build Log:
[...truncated 11449 lines...]
   [junit4] Suite: org.apache.lucene.codecs.TestCodecLoadingDeadlock
   [junit4] ERROR   3.16s J0 | TestCodecLoadingDeadlock.testDeadlock <<<
   [junit4]    > Throwable #1: java.io.IOException: Cannot run program "/usr/jdk/instances/jdk1.7.0/jre/bin/java": error=12, Not enough space
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([BA40896CF620C7CB:B72B6878F07A6A1D]:0)
   [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
   [junit4]    > 	at org.apache.lucene.codecs.TestCodecLoadingDeadlock.testDeadlock(TestCodecLoadingDeadlock.java:67)
   [junit4]    > Caused by: java.io.IOException: error=12, Not enough space
   [junit4]    > 	at java.lang.UNIXProcess.forkAndExec(Native Method)
   [junit4]    > 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:137)
   [junit4]    > 	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
   [junit4]    > 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
   [junit4]    > 	... 22 more
   [junit4] Completed [63/411] on J0 in 3.24s, 1 test, 1 error <<< FAILURES!

[...truncated 1142 lines...]
BUILD FAILED
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:785: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:729: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/build.xml:59: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/build.xml:50: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/common-build.xml:1452: The following error occurred while executing this line:
/export/home/jenkins/workspace/Lucene-Solr-5.x-Solaris/lucene/common-build.xml:1006: There were test failures: 411 suites, 3324 tests, 1 error, 54 ignored (50 assumptions)

Total time: 8 minutes 0 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



RE: [JENKINS] Lucene-Solr-5.x-Solaris (multiarch/jdk1.7.0) - Build # 7 - Still Failing!

Posted by Uwe Schindler <uw...@thetaphi.de>.
I have to fix the number of open files on Solaris; the limit is worse than on Ubuntu:

jenkins@solaris-vm:~$ ulimit -a
core file size          (blocks, -c) unlimited
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
open files                      (-n) 256
pipe size            (512 bytes, -p) 10
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 29995
virtual memory          (kbytes, -v) unlimited

There are a number of discussions on this error message; one hint: http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=2007790
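For anyone debugging this on a similar box: error=12 (ENOMEM) from fork() on Solaris usually means the child could not reserve swap, since fork() reserves the parent's entire address space, so a test JVM with a large heap can fail to spawn even /bin/ls. A minimal diagnostic sketch follows; the `swap`/`prctl`/`projmod` lines are Solaris-only and shown as comments, and the `user.jenkins` project name and 65536 limit are illustrative assumptions, not a verified fix for this VM:

```shell
# Check the limits that matter for forking test JVMs.
ulimit -n   # open-file limit; the 256 reported above is far too low for Lucene tests
ulimit -v   # virtual-memory limit in kbytes; caps what fork() can reserve

# Solaris-only checks/fixes (do not run on Linux; names here are assumptions):
#   swap -s                                   # swap reservation accounting
#   prctl -n process.max-file-descriptor $$   # per-process fd resource control
#   projmod -s -K 'process.max-file-descriptor=(priv,65536,deny)' user.jenkins
```

Raising the descriptor limit addresses the ulimit output above; the swap reservation itself typically needs more swap configured on the VM, per the VMware KB link.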

Sorry for that! Will take care of it later.

Uwe

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: uwe@thetaphi.de





---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
For additional commands, e-mail: dev-help@lucene.apache.org