Posted to dev@oozie.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2017/11/14 14:38:47 UTC
Failed: OOZIE-2973 PreCommit Build #217
Jira: https://issues.apache.org/jira/browse/OOZIE-2973
Build: https://builds.apache.org/job/PreCommit-OOZIE-Build/217/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 848.17 KB...]
+1 the patch does not introduce any line longer than 132
-1 the patch does not add/modify any testcase
+1 RAT
+1 the patch does not seem to introduce new RAT warnings
+1 JAVADOC
+1 the patch does not seem to introduce new Javadoc warnings
WARNING: the current HEAD has 77 Javadoc warning(s)
-1 COMPILE
+1 HEAD compiles
-1 patch does not compile
+1 the patch does not seem to introduce new javac warnings
+1 There are no new bugs found in total.
+1 There are no new bugs found in [core].
+1 There are no new bugs found in [tools].
+1 There are no new bugs found in [sharelib/hive2].
+1 There are no new bugs found in [sharelib/distcp].
+1 There are no new bugs found in [sharelib/hcatalog].
+1 There are no new bugs found in [sharelib/streaming].
+1 There are no new bugs found in [sharelib/sqoop].
+1 There are no new bugs found in [sharelib/oozie].
+1 There are no new bugs found in [sharelib/pig].
+1 There are no new bugs found in [sharelib/hive].
+1 There are no new bugs found in [sharelib/spark].
+1 There are no new bugs found in [client].
+1 There are no new bugs found in [examples].
+1 There are no new bugs found in [docs].
+1 There are no new bugs found in [server].
+1 BACKWARDS_COMPATIBILITY
+1 the patch does not change any JPA Entity/Column/Basic/Lob/Transient annotations
+1 the patch does not modify JPA files
-1 TESTS - patch does not compile, cannot run testcases
-1 DISTRO
-1 distro tarball fails with the patch
----------------------------
-1 Overall result, please check the reported -1(s)
There is at least one warning, please check
The full output of the test-patch run is available at
https://builds.apache.org/job/PreCommit-OOZIE-Build/217/
[...curl progress output omitted; 3706k download completed...]
Adding comment to JIRA
Comment added.
test-patch exit code: 1
Build step 'Execute shell' marked build as failure
[description-setter] Description set: OOZIE-2973
Archiving artifacts
[Fast Archiver] Compressed 1.12 MB of artifacts by 50.0% relative to #214
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
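Note that "All tests passed" here is consistent with the -1 above: the run failed at the COMPILE and DISTRO checks, so no test result could override the vote. The overall result is simply an AND over the per-check votes; a toy sketch of that aggregation (this is illustrative, not the actual test-patch script):

```shell
#!/bin/sh
# Toy reproduction of test-patch-style vote roll-up: any single -1 line makes
# the overall result -1. The vote list below is a trimmed, made-up sample.
votes='+1 RAT
+1 JAVADOC
-1 COMPILE
-1 DISTRO'

if printf '%s\n' "$votes" | grep -q '^-1'; then
    # Matches the wording the report above uses for a failed run.
    echo "-1 Overall result, please check the reported -1(s)"
else
    echo "+1 Overall result"
fi
```

Any one failing check is enough, which is why a single environmental problem (like the disk-full slave in the second report below) surfaces as an overall -1 regardless of how the other checks fared.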
Posted by Apache Jenkins Server <je...@builds.apache.org>
Failed: OOZIE-2869 PreCommit Build #220
Jira: https://issues.apache.org/jira/browse/OOZIE-2869
Build: https://builds.apache.org/job/PreCommit-OOZIE-Build/220/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1.57 MB...]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ oozie-zookeeper-security-tests ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/zookeeper-security-tests/src/test/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:testCompile (default-testCompile) @ oozie-zookeeper-security-tests ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.20.1:test (default-test) @ oozie-zookeeper-security-tests ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Oozie Main .................................. SUCCESS [ 0.578 s]
[INFO] Apache Oozie Client ................................ SUCCESS [ 1.228 s]
[INFO] Apache Oozie Share Lib Oozie ....................... SUCCESS [ 2.095 s]
[INFO] Apache Oozie Share Lib HCatalog .................... SUCCESS [ 0.650 s]
[INFO] Apache Oozie Share Lib Distcp ...................... SUCCESS [ 0.133 s]
[INFO] Apache Oozie Core .................................. SUCCESS [ 5.797 s]
[INFO] Apache Oozie Share Lib Streaming ................... SUCCESS [06:42 min]
[INFO] Apache Oozie Share Lib Pig ......................... SUCCESS [04:05 min]
[INFO] Apache Oozie Share Lib Hive ........................ SUCCESS [01:08 min]
[INFO] Apache Oozie Share Lib Hive 2 ...................... SUCCESS [ 5.842 s]
[INFO] Apache Oozie Share Lib Sqoop ....................... SUCCESS [ 2.898 s]
[INFO] Apache Oozie Examples .............................. SUCCESS [ 5.057 s]
[INFO] Apache Oozie Share Lib Spark ....................... SUCCESS [ 2.050 s]
[INFO] Apache Oozie Share Lib ............................. SUCCESS [ 0.016 s]
[INFO] Apache Oozie Docs .................................. SUCCESS [ 0.025 s]
[INFO] Apache Oozie WebApp ................................ SUCCESS [ 2.374 s]
[INFO] Apache Oozie Tools ................................. SUCCESS [ 1.138 s]
[INFO] Apache Oozie MiniOozie ............................. SUCCESS [ 1.226 s]
[INFO] Apache Oozie Server ................................ SUCCESS [ 2.849 s]
[INFO] Apache Oozie Distro ................................ SUCCESS [ 1.088 s]
[INFO] Apache Oozie ZooKeeper Security Tests .............. SUCCESS [ 1.658 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:34 min
[INFO] Finished at: 2017-11-14T19:44:29Z
[INFO] Final Memory: 73M/1439M
[INFO] ------------------------------------------------------------------------
bin/test-patch-20-tests: line 105: echo: write error: No space left on device
Failure, check for details /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/test-patch/tmp/test-patch-20-tests-post.out
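The "No space left on device" write error above is the real trigger for everything that follows. A pre-flight guard in the build shell step could fail fast before the test run instead; a minimal sketch (the directory argument and threshold are illustrative values, not taken from the actual Jenkins job):

```shell
#!/bin/sh
# Hypothetical pre-flight disk check for a CI shell step: abort early rather
# than let tests die mid-run with "No space left on device" as above.
check_disk() {
    dir="${1:-.}"
    min_kb="${2:-5242880}"   # illustrative default: 5 GB, in 1024-byte blocks
    # df -Pk gives POSIX portable output; row 2, column 4 is available KB.
    avail_kb=$(df -Pk "$dir" | awk 'NR==2 {print $4}')
    if [ "$avail_kb" -lt "$min_kb" ]; then
        echo "only ${avail_kb} KB free under ${dir}" >&2
        return 1
    fi
}

# Example: require at least 1 KB free in the current directory.
check_disk . 1 && echo "disk check passed"
```

Failing the build before the Maven reactor starts would also keep the later curl download and JIRA comment steps from hitting the same full disk.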
[...curl progress output omitted; download aborted after 5792 of 3706k bytes...]
curl: (23) Failed writing body (0 != 5792)
Could not download jira-cli tool, thus no JIRA updating
Build step 'Execute shell' marked build as failure
[description-setter] Description set: OOZIE-2869
Archiving artifacts
[Fast Archiver] Compressed 1.65 MB of artifacts by 39.7% relative to #216
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
###################################################################################
############################## FAILED TESTS (if any) ##############################
61 tests failed.
FAILED: org.apache.oozie.example.TestLocalOozieExample.testLocalOozieExampleEnd
Error Message:
Unable to initialize WebAppContext
Stack Trace:
java.io.IOException: Unable to initialize WebAppContext
at org.apache.oozie.example.TestLocalOozieExample.setUp(TestLocalOozieExample.java:37)
Caused by: java.io.IOException: No space left on device
at org.apache.oozie.example.TestLocalOozieExample.setUp(TestLocalOozieExample.java:37)
FAILED: org.apache.oozie.example.TestLocalOozieExample.testLocalOozieExampleKill
Error Message:
Unable to initialize WebAppContext
Stack Trace:
java.io.IOException: Unable to initialize WebAppContext
at org.apache.oozie.example.TestLocalOozieExample.setUp(TestLocalOozieExample.java:37)
Caused by: java.io.IOException: No space left on device
at org.apache.oozie.example.TestLocalOozieExample.setUp(TestLocalOozieExample.java:37)
FAILED: org.apache.oozie.action.hadoop.TestHiveActionExecutor.testSetupMethods
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
FAILED: org.apache.oozie.action.hadoop.TestHiveActionExecutor.testActionConfLoadDefaultResources
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
FAILED: org.apache.oozie.action.hadoop.TestHiveActionExecutor.testHiveAction
Error Message:
Unable to initialize WebAppContext
Stack Trace:
java.io.IOException: Unable to initialize WebAppContext
Caused by: java.io.IOException: No space left on device
FAILED: org.apache.oozie.action.hadoop.TestHiveMain.testJobIDPattern
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
at org.apache.oozie.action.hadoop.TestHiveMain.setUp(TestHiveMain.java:44)
FAILED: org.apache.oozie.action.hadoop.TestHiveMain.testMain
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
at org.apache.oozie.action.hadoop.TestHiveMain.setUp(TestHiveMain.java:44)
FAILED: org.apache.oozie.action.hadoop.TestHive2ActionExecutor.testSetupMethodsForQuery
Error Message:
Unable to initialize WebAppContext
Stack Trace:
java.io.IOException: Unable to initialize WebAppContext
at org.apache.oozie.action.hadoop.TestHive2ActionExecutor.setUp(TestHive2ActionExecutor.java:56)
Caused by: java.io.IOException: No space left on device
at org.apache.oozie.action.hadoop.TestHive2ActionExecutor.setUp(TestHive2ActionExecutor.java:56)
FAILED: org.apache.oozie.action.hadoop.TestHive2ActionExecutor.testHive2Action
Error Message:
/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testHive2Action/dfe1b44f-ed70-4414-9d3a-2aee17b7d775/conf/hadoop-conf/hadoop-site.xml (No such file or directory)
Stack Trace:
java.io.FileNotFoundException: /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testHive2Action/dfe1b44f-ed70-4414-9d3a-2aee17b7d775/conf/hadoop-conf/hadoop-site.xml (No such file or directory)
at org.apache.oozie.action.hadoop.TestHive2ActionExecutor.setUp(TestHive2ActionExecutor.java:56)
FAILED: org.apache.oozie.action.hadoop.TestHive2ActionExecutor.testHive2ActionFails
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testHive2ActionFails/3a9ab56b-2acf-4f13-bd66-d5430eb61bff]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testHive2ActionFails/3a9ab56b-2acf-4f13-bd66-d5430eb61bff]
at org.apache.oozie.action.hadoop.TestHive2ActionExecutor.setUp(TestHive2ActionExecutor.java:56)
FAILED: org.apache.oozie.action.hadoop.TestHive2ActionExecutor.testSetupMethodsForScript
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testSetupMethodsForScript/595216b5-9058-447d-b39d-8a1899f340ed]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testSetupMethodsForScript/595216b5-9058-447d-b39d-8a1899f340ed]
at org.apache.oozie.action.hadoop.TestHive2ActionExecutor.setUp(TestHive2ActionExecutor.java:56)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testPig
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
at org.apache.oozie.action.hadoop.TestPigActionExecutor.setUp(TestPigActionExecutor.java:71)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testExecutionStatsWithMaxStatsSizeLimit
Error Message:
YARN App state for app application_1510688362402_0001 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510688362402_0001 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestPigActionExecutor.testExecutionStatsWithMaxStatsSizeLimit(TestPigActionExecutor.java:271)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testPigError
Error Message:
File /user/test/868bcf66-8fc3-41c1-ac93-615639ba503a/netty-3.6.2.Final.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/868bcf66-8fc3-41c1-ac93-615639ba503a/netty-3.6.2.Final.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testActionConfLoadDefaultResources
Error Message:
File /user/test/9b97d57e-d34d-4090-bed5-2ff78be3b9e7/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/9b97d57e-d34d-4090-bed5-2ff78be3b9e7/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testExecutionStatsWithRetrieveStatsFalse
Error Message:
File /user/test/90f5d1b3-7ceb-4785-8f90-a4c6033bd786/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/90f5d1b3-7ceb-4785-8f90-a4c6033bd786/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testExternalChildIds
Error Message:
File /user/test/f457a10e-7377-4792-acf6-667bace2275e/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/f457a10e-7377-4792-acf6-667bace2275e/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testExecutionStats
Error Message:
File /user/test/3d86af28-3dad-4c08-8b35-b4f9d5a4504e/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/3d86af28-3dad-4c08-8b35-b4f9d5a4504e/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testUdfPig
Error Message:
File /user/test/fc0cbeb7-8767-4b3d-a250-79c685e5293e/udf.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/fc0cbeb7-8767-4b3d-a250-79c685e5293e/udf.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigMain.testJobIDPattern
Error Message:
Cannot delete /user/test/b754fe31-1f53-4dc7-9da3-9dc249d06348. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
Cannot delete /user/test/b754fe31-1f53-4dc7-9da3-9dc249d06348. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
at org.apache.oozie.action.hadoop.TestPigMain.setUp(TestPigMain.java:47)
FAILED: org.apache.oozie.action.hadoop.TestPigMain.testPig_withNullExternalID
Error Message:
Cannot delete /user/test/3fe696da-3175-4cd5-b923-b0ada0e7c504. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
Cannot delete /user/test/3fe696da-3175-4cd5-b923-b0ada0e7c504. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
at org.apache.oozie.action.hadoop.TestPigMain.setUp(TestPigMain.java:47)
FAILED: org.apache.oozie.action.hadoop.TestPigMain.testEmbeddedPigWithinPython
Error Message:
Cannot delete /user/test/2221b27a-36aa-4985-b631-cd90f29c90d2. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
Cannot delete /user/test/2221b27a-36aa-4985-b631-cd90f29c90d2. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
at org.apache.oozie.action.hadoop.TestPigMain.setUp(TestPigMain.java:47)
FAILED: org.apache.oozie.action.hadoop.TestPigMain.testPigScript
Error Message:
Cannot delete /user/test/66853dc6-7337-4d6b-83ff-cb197a006d1d. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
Cannot delete /user/test/66853dc6-7337-4d6b-83ff-cb197a006d1d. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
at org.apache.oozie.action.hadoop.TestPigMain.setUp(TestPigMain.java:47)
FAILED: org.apache.oozie.action.hadoop.TestSparkActionExecutor.testSetupMethodsWithSparkConfiguration
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused
FAILED: org.apache.oozie.action.hadoop.TestSparkActionExecutor.testSetupMethods
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused
FAILED: org.apache.oozie.action.hadoop.TestSparkMain.testJobIDPattern
Error Message:
File /user/jenkins/target/MiniMRCluster_377820428-tmpDir/MRAppJar.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/jenkins/target/MiniMRCluster_377820428-tmpDir/MRAppJar.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestSparkMain.testPatterns
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused
FAILED: org.apache.oozie.action.hadoop.TestSparkMain.testMain
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSetupMethods
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSetupMethods/862724c7-3f69-47d9-92df-e97beb5687a3]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSetupMethods/862724c7-3f69-47d9-92df-e97beb5687a3]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithUberJarEnabled
Error Message:
YARN App state for app application_1510687948111_0001 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510687948111_0001 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestMapReduceActionExecutor._testSubmit(TestMapReduceActionExecutor.java:400)
at org.apache.oozie.action.hadoop.TestMapReduceActionExecutor._testMapReduceWithUberJar(TestMapReduceActionExecutor.java:727)
at org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithUberJarEnabled(TestMapReduceActionExecutor.java:815)
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testSetMapredJobName
Error Message:
YARN App state for app application_1510687948111_0002 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510687948111_0002 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testSetMapredJobName(TestMapReduceActionExecutor.java:1144)
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithUberJarDisabled
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithUberJarDisabled/2ba87b17-5f6f-4287-8e03-43664c11f4d8]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithUberJarDisabled/2ba87b17-5f6f-4287-8e03-43664c11f4d8]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testPipes
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testPipes/36db0ca4-f545-4a28-86a2-e667d11df8d1]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testPipes/36db0ca4-f545-4a28-86a2-e667d11df8d1]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testCommaSeparatedFilesAndArchives
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testCommaSeparatedFilesAndArchives/76ee0e80-b576-48d0-9d56-f00c5638491f]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testCommaSeparatedFilesAndArchives/76ee0e80-b576-48d0-9d56-f00c5638491f]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testEndWithoutConfiguration
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testEndWithoutConfiguration/2501aaa4-5554-47cd-8d11-7b9a808dad2a]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testEndWithoutConfiguration/2501aaa4-5554-47cd-8d11-7b9a808dad2a]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithConfigClass
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClass/153cc883-0909-4414-bd61-d3e20baba80b]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClass/153cc883-0909-4414-bd61-d3e20baba80b]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithCredentials
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithCredentials/bf093427-1f52-473b-a1b4-1e3f9ef0e00a]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithCredentials/bf093427-1f52-473b-a1b4-1e3f9ef0e00a]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testSetupMethods
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetupMethods/d66ade73-ea11-4ab5-b52f-ef50a0aeed1c]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetupMethods/d66ade73-ea11-4ab5-b52f-ef50a0aeed1c]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testJobNameSetForMapReduceChild
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testJobNameSetForMapReduceChild/55fe2e93-2951-4abb-8aa7-57a74d555455]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testJobNameSetForMapReduceChild/55fe2e93-2951-4abb-8aa7-57a74d555455]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testStreamingConfOverride
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testStreamingConfOverride/7fc211f6-87d1-4fee-9f53-2f176448d390]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testStreamingConfOverride/7fc211f6-87d1-4fee-9f53-2f176448d390]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testConfigDefaultPropsToAction
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testConfigDefaultPropsToAction/528afd7d-2cdd-4df6-a986-ca225dd3b71b]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testConfigDefaultPropsToAction/528afd7d-2cdd-4df6-a986-ca225dd3b71b]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithConfigClassThrowException
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClassThrowException/99b7679d-d1b1-4340-8555-0affdf3ca9ee]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClassThrowException/99b7679d-d1b1-4340-8555-0affdf3ca9ee]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testStreaming
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testStreaming/db8fb176-bfff-4ce9-bb25-40f701e29c0d]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testStreaming/db8fb176-bfff-4ce9-bb25-40f701e29c0d]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testSetExecutionStats_when_user_has_specified_stats_write_TRUE
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetExecutionStats_when_user_has_specified_stats_write_TRUE/72b4b968-70c5-485a-8e21-5c2605ca3ff7]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetExecutionStats_when_user_has_specified_stats_write_TRUE/72b4b968-70c5-485a-8e21-5c2605ca3ff7]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testSetExecutionStats_when_user_has_specified_stats_write_FALSE
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetExecutionStats_when_user_has_specified_stats_write_FALSE/d713e740-58d8-49a3-a6f5-4eb22d1fccc3]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetExecutionStats_when_user_has_specified_stats_write_FALSE/d713e740-58d8-49a3-a6f5-4eb22d1fccc3]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithConfigClassNotFound
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClassNotFound/311829fb-9709-49f7-aeb8-366c162b564d]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClassNotFound/311829fb-9709-49f7-aeb8-366c162b564d]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduce
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduce/6a5650f3-f190-4074-b6ec-eb3945753f8b]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduce/6a5650f3-f190-4074-b6ec-eb3945753f8b]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceActionError
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceActionError/486296f9-c664-4aae-9d5b-8bb1b5e417dc]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceActionError/486296f9-c664-4aae-9d5b-8bb1b5e417dc]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceActionKill
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceActionKill/ba54ffc3-fe55-449f-ae17-741e32f9fb01]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceActionKill/ba54ffc3-fe55-449f-ae17-741e32f9fb01]
FAILED: org.apache.oozie.tools.TestOozieSharelibCLI.testOozieSharelibCLICreate
Error Message:
File /user/jenkins/target/MiniMRCluster_667895151-tmpDir/MRAppJar.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/jenkins/target/MiniMRCluster_667895151-tmpDir/MRAppJar.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.test.TestWorkflow.testParallelFsAndShellWorkflowCompletesSuccessfully
Error Message:
expected:<SUCCEEDED> but was:<RUNNING>
Stack Trace:
junit.framework.AssertionFailedError: expected:<SUCCEEDED> but was:<RUNNING>
at org.apache.oozie.test.TestWorkflow.runWorkflowFromFile(TestWorkflow.java:167)
at org.apache.oozie.test.TestWorkflow.testParallelFsAndShellWorkflowCompletesSuccessfully(TestWorkflow.java:117)
FAILED: org.apache.oozie.test.TestWorkflowRetries.testParallelFsAndShellWorkflowCompletesSuccessfully
Error Message:
expected:<SUCCEEDED> but was:<RUNNING>
Stack Trace:
junit.framework.AssertionFailedError: expected:<SUCCEEDED> but was:<RUNNING>
FAILED: org.apache.oozie.action.hadoop.TestPyspark.testPyspark
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused
FAILED: org.apache.oozie.action.hadoop.TestSparkActionExecutor.testSparkAction
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithRedundantArgsAndFreeFormQuery
Error Message:
Cannot create directory /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/build/test/data/dfs/name1/current
Stack Trace:
java.io.IOException: Cannot create directory /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/build/test/data/dfs/name1/current
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithArgsAndFreeFormQuery
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithArgsAndFreeFormQuery/64f39b9c-0c9d-4dd7-81b6-7943a54798f8]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithArgsAndFreeFormQuery/64f39b9c-0c9d-4dd7-81b6-7943a54798f8]
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithBadRedundantArgsAndFreeFormQuery
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithBadRedundantArgsAndFreeFormQuery/1a466781-2517-43c1-babf-7e0156737bf8]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithBadRedundantArgsAndFreeFormQuery/1a466781-2517-43c1-babf-7e0156737bf8]
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithBadCommand
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithBadCommand/626a6841-b8a6-426b-99c9-3c6488dd27c6]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithBadCommand/626a6841-b8a6-426b-99c9-3c6488dd27c6]
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopEval
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopEval/5a0416a0-e434-4b80-b9cd-c40eeefe9340]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopEval/5a0416a0-e434-4b80-b9cd-c40eeefe9340]
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithRedundantPrefix
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithRedundantPrefix/f1bf5dbf-61a8-4722-b921-1d5c64a5bae0]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithRedundantPrefix/f1bf5dbf-61a8-4722-b921-1d5c64a5bae0]
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopAction
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopAction/10ff4ddd-ff25-42e4-8dfc-4ee6939dd5c1]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopAction/10ff4ddd-ff25-42e4-8dfc-4ee6939dd5c1]
Failed: OOZIE-3107 PreCommit Build #219
Posted by Apache Jenkins Server <je...@builds.apache.org>.
Jira: https://issues.apache.org/jira/browse/OOZIE-3107
Build: https://builds.apache.org/job/PreCommit-OOZIE-Build/219/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1.63 MB...]
[ERROR] mvn <goals> -rf :oozie-webapp
bin/test-patch-30-distro: line 92: echo: write error: No space left on device
Reports
Running test-patch task CLEAN
Running test-patch task RAW_PATCH_ANALYSIS
Running test-patch task RAT
Running test-patch task JAVADOC
Running test-patch task COMPILE
Running test-patch task FINDBUGS_DIFF
mkdir: cannot create directory '/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff': No space left on device
[TRACE] Downloading FindBugs diff JAR from https://repo1.maven.org/maven2/me/andrz/findbugs/findbugs-diff/0.1.0/findbugs-diff-0.1.0-all.jar
tee: /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/test-patch-11-findbugs-diff-report.out: No space left on device
bin/test-patch-11-findbugs-diff: line 179: /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar: No such file or directory
[TRACE] FindBugs diff JAR downloaded
md5sum: /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar: No such file or directory
bin/test-patch-11-findbugs-diff: line 187: /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar.md5sum: No such file or directory
grep: /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/bin/findbugs-diff-0.1.0-all.jar.md5sum: No such file or directory
[TRACE] FindBugs diff JAR checked, is safe to use
[TRACE] Performing FindBugs diffs
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
Error: Unable to access jarfile /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff/findbugs-diff-0.1.0-all.jar
[TRACE] FindBugs diffs performed
[TRACE] Checking FindBugs diffs and creating reports
find: '/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/FINDBUGS_DIFF/diff': No such file or directory
[INFO] There are no new bugs found in total.
[TRACE] FindBugs diffs checked and reports created
[TRACE] Summary file size is 1122 bytes
[TRACE] Full summary file size is 184 bytes
Failure, check for details /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build/test-patch/tmp/test-patch-11-findbugs-diff-report.out
0 3706k 0 13032 0 0 14134 0 0:04:28 --:--:-- 0:04:28 14134
curl: (23) Failed writing body (0 != 13032)
Could not download jira-cli tool, thus no JIRA updating
Build step 'Execute shell' marked build as failure
[description-setter] Description set: OOZIE-3107
Archiving artifacts
[Fast Archiver] Compressed 1.77 MB of artifacts by 54.8% relative to #216
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
###################################################################################
############################## FAILED TESTS (if any) ##############################
40 tests failed.
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testCannotKillActionWhenACLSpecified
Error Message:
YARN App state for app application_1510681491938_0001 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0001 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testCannotKillActionWhenACLSpecified(TestJavaActionExecutor.java:1688)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testOutputSubmitOK
Error Message:
YARN App state for app application_1510681491938_0002 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0002 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testOutputSubmitOK(TestJavaActionExecutor.java:399)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testSubmitLauncherConfigurationOverridesLauncherMapperProperties
Error Message:
YARN App state for app application_1510681491938_0003 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0003 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testSubmitLauncherConfigurationOverridesLauncherMapperProperties(TestJavaActionExecutor.java:2601)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testJobSubmissionWithoutYarnKill
Error Message:
YARN App state for app application_1510681491938_0004 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0004 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testJobSubmissionWithoutYarnKill(TestJavaActionExecutor.java:2186)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testExceptionSubmitException
Error Message:
YARN App state for app application_1510681491938_0005 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0005 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testExceptionSubmitException(TestJavaActionExecutor.java:519)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testSubmitOKWithLauncherEnvVars
Error Message:
YARN App state for app application_1510681491938_0007 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0007 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testSubmitOKWithLauncherEnvVars(TestJavaActionExecutor.java:2500)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testSubmitOKWithLauncherJavaOpts
Error Message:
YARN App state for app application_1510681491938_0008 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0008 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testSubmitOKWithLauncherJavaOpts(TestJavaActionExecutor.java:2435)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testPrepare
Error Message:
YARN App state for app application_1510681491938_0009 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0009 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testPrepare(TestJavaActionExecutor.java:842)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testSubmitOKWithVcoresAndMemory
Error Message:
YARN App state for app application_1510681491938_0011 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0011 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testSubmitOKWithVcoresAndMemory(TestJavaActionExecutor.java:2414)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testEmptyArgsWithNullArgsNotAllowed
Error Message:
YARN App state for app application_1510681491938_0012 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0012 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testEmptyArgs(TestJavaActionExecutor.java:2372)
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testEmptyArgsWithNullArgsNotAllowed(TestJavaActionExecutor.java:2352)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testRecovery
Error Message:
expected:<...on_1510681491938_001[3]> but was:<...on_1510681491938_001[4]>
Stack Trace:
junit.framework.ComparisonFailure: expected:<...on_1510681491938_001[3]> but was:<...on_1510681491938_001[4]>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testRecovery(TestJavaActionExecutor.java:590)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testEnvVarsPropagatedFromLauncherConfig
Error Message:
YARN App state for app application_1510681491938_0016 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0016 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testEnvVarsPropagatedFromLauncherConfig(TestJavaActionExecutor.java:2521)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testExceptionSubmitThrowable
Error Message:
YARN App state for app application_1510681491938_0018 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0018 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testExceptionSubmitThrowable(TestJavaActionExecutor.java:541)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testAdditionalJarSubmitOK
Error Message:
YARN App state for app application_1510681491938_0019 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0019 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testAdditionalJarSubmitOK(TestJavaActionExecutor.java:455)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testChildKill
Error Message:
expected:<RUNNING> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: expected:<RUNNING> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testChildKill(TestJavaActionExecutor.java:2628)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testIdSwapSubmitOK
Error Message:
YARN App state for app application_1510681491938_0021 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0021 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testIdSwapSubmitOK(TestJavaActionExecutor.java:424)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testEmptyArgsWithNullArgsAllowed
Error Message:
YARN App state for app application_1510681491938_0022 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0022 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testEmptyArgs(TestJavaActionExecutor.java:2372)
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testEmptyArgsWithNullArgsAllowed(TestJavaActionExecutor.java:2356)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testExit0SubmitOK
Error Message:
YARN App state for app application_1510681491938_0023 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0023 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testExit0SubmitOK(TestJavaActionExecutor.java:474)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testActionShareLibWithNonDefaultNamenode
Error Message:
YARN App state for app application_1510681491938_0024 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0024 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testActionShareLibWithNonDefaultNamenode(TestJavaActionExecutor.java:2152)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testSimpestSleSubmitOK
Error Message:
YARN App state for app application_1510681491938_0026 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0026 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testSimpestSleSubmitOK(TestJavaActionExecutor.java:379)
FAILED: org.apache.oozie.action.hadoop.TestJavaActionExecutor.testExit1SubmitError
Error Message:
YARN App state for app application_1510681491938_0027 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510681491938_0027 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestJavaActionExecutor.testExit1SubmitError(TestJavaActionExecutor.java:495)
FAILED: org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPullPushDependencyMissing
Error Message:
E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
Stack Trace:
org.apache.oozie.command.CommandException: E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPullPushDependencyMissing(TestCoordActionMissingDependenciesXCommand.java:202)
Caused by: org.apache.oozie.command.CommandException: E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPullPushDependencyMissing(TestCoordActionMissingDependenciesXCommand.java:202)
Caused by: org.apache.oozie.executor.jpa.JPAExecutorException: E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPullPushDependencyMissing(TestCoordActionMissingDependenciesXCommand.java:202)
FAILED: org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPushDependencyMissing
Error Message:
E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
Stack Trace:
org.apache.oozie.command.CommandException: E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPushDependencyMissing(TestCoordActionMissingDependenciesXCommand.java:144)
Caused by: org.apache.oozie.command.CommandException: E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPushDependencyMissing(TestCoordActionMissingDependenciesXCommand.java:144)
Caused by: org.apache.oozie.executor.jpa.JPAExecutorException: E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPushDependencyMissing(TestCoordActionMissingDependenciesXCommand.java:144)
FAILED: org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPullDependencyMissing
Error Message:
E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
Stack Trace:
org.apache.oozie.command.CommandException: E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPullDependencyMissing(TestCoordActionMissingDependenciesXCommand.java:81)
Caused by: org.apache.oozie.command.CommandException: E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPullDependencyMissing(TestCoordActionMissingDependenciesXCommand.java:81)
Caused by: org.apache.oozie.executor.jpa.JPAExecutorException: E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionPullDependencyMissing(TestCoordActionMissingDependenciesXCommand.java:81)
FAILED: org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionInputLogicMissing
Error Message:
should not throw exception E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
Stack Trace:
junit.framework.AssertionFailedError: should not throw exception E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordActionMissingDependenciesXCommand.testCoordActionInputLogicMissing(TestCoordActionMissingDependenciesXCommand.java:248)
FAILED: org.apache.oozie.command.coord.TestCoordMaterializeTransitionXCommand.testSuccessedJobSlaParseElFunctionVariableInMaterializeActions
Error Message:
E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
Stack Trace:
org.apache.oozie.command.CommandException: E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordMaterializeTransitionXCommand.testSuccessedJobSlaParseElFunctionVariableInMaterializeActions(TestCoordMaterializeTransitionXCommand.java:1022)
Caused by: org.apache.oozie.command.CommandException: E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordMaterializeTransitionXCommand.testSuccessedJobSlaParseElFunctionVariableInMaterializeActions(TestCoordMaterializeTransitionXCommand.java:1022)
Caused by: org.apache.oozie.executor.jpa.JPAExecutorException: E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordMaterializeTransitionXCommand.testSuccessedJobSlaParseElFunctionVariableInMaterializeActions(TestCoordMaterializeTransitionXCommand.java:1022)
FAILED: org.apache.oozie.command.coord.TestCoordSubmitXCommand.testBasicSubmitWithMultipleStartInstancesInputEvent
Error Message:
Unexpected failure: org.apache.oozie.command.CommandException: E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
Stack Trace:
junit.framework.AssertionFailedError: Unexpected failure: org.apache.oozie.command.CommandException: E0803: IO error, E0603: SQL error in operation, The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
at org.apache.oozie.command.coord.TestCoordSubmitXCommand.testBasicSubmitWithMultipleStartInstancesInputEvent(TestCoordSubmitXCommand.java:449)
FAILED: org.apache.oozie.service.TestConfigurationService.testOozieConfig
Error Message:
expected:<...oozie-db;create=true[]> but was:<...oozie-db;create=true[;sql.enforce_strict_size=true]>
Stack Trace:
junit.framework.ComparisonFailure: expected:<...oozie-db;create=true[]> but was:<...oozie-db;create=true[;sql.enforce_strict_size=true]>
at org.apache.oozie.service.TestConfigurationService.testOozieConfig(TestConfigurationService.java:201)
FAILED: org.apache.oozie.util.TestMetricsInstrumentation.testJMXInstrumentation
Error Message:
Could not find own virtual machine
Stack Trace:
junit.framework.AssertionFailedError: Could not find own virtual machine
at org.apache.oozie.util.TestMetricsInstrumentation.testJMXInstrumentation(TestMetricsInstrumentation.java:243)
FAILED: org.apache.oozie.test.TestWorkflow.testParallelFsAndShellWorkflowCompletesSuccessfully
Error Message:
expected:<SUCCEEDED> but was:<RUNNING>
Stack Trace:
junit.framework.AssertionFailedError: expected:<SUCCEEDED> but was:<RUNNING>
at org.apache.oozie.test.TestWorkflow.runWorkflowFromFile(TestWorkflow.java:167)
at org.apache.oozie.test.TestWorkflow.testParallelFsAndShellWorkflowCompletesSuccessfully(TestWorkflow.java:117)
FAILED: org.apache.oozie.test.TestWorkflowRetries.testParallelFsAndShellWorkflowCompletesSuccessfully
Error Message:
expected:<SUCCEEDED> but was:<RUNNING>
Stack Trace:
junit.framework.AssertionFailedError: expected:<SUCCEEDED> but was:<RUNNING>
FAILED: org.apache.oozie.action.hadoop.TestPyspark.testPyspark
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
FAILED: org.apache.oozie.action.hadoop.TestSparkActionExecutor.testSparkAction
Error Message:
YARN App state for app application_1510687721838_0001 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510687721838_0001 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestSparkActionExecutor.testSparkAction(TestSparkActionExecutor.java:180)
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithRedundantArgsAndFreeFormQuery
Error Message:
YARN App state for app application_1510686381129_0001 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510686381129_0001 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.runSqoopActionFreeFormQuery(TestSqoopActionExecutor.java:318)
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithRedundantArgsAndFreeFormQuery(TestSqoopActionExecutor.java:303)
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithArgsAndFreeFormQuery
Error Message:
YARN App state for app application_1510686381129_0002 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510686381129_0002 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.runSqoopActionFreeFormQuery(TestSqoopActionExecutor.java:318)
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithArgsAndFreeFormQuery(TestSqoopActionExecutor.java:310)
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithBadRedundantArgsAndFreeFormQuery
Error Message:
YARN App state for app application_1510686381129_0003 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510686381129_0003 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.runSqoopActionWithBadCommand(TestSqoopActionExecutor.java:203)
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithBadRedundantArgsAndFreeFormQuery(TestSqoopActionExecutor.java:295)
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithBadCommand
Error Message:
YARN App state for app application_1510686381129_0004 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510686381129_0004 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.runSqoopActionWithBadCommand(TestSqoopActionExecutor.java:203)
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithBadCommand(TestSqoopActionExecutor.java:195)
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopEval
Error Message:
YARN App state for app application_1510686381129_0005 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510686381129_0005 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopEval(TestSqoopActionExecutor.java:272)
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithRedundantPrefix
Error Message:
YARN App state for app application_1510686381129_0006 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510686381129_0006 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.runSqoopAction(TestSqoopActionExecutor.java:237)
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithRedundantPrefix(TestSqoopActionExecutor.java:229)
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopAction
Error Message:
YARN App state for app application_1510686381129_0007 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510686381129_0007 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.runSqoopAction(TestSqoopActionExecutor.java:237)
at org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopAction(TestSqoopActionExecutor.java:221)
Failed: OOZIE-2585 PreCommit Build #218
Posted by Apache Jenkins Server <je...@builds.apache.org>.
Jira: https://issues.apache.org/jira/browse/OOZIE-2585
Build: https://builds.apache.org/job/PreCommit-OOZIE-Build/218/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 680.02 KB...]
[INFO] Finished at: 2017-11-14T15:16:08Z
[INFO] Final Memory: 643M/1840M
[INFO] ------------------------------------------------------------------------
[TRACE] FindBugs output in HEAD verified and saved
Running test-patch task BACKWARDS_COMPATIBILITY
Running test-patch task TESTS
Running test-patch task DISTRO
Applying patch
Checking patch core/src/test/java/org/apache/oozie/service/TestPartitionDependencyManagerEhcache.java...
Checking patch core/src/test/java/org/apache/oozie/service/TestPartitionDependencyManagerService.java...
error: while searching for:
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.net.URISyntaxException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import org.apache.oozie.CoordinatorActionBean;
import org.apache.oozie.client.CoordinatorAction.Status;
import org.apache.oozie.client.rest.JsonBean;
import org.apache.oozie.dependency.hcat.HCatMessageHandler;
import org.apache.oozie.executor.jpa.BatchQueryExecutor;
import org.apache.oozie.jms.JMSConnectionInfo;
import org.apache.oozie.service.Services;
import org.apache.oozie.test.XDataTestCase;
import org.apache.oozie.util.HCatURI;
import org.apache.oozie.util.XLog;
import org.junit.Test;
/**
error: patch failed: core/src/test/java/org/apache/oozie/service/TestPartitionDependencyManagerService.java:21
error: core/src/test/java/org/apache/oozie/service/TestPartitionDependencyManagerService.java: patch does not apply
Checking patch src/test/java/org/apache/oozie/service/TestPartitionDependencyManagerEhcache.java...
error: src/test/java/org/apache/oozie/service/TestPartitionDependencyManagerEhcache.java: No such file or directory
Checking patch src/test/java/org/apache/oozie/service/TestPartitionDependencyManagerService.java...
error: src/test/java/org/apache/oozie/service/TestPartitionDependencyManagerService.java: No such file or directory
Patch failed to apply to head of branch
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 3706k  100 3706k    0     0  3118k      0  0:00:01  0:00:01 --:--:-- 6456k
Adding comment to JIRA
Comment added.
test-patch exit code: 1
Build step 'Execute shell' marked build as failure
[description-setter] Description set: OOZIE-2585
Archiving artifacts
[Fast Archiver] Compressed 844.77 KB of artifacts by 64.4% relative to #214
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.