Posted to dev@oozie.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2017/11/14 19:44:33 UTC
Failed: OOZIE-2869 PreCommit Build #220
Jira: https://issues.apache.org/jira/browse/OOZIE-2869
Build: https://builds.apache.org/job/PreCommit-OOZIE-Build/220/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1.57 MB...]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ oozie-zookeeper-security-tests ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/zookeeper-security-tests/src/test/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:testCompile (default-testCompile) @ oozie-zookeeper-security-tests ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.20.1:test (default-test) @ oozie-zookeeper-security-tests ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Oozie Main .................................. SUCCESS [ 0.578 s]
[INFO] Apache Oozie Client ................................ SUCCESS [ 1.228 s]
[INFO] Apache Oozie Share Lib Oozie ....................... SUCCESS [ 2.095 s]
[INFO] Apache Oozie Share Lib HCatalog .................... SUCCESS [ 0.650 s]
[INFO] Apache Oozie Share Lib Distcp ...................... SUCCESS [ 0.133 s]
[INFO] Apache Oozie Core .................................. SUCCESS [ 5.797 s]
[INFO] Apache Oozie Share Lib Streaming ................... SUCCESS [06:42 min]
[INFO] Apache Oozie Share Lib Pig ......................... SUCCESS [04:05 min]
[INFO] Apache Oozie Share Lib Hive ........................ SUCCESS [01:08 min]
[INFO] Apache Oozie Share Lib Hive 2 ...................... SUCCESS [ 5.842 s]
[INFO] Apache Oozie Share Lib Sqoop ....................... SUCCESS [ 2.898 s]
[INFO] Apache Oozie Examples .............................. SUCCESS [ 5.057 s]
[INFO] Apache Oozie Share Lib Spark ....................... SUCCESS [ 2.050 s]
[INFO] Apache Oozie Share Lib ............................. SUCCESS [ 0.016 s]
[INFO] Apache Oozie Docs .................................. SUCCESS [ 0.025 s]
[INFO] Apache Oozie WebApp ................................ SUCCESS [ 2.374 s]
[INFO] Apache Oozie Tools ................................. SUCCESS [ 1.138 s]
[INFO] Apache Oozie MiniOozie ............................. SUCCESS [ 1.226 s]
[INFO] Apache Oozie Server ................................ SUCCESS [ 2.849 s]
[INFO] Apache Oozie Distro ................................ SUCCESS [ 1.088 s]
[INFO] Apache Oozie ZooKeeper Security Tests .............. SUCCESS [ 1.658 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:34 min
[INFO] Finished at: 2017-11-14T19:44:29Z
[INFO] Final Memory: 73M/1439M
[INFO] ------------------------------------------------------------------------
bin/test-patch-20-tests: line 105: echo: write error: No space left on device
Failure, check for details /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/test-patch/tmp/test-patch-20-tests-post.out
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0 3706k    0  5792    0     0   7292      0  0:08:40 --:--:--  0:08:40  7292
curl: (23) Failed writing body (0 != 5792)
Could not download jira-cli tool, thus no JIRA updating
Build step 'Execute shell' marked build as failure
[description-setter] Description set: OOZIE-2869
Archiving artifacts
[Fast Archiver] Compressed 1.65 MB of artifacts by 39.7% relative to #216
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
###################################################################################
############################## FAILED TESTS (if any) ##############################
61 tests failed.
FAILED: org.apache.oozie.example.TestLocalOozieExample.testLocalOozieExampleEnd
Error Message:
Unable to initialize WebAppContext
Stack Trace:
java.io.IOException: Unable to initialize WebAppContext
at org.apache.oozie.example.TestLocalOozieExample.setUp(TestLocalOozieExample.java:37)
Caused by: java.io.IOException: No space left on device
at org.apache.oozie.example.TestLocalOozieExample.setUp(TestLocalOozieExample.java:37)
FAILED: org.apache.oozie.example.TestLocalOozieExample.testLocalOozieExampleKill
Error Message:
Unable to initialize WebAppContext
Stack Trace:
java.io.IOException: Unable to initialize WebAppContext
at org.apache.oozie.example.TestLocalOozieExample.setUp(TestLocalOozieExample.java:37)
Caused by: java.io.IOException: No space left on device
at org.apache.oozie.example.TestLocalOozieExample.setUp(TestLocalOozieExample.java:37)
FAILED: org.apache.oozie.action.hadoop.TestHiveActionExecutor.testSetupMethods
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
FAILED: org.apache.oozie.action.hadoop.TestHiveActionExecutor.testActionConfLoadDefaultResources
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
FAILED: org.apache.oozie.action.hadoop.TestHiveActionExecutor.testHiveAction
Error Message:
Unable to initialize WebAppContext
Stack Trace:
java.io.IOException: Unable to initialize WebAppContext
Caused by: java.io.IOException: No space left on device
FAILED: org.apache.oozie.action.hadoop.TestHiveMain.testJobIDPattern
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
at org.apache.oozie.action.hadoop.TestHiveMain.setUp(TestHiveMain.java:44)
FAILED: org.apache.oozie.action.hadoop.TestHiveMain.testMain
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
at org.apache.oozie.action.hadoop.TestHiveMain.setUp(TestHiveMain.java:44)
FAILED: org.apache.oozie.action.hadoop.TestHive2ActionExecutor.testSetupMethodsForQuery
Error Message:
Unable to initialize WebAppContext
Stack Trace:
java.io.IOException: Unable to initialize WebAppContext
at org.apache.oozie.action.hadoop.TestHive2ActionExecutor.setUp(TestHive2ActionExecutor.java:56)
Caused by: java.io.IOException: No space left on device
at org.apache.oozie.action.hadoop.TestHive2ActionExecutor.setUp(TestHive2ActionExecutor.java:56)
FAILED: org.apache.oozie.action.hadoop.TestHive2ActionExecutor.testHive2Action
Error Message:
/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testHive2Action/dfe1b44f-ed70-4414-9d3a-2aee17b7d775/conf/hadoop-conf/hadoop-site.xml (No such file or directory)
Stack Trace:
java.io.FileNotFoundException: /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testHive2Action/dfe1b44f-ed70-4414-9d3a-2aee17b7d775/conf/hadoop-conf/hadoop-site.xml (No such file or directory)
at org.apache.oozie.action.hadoop.TestHive2ActionExecutor.setUp(TestHive2ActionExecutor.java:56)
FAILED: org.apache.oozie.action.hadoop.TestHive2ActionExecutor.testHive2ActionFails
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testHive2ActionFails/3a9ab56b-2acf-4f13-bd66-d5430eb61bff]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testHive2ActionFails/3a9ab56b-2acf-4f13-bd66-d5430eb61bff]
at org.apache.oozie.action.hadoop.TestHive2ActionExecutor.setUp(TestHive2ActionExecutor.java:56)
FAILED: org.apache.oozie.action.hadoop.TestHive2ActionExecutor.testSetupMethodsForScript
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testSetupMethodsForScript/595216b5-9058-447d-b39d-8a1899f340ed]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/hive2/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestHive2ActionExecutor/testSetupMethodsForScript/595216b5-9058-447d-b39d-8a1899f340ed]
at org.apache.oozie.action.hadoop.TestHive2ActionExecutor.setUp(TestHive2ActionExecutor.java:56)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testPig
Error Message:
Timed out waiting for Mini HDFS Cluster to start
Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
at org.apache.oozie.action.hadoop.TestPigActionExecutor.setUp(TestPigActionExecutor.java:71)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testExecutionStatsWithMaxStatsSizeLimit
Error Message:
YARN App state for app application_1510688362402_0001 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510688362402_0001 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestPigActionExecutor.testExecutionStatsWithMaxStatsSizeLimit(TestPigActionExecutor.java:271)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testPigError
Error Message:
File /user/test/868bcf66-8fc3-41c1-ac93-615639ba503a/netty-3.6.2.Final.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/868bcf66-8fc3-41c1-ac93-615639ba503a/netty-3.6.2.Final.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testActionConfLoadDefaultResources
Error Message:
File /user/test/9b97d57e-d34d-4090-bed5-2ff78be3b9e7/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/9b97d57e-d34d-4090-bed5-2ff78be3b9e7/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testExecutionStatsWithRetrieveStatsFalse
Error Message:
File /user/test/90f5d1b3-7ceb-4785-8f90-a4c6033bd786/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/90f5d1b3-7ceb-4785-8f90-a4c6033bd786/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testExternalChildIds
Error Message:
File /user/test/f457a10e-7377-4792-acf6-667bace2275e/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/f457a10e-7377-4792-acf6-667bace2275e/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testExecutionStats
Error Message:
File /user/test/3d86af28-3dad-4c08-8b35-b4f9d5a4504e/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/3d86af28-3dad-4c08-8b35-b4f9d5a4504e/app/script.pig could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigActionExecutor.testUdfPig
Error Message:
File /user/test/fc0cbeb7-8767-4b3d-a250-79c685e5293e/udf.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/test/fc0cbeb7-8767-4b3d-a250-79c685e5293e/udf.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestPigMain.testJobIDPattern
Error Message:
Cannot delete /user/test/b754fe31-1f53-4dc7-9da3-9dc249d06348. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
Cannot delete /user/test/b754fe31-1f53-4dc7-9da3-9dc249d06348. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
at org.apache.oozie.action.hadoop.TestPigMain.setUp(TestPigMain.java:47)
FAILED: org.apache.oozie.action.hadoop.TestPigMain.testPig_withNullExternalID
Error Message:
Cannot delete /user/test/3fe696da-3175-4cd5-b923-b0ada0e7c504. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
Cannot delete /user/test/3fe696da-3175-4cd5-b923-b0ada0e7c504. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
at org.apache.oozie.action.hadoop.TestPigMain.setUp(TestPigMain.java:47)
FAILED: org.apache.oozie.action.hadoop.TestPigMain.testEmbeddedPigWithinPython
Error Message:
Cannot delete /user/test/2221b27a-36aa-4985-b631-cd90f29c90d2. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
Cannot delete /user/test/2221b27a-36aa-4985-b631-cd90f29c90d2. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
at org.apache.oozie.action.hadoop.TestPigMain.setUp(TestPigMain.java:47)
FAILED: org.apache.oozie.action.hadoop.TestPigMain.testPigScript
Error Message:
Cannot delete /user/test/66853dc6-7337-4d6b-83ff-cb197a006d1d. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
Cannot delete /user/test/66853dc6-7337-4d6b-83ff-cb197a006d1d. Name node is in safe mode.
Resources are low on NN. Please add or free up more resources then turn off safe mode manually. NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode. Use "hdfs dfsadmin -safemode leave" to turn safe mode off.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:3967)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:3925)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3909)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:786)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:589)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
at org.apache.oozie.action.hadoop.TestPigMain.setUp(TestPigMain.java:47)
FAILED: org.apache.oozie.action.hadoop.TestSparkActionExecutor.testSetupMethodsWithSparkConfiguration
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused
FAILED: org.apache.oozie.action.hadoop.TestSparkActionExecutor.testSetupMethods
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused
FAILED: org.apache.oozie.action.hadoop.TestSparkMain.testJobIDPattern
Error Message:
File /user/jenkins/target/MiniMRCluster_377820428-tmpDir/MRAppJar.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/jenkins/target/MiniMRCluster_377820428-tmpDir/MRAppJar.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
FAILED: org.apache.oozie.action.hadoop.TestSparkMain.testPatterns
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused
FAILED: org.apache.oozie.action.hadoop.TestSparkMain.testMain
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused
FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSetupMethods
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSetupMethods/862724c7-3f69-47d9-92df-e97beb5687a3]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSetupMethods/862724c7-3f69-47d9-92df-e97beb5687a3]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithUberJarEnabled
Error Message:
YARN App state for app application_1510687948111_0001 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510687948111_0001 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestMapReduceActionExecutor._testSubmit(TestMapReduceActionExecutor.java:400)
at org.apache.oozie.action.hadoop.TestMapReduceActionExecutor._testMapReduceWithUberJar(TestMapReduceActionExecutor.java:727)
at org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithUberJarEnabled(TestMapReduceActionExecutor.java:815)
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testSetMapredJobName
Error Message:
YARN App state for app application_1510687948111_0002 expected:<FINISHED> but was:<ACCEPTED>
Stack Trace:
junit.framework.AssertionFailedError: YARN App state for app application_1510687948111_0002 expected:<FINISHED> but was:<ACCEPTED>
at org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testSetMapredJobName(TestMapReduceActionExecutor.java:1144)
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithUberJarDisabled
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithUberJarDisabled/2ba87b17-5f6f-4287-8e03-43664c11f4d8]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithUberJarDisabled/2ba87b17-5f6f-4287-8e03-43664c11f4d8]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testPipes
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testPipes/36db0ca4-f545-4a28-86a2-e667d11df8d1]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testPipes/36db0ca4-f545-4a28-86a2-e667d11df8d1]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testCommaSeparatedFilesAndArchives
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testCommaSeparatedFilesAndArchives/76ee0e80-b576-48d0-9d56-f00c5638491f]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testCommaSeparatedFilesAndArchives/76ee0e80-b576-48d0-9d56-f00c5638491f]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testEndWithoutConfiguration
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testEndWithoutConfiguration/2501aaa4-5554-47cd-8d11-7b9a808dad2a]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testEndWithoutConfiguration/2501aaa4-5554-47cd-8d11-7b9a808dad2a]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithConfigClass
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClass/153cc883-0909-4414-bd61-d3e20baba80b]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClass/153cc883-0909-4414-bd61-d3e20baba80b]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithCredentials
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithCredentials/bf093427-1f52-473b-a1b4-1e3f9ef0e00a]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithCredentials/bf093427-1f52-473b-a1b4-1e3f9ef0e00a]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testSetupMethods
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetupMethods/d66ade73-ea11-4ab5-b52f-ef50a0aeed1c]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetupMethods/d66ade73-ea11-4ab5-b52f-ef50a0aeed1c]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testJobNameSetForMapReduceChild
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testJobNameSetForMapReduceChild/55fe2e93-2951-4abb-8aa7-57a74d555455]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testJobNameSetForMapReduceChild/55fe2e93-2951-4abb-8aa7-57a74d555455]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testStreamingConfOverride
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testStreamingConfOverride/7fc211f6-87d1-4fee-9f53-2f176448d390]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testStreamingConfOverride/7fc211f6-87d1-4fee-9f53-2f176448d390]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testConfigDefaultPropsToAction
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testConfigDefaultPropsToAction/528afd7d-2cdd-4df6-a986-ca225dd3b71b]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testConfigDefaultPropsToAction/528afd7d-2cdd-4df6-a986-ca225dd3b71b]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithConfigClassThrowException
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClassThrowException/99b7679d-d1b1-4340-8555-0affdf3ca9ee]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClassThrowException/99b7679d-d1b1-4340-8555-0affdf3ca9ee]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testStreaming
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testStreaming/db8fb176-bfff-4ce9-bb25-40f701e29c0d]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testStreaming/db8fb176-bfff-4ce9-bb25-40f701e29c0d]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testSetExecutionStats_when_user_has_specified_stats_write_TRUE
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetExecutionStats_when_user_has_specified_stats_write_TRUE/72b4b968-70c5-485a-8e21-5c2605ca3ff7]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetExecutionStats_when_user_has_specified_stats_write_TRUE/72b4b968-70c5-485a-8e21-5c2605ca3ff7]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testSetExecutionStats_when_user_has_specified_stats_write_FALSE
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetExecutionStats_when_user_has_specified_stats_write_FALSE/d713e740-58d8-49a3-a6f5-4eb22d1fccc3]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testSetExecutionStats_when_user_has_specified_stats_write_FALSE/d713e740-58d8-49a3-a6f5-4eb22d1fccc3]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceWithConfigClassNotFound
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClassNotFound/311829fb-9709-49f7-aeb8-366c162b564d]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceWithConfigClassNotFound/311829fb-9709-49f7-aeb8-366c162b564d]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduce
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduce/6a5650f3-f190-4074-b6ec-eb3945753f8b]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduce/6a5650f3-f190-4074-b6ec-eb3945753f8b]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceActionError
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceActionError/486296f9-c664-4aae-9d5b-8bb1b5e417dc]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceActionError/486296f9-c664-4aae-9d5b-8bb1b5e417dc]
FAILED: org.apache.oozie.action.hadoop.TestMapReduceActionExecutor.testMapReduceActionKill
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceActionKill/ba54ffc3-fe55-449f-ae17-741e32f9fb01]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/streaming/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestMapReduceActionExecutor/testMapReduceActionKill/ba54ffc3-fe55-449f-ae17-741e32f9fb01]
FAILED: org.apache.oozie.tools.TestOozieSharelibCLI.testOozieSharelibCLICreate
Error Message:
File /user/jenkins/target/MiniMRCluster_667895151-tmpDir/MRAppJar.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
Stack Trace:
org.apache.hadoop.ipc.RemoteException:
File /user/jenkins/target/MiniMRCluster_667895151-tmpDir/MRAppJar.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)

FAILED: org.apache.oozie.test.TestWorkflow.testParallelFsAndShellWorkflowCompletesSuccessfully
Error Message:
expected:<SUCCEEDED> but was:<RUNNING>
Stack Trace:
junit.framework.AssertionFailedError: expected:<SUCCEEDED> but was:<RUNNING>
        at org.apache.oozie.test.TestWorkflow.runWorkflowFromFile(TestWorkflow.java:167)
        at org.apache.oozie.test.TestWorkflow.testParallelFsAndShellWorkflowCompletesSuccessfully(TestWorkflow.java:117)

FAILED: org.apache.oozie.test.TestWorkflowRetries.testParallelFsAndShellWorkflowCompletesSuccessfully
Error Message:
expected:<SUCCEEDED> but was:<RUNNING>
Stack Trace:
junit.framework.AssertionFailedError: expected:<SUCCEEDED> but was:<RUNNING>

FAILED: org.apache.oozie.action.hadoop.TestPyspark.testPyspark
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused

FAILED: org.apache.oozie.action.hadoop.TestSparkActionExecutor.testSparkAction
Error Message:
Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack Trace:
java.net.ConnectException: Call From asf918.gq1.ygridcore.net/67.195.81.138 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Caused by: java.net.ConnectException: Connection refused

FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithRedundantArgsAndFreeFormQuery
Error Message:
Cannot create directory /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/build/test/data/dfs/name1/current
Stack Trace:
java.io.IOException: Cannot create directory /home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/build/test/data/dfs/name1/current

FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithArgsAndFreeFormQuery
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithArgsAndFreeFormQuery/64f39b9c-0c9d-4dd7-81b6-7943a54798f8]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithArgsAndFreeFormQuery/64f39b9c-0c9d-4dd7-81b6-7943a54798f8]

FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithBadRedundantArgsAndFreeFormQuery
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithBadRedundantArgsAndFreeFormQuery/1a466781-2517-43c1-babf-7e0156737bf8]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithBadRedundantArgsAndFreeFormQuery/1a466781-2517-43c1-babf-7e0156737bf8]

FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithBadCommand
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithBadCommand/626a6841-b8a6-426b-99c9-3c6488dd27c6]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithBadCommand/626a6841-b8a6-426b-99c9-3c6488dd27c6]

FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopEval
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopEval/5a0416a0-e434-4b80-b9cd-c40eeefe9340]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopEval/5a0416a0-e434-4b80-b9cd-c40eeefe9340]

FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopActionWithRedundantPrefix
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithRedundantPrefix/f1bf5dbf-61a8-4722-b921-1d5c64a5bae0]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopActionWithRedundantPrefix/f1bf5dbf-61a8-4722-b921-1d5c64a5bae0]

FAILED: org.apache.oozie.action.hadoop.TestSqoopActionExecutor.testSqoopAction
Error Message:
Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopAction/10ff4ddd-ff25-42e4-8dfc-4ee6939dd5c1]
Stack Trace:
java.lang.RuntimeException: Could not create testcase dir[/home/jenkins/jenkins-slave/workspace/PreCommit-OOZIE-Build@2/sharelib/sqoop/target/test-data/oozietests/org.apache.oozie.action.hadoop.TestSqoopActionExecutor/testSqoopAction/10ff4ddd-ff25-42e4-8dfc-4ee6939dd5c1]