Posted to dev@mesos.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2013/12/10 07:52:09 UTC
Build failed in Jenkins: Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME #1792
See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/1792/>
------------------------------------------
[...truncated 262 lines...]
Started by an SCM change
Building remotely on ubuntu3 in workspace <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/ws/>
Fetching changes from the remote Git repository
Fetching upstream changes from https://git-wip-us.apache.org/repos/asf/mesos.git
Checking out Revision 4f99d86febbfc43d5c233961eb7860d766b53d72 (origin/master)
Cleaning workspace
Resetting working tree
FATAL: Command "clean -fdx" returned status code 1:
stdout: Removing build/
stderr: warning: failed to remove build/
hudson.plugins.git.GitException: Command "clean -fdx" returned status code 1:
stdout: Removing build/
stderr: warning: failed to remove build/
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:981)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:961)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:957)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:877)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:887)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.clean(CliGitAPIImpl.java:347)
at hudson.plugins.git.GitAPI.clean(GitAPI.java:251)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.perform(RemoteInvocationHandler.java:299)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:280)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:239)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:328)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:701)
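The `clean -fdx` failure above usually means the workspace holds files the Jenkins user cannot delete, e.g. a `build/` subtree left behind with its write bit stripped (or owned by another user). A minimal sketch of a cleanup that recovers from the permission case, using Python's `shutil.rmtree` error hook; the directory layout below is illustrative, not taken from this build:

```python
import os
import shutil
import stat
import tempfile

def force_rmtree(path):
    """Remove a tree even when a directory lacks write permission,
    which is the case plain 'git clean -fdx' reports as
    'warning: failed to remove build/'."""
    def make_writable(func, target, exc_info):
        # Restore write/execute bits on the parent and the target, then retry.
        os.chmod(os.path.dirname(target), stat.S_IRWXU)
        os.chmod(target, stat.S_IRWXU)
        func(target)
    shutil.rmtree(path, onerror=make_writable)

# Reproduce the failure mode: a read-only 'build/' directory in a workspace.
workspace = tempfile.mkdtemp()
build = os.path.join(workspace, "build")
os.mkdir(build)
open(os.path.join(build, "artifact.o"), "w").close()
os.chmod(build, stat.S_IRUSR | stat.S_IXUSR)  # no write bit: unlink fails here

force_rmtree(workspace)
print(os.path.exists(workspace))  # prints False
```

The equivalent fix on the build slave itself is a `chmod -R u+w` (or `rm -rf` as the owning user) on the workspace before the SCM clean step runs.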
Jenkins build is back to normal: Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME #1795
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/1795/changes>
Build failed in Jenkins: Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME #1794
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/1794/changes>
Changes:
[bmahler] Fixed the python tests in the presence of multiple eggs.
[niklas] Added stringifier.hpp to stout/Makefile.am.
[rossgit] Printed timezone, full ISO 8601, with timestamps.
[benjamin.hindman] Added min/max functions to stout which take two Options.
------------------------------------------
[...truncated 23198 lines...]
I1211 11:50:43.580816 3921 group.cpp:675] Syncing group operations: queue size (joins, cancels, datas) = (1, 0, 0)
I1211 11:50:43.580822 3921 group.cpp:337] Trying to create path '/znode' in ZooKeeper
2013-12-11 11:50:43,581:3899(0x2ad2c4b15700):ZOO_INFO@check_events@1632: session establishment complete on server [127.0.0.1:41287], sessionId=0x142e17e93840008, negotiated timeout=10000
I1211 11:50:43.581209 3924 group.cpp:280] Group process ((1533)@67.195.138.60:40217) connected to ZooKeeper
I1211 11:50:43.581230 3924 group.cpp:675] Syncing group operations: queue size (joins, cancels, datas) = (0, 0, 0)
I1211 11:50:43.581238 3924 group.cpp:337] Trying to create path '/znode' in ZooKeeper
2013-12-11 11:50:43,581:3899(0x2ad2c5319700):ZOO_INFO@check_events@1632: session establishment complete on server [127.0.0.1:41287], sessionId=0x142e17e93840009, negotiated timeout=10000
I1211 11:50:43.581715 3920 group.cpp:280] Group process ((1540)@67.195.138.60:40217) connected to ZooKeeper
I1211 11:50:43.581735 3920 group.cpp:675] Syncing group operations: queue size (joins, cancels, datas) = (0, 0, 0)
I1211 11:50:43.581745 3920 group.cpp:337] Trying to create path '/znode' in ZooKeeper
I1211 11:50:43.608192 3924 contender.cpp:203] New candidate (id='4', data='master@67.195.138.60:40217') has entered the contest for leadership
I1211 11:50:43.609326 3919 detector.cpp:130] Detected a new leader (id='4')
I1211 11:50:43.609432 3920 group.cpp:562] Trying to get '/znode/0000000004' in ZooKeeper
I1211 11:50:43.609480 3925 detector.cpp:130] Detected a new leader (id='4')
I1211 11:50:43.609558 3923 group.cpp:562] Trying to get '/znode/0000000004' in ZooKeeper
I1211 11:50:43.610683 3920 detector.cpp:322] A new leading master (UPID=master@67.195.138.60:40217) is detected
I1211 11:50:43.610723 3923 detector.cpp:322] A new leading master (UPID=master@67.195.138.60:40217) is detected
I1211 11:50:43.610752 3924 master.cpp:746] The newly elected leader is master@67.195.138.60:40217
I1211 11:50:43.610766 3924 master.cpp:750] Elected as the leading master!
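The contender/detector lines above follow the standard ZooKeeper leader-election recipe: each candidate creates an ephemeral sequential znode under the election path, and the candidate whose znode carries the lowest sequence number (here id='4', i.e. '/znode/0000000004') becomes the leading master. The selection step can be sketched as follows; the znode names are illustrative and no real ZooKeeper client is involved:

```python
def elect_leader(children):
    """Given the child znodes of the election path, the leader is the
    candidate with the smallest sequence suffix (ZooKeeper assigns these
    suffixes monotonically to sequential nodes)."""
    if not children:
        return None  # no live candidates: "No new leader is elected"
    return min(children, key=lambda name: int(name[-10:]))

# Candidate 4 wins because 0000000004 is the lowest live sequence number.
members = ["0000000005", "0000000004", "0000000006"]
print(elect_leader(members))  # prints 0000000004
```

When the leader's ephemeral znode disappears (as later in this log, "The current leader (id=4) is lost"), the remaining candidates re-run this selection over the surviving children.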
I1211 11:50:43.610805 3925 slave.cpp:497] New master detected at master@67.195.138.60:40217
I1211 11:50:43.610729 3922 detector.cpp:130] Detected a new leader (id='4')
I1211 11:50:43.610867 3920 status_update_manager.cpp:160] New master detected at master@67.195.138.60:40217
I1211 11:50:43.610894 3922 group.cpp:562] Trying to get '/znode/0000000004' in ZooKeeper
I1211 11:50:43.610908 3925 slave.cpp:524] Detecting new master
I1211 11:50:43.610954 3920 master.cpp:1366] Attempting to register slave on janus.apache.org at slave(136)@67.195.138.60:40217
I1211 11:50:43.610965 3920 master.cpp:2628] Adding slave 201312111150-1015726915-40217-3899-0 at janus.apache.org with cpus(*):2; mem(*):1024; disk(*):23038; ports(*):[31000-32000]
I1211 11:50:43.611063 3924 slave.cpp:542] Registered with master master@67.195.138.60:40217; given slave ID 201312111150-1015726915-40217-3899-0
I1211 11:50:43.611316 3919 hierarchical_allocator_process.hpp:445] Added slave 201312111150-1015726915-40217-3899-0 (janus.apache.org) with cpus(*):2; mem(*):1024; disk(*):23038; ports(*):[31000-32000] (and cpus(*):2; mem(*):1024; disk(*):23038; ports(*):[31000-32000] available)
I1211 11:50:43.611376 3919 hierarchical_allocator_process.hpp:708] Performed allocation for slave 201312111150-1015726915-40217-3899-0 in 16175ns
I1211 11:50:43.611950 3922 detector.cpp:322] A new leading master (UPID=master@67.195.138.60:40217) is detected
I1211 11:50:43.611979 3922 sched.cpp:207] New master detected at master@67.195.138.60:40217
I1211 11:50:43.611991 3922 sched.cpp:260] Authenticating with master master@67.195.138.60:40217
I1211 11:50:43.612040 3922 sched.cpp:229] Detecting new master
I1211 11:50:43.612061 3920 authenticatee.hpp:124] Creating new client SASL connection
I1211 11:50:43.612289 3925 master.cpp:1849] Authenticating framework at scheduler(131)@67.195.138.60:40217
I1211 11:50:43.612387 3920 authenticator.hpp:140] Creating new server SASL connection
I1211 11:50:43.612515 3920 authenticatee.hpp:212] Received SASL authentication mechanisms: CRAM-MD5
I1211 11:50:43.612534 3920 authenticatee.hpp:238] Attempting to authenticate with mechanism 'CRAM-MD5'
I1211 11:50:43.612565 3920 authenticator.hpp:243] Received SASL authentication start
I1211 11:50:43.612645 3920 authenticator.hpp:325] Authentication requires more steps
I1211 11:50:43.612671 3920 authenticatee.hpp:258] Received SASL authentication step
I1211 11:50:43.612717 3920 authenticator.hpp:271] Received SASL authentication step
I1211 11:50:43.612738 3920 auxprop.cpp:81] Request to lookup properties for user: 'test-principal' realm: 'janus.apache.org' server FQDN: 'janus.apache.org' SASL_AUXPROP_VERIFY_AGAINST_HASH: false SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: false
I1211 11:50:43.612746 3920 auxprop.cpp:153] Looking up auxiliary property '*userPassword'
I1211 11:50:43.612764 3920 auxprop.cpp:153] Looking up auxiliary property '*cmusaslsecretCRAM-MD5'
I1211 11:50:43.612773 3920 auxprop.cpp:81] Request to lookup properties for user: 'test-principal' realm: 'janus.apache.org' server FQDN: 'janus.apache.org' SASL_AUXPROP_VERIFY_AGAINST_HASH: false SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: true
I1211 11:50:43.612782 3920 auxprop.cpp:103] Skipping auxiliary property '*userPassword' since SASL_AUXPROP_AUTHZID == true
I1211 11:50:43.612787 3920 auxprop.cpp:103] Skipping auxiliary property '*cmusaslsecretCRAM-MD5' since SASL_AUXPROP_AUTHZID == true
I1211 11:50:43.612798 3920 authenticator.hpp:317] Authentication success
I1211 11:50:43.612831 3920 master.cpp:1889] Successfully authenticated framework at scheduler(131)@67.195.138.60:40217
I1211 11:50:43.612841 3922 authenticatee.hpp:298] Authentication success
I1211 11:50:43.612934 3924 sched.cpp:334] Successfully authenticated with master master@67.195.138.60:40217
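The SASL exchange logged above (mechanisms offered, start, "requires more steps", step, success) is the CRAM-MD5 challenge/response of RFC 2195: the server sends a challenge, and the client replies with its username followed by a lowercase-hex HMAC-MD5 of the challenge keyed by the shared secret. A sketch of the client-side computation; the challenge and secret below are made up, not taken from this log:

```python
import hashlib
import hmac

def cram_md5_response(username, secret, challenge):
    """Client side of CRAM-MD5 (RFC 2195): username, a space, then the
    lowercase hex HMAC-MD5 digest of the server's challenge."""
    digest = hmac.new(secret.encode(), challenge.encode(), hashlib.md5)
    return "%s %s" % (username, digest.hexdigest())

# Hypothetical values; the real challenge is generated by the server.
challenge = "<1896.697170952@postoffice.example.net>"
print(cram_md5_response("test-principal", "test-secret", challenge))
```

The 'test-principal' user and the auxprop lookups of '*userPassword' / '*cmusaslsecretCRAM-MD5' in the log are the Cyrus SASL library resolving the server-side copy of that shared secret.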
I1211 11:50:43.613029 3920 master.cpp:801] Received registration request from scheduler(131)@67.195.138.60:40217
I1211 11:50:43.613206 3920 master.cpp:819] Registering framework 201312111150-1015726915-40217-3899-0000 at scheduler(131)@67.195.138.60:40217
I1211 11:50:43.613284 3920 hierarchical_allocator_process.hpp:332] Added framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.613282 3925 sched.cpp:383] Framework registered with 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.613334 3925 sched.cpp:397] Scheduler::registered took 16166ns
I1211 11:50:43.613348 3920 hierarchical_allocator_process.hpp:752] Offering cpus(*):2; mem(*):1024; disk(*):23038; ports(*):[31000-32000] on slave 201312111150-1015726915-40217-3899-0 to framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.613469 3920 hierarchical_allocator_process.hpp:688] Performed allocation for 1 slaves in 159529ns
I1211 11:50:43.613512 3922 master.hpp:437] Adding offer 201312111150-1015726915-40217-3899-0 with resources cpus(*):2; mem(*):1024; disk(*):23038; ports(*):[31000-32000] on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:43.613558 3922 master.cpp:1804] Sending 1 offers to framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.613847 3923 sched.cpp:517] Scheduler::resourceOffers took 193621ns
I1211 11:50:43.613955 3922 master.cpp:2141] Processing reply for offer 201312111150-1015726915-40217-3899-0 on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org) for framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.614043 3922 master.hpp:409] Adding task 0 with resources cpus(*):1; mem(*):500 on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:43.614069 3922 master.cpp:2265] Launching task 0 of framework 201312111150-1015726915-40217-3899-0000 with resources cpus(*):1; mem(*):500 on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:43.614131 3923 slave.cpp:727] Got assigned task 0 for framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.614204 3922 master.hpp:447] Removing offer 201312111150-1015726915-40217-3899-0 with resources cpus(*):2; mem(*):1024; disk(*):23038; ports(*):[31000-32000] on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:43.614200 3925 hierarchical_allocator_process.hpp:547] Framework 201312111150-1015726915-40217-3899-0000 left cpus(*):1; mem(*):524; disk(*):23038; ports(*):[31000-32000] unused on slave 201312111150-1015726915-40217-3899-0
I1211 11:50:43.614277 3923 slave.cpp:836] Launching task 0 for framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.614336 3925 hierarchical_allocator_process.hpp:590] Framework 201312111150-1015726915-40217-3899-0000 filtered slave 201312111150-1015726915-40217-3899-0 for 5secs
I1211 11:50:43.616286 3923 slave.cpp:946] Queuing task '0' for executor 'default' of framework '201312111150-1015726915-40217-3899-0000'
I1211 11:50:43.616341 3923 slave.cpp:466] Successfully attached file '/tmp/AllocatorZooKeeperTest_0_SlaveReregistersFirst_cUKyLp/slaves/201312111150-1015726915-40217-3899-0/frameworks/201312111150-1015726915-40217-3899-0000/executors/default/runs/0ea037dd-2187-4439-a5ac-23448a7dce9c'
I1211 11:50:43.617944 3925 exec.cpp:178] Executor started at: executor(45)@67.195.138.60:40217 with pid 3899
I1211 11:50:43.617969 3920 slave.cpp:2089] Monitoring executor default of framework 201312111150-1015726915-40217-3899-0000 forked at pid 3899
I1211 11:50:43.618115 3920 slave.cpp:1422] Got registration for executor 'default' of framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.618341 3920 slave.cpp:1543] Flushing queued task 0 for executor 'default' of framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.618356 3919 exec.cpp:202] Executor registered on slave 201312111150-1015726915-40217-3899-0
I1211 11:50:43.619914 3919 exec.cpp:214] Executor::registered took 12486ns
I1211 11:50:43.619959 3919 exec.cpp:289] Executor asked to run task '0'
I1211 11:50:43.619994 3919 exec.cpp:298] Executor::launchTask took 21674ns
I1211 11:50:43.621526 3919 exec.cpp:521] Executor sending status update TASK_RUNNING (UUID: e70df6fa-a6bb-4838-93fb-cea872c39dcf) for task 0 of framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.621593 3919 slave.cpp:1756] Handling status update TASK_RUNNING (UUID: e70df6fa-a6bb-4838-93fb-cea872c39dcf) for task 0 of framework 201312111150-1015726915-40217-3899-0000 from executor(45)@67.195.138.60:40217
I1211 11:50:43.621706 3921 status_update_manager.cpp:312] Received status update TASK_RUNNING (UUID: e70df6fa-a6bb-4838-93fb-cea872c39dcf) for task 0 of framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.621726 3921 status_update_manager.cpp:491] Creating StatusUpdate stream for task 0 of framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.621805 3921 status_update_manager.cpp:365] Forwarding status update TASK_RUNNING (UUID: e70df6fa-a6bb-4838-93fb-cea872c39dcf) for task 0 of framework 201312111150-1015726915-40217-3899-0000 to master@67.195.138.60:40217
I1211 11:50:43.621908 3919 master.cpp:1552] Status update TASK_RUNNING (UUID: e70df6fa-a6bb-4838-93fb-cea872c39dcf) for task 0 of framework 201312111150-1015726915-40217-3899-0000 from slave(136)@67.195.138.60:40217
I1211 11:50:43.621911 3925 slave.cpp:1875] Status update manager successfully handled status update TASK_RUNNING (UUID: e70df6fa-a6bb-4838-93fb-cea872c39dcf) for task 0 of framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.622222 3919 master.cpp:556] Master terminating
I1211 11:50:43.622230 3925 slave.cpp:1881] Sending acknowledgement for status update TASK_RUNNING (UUID: e70df6fa-a6bb-4838-93fb-cea872c39dcf) for task 0 of framework 201312111150-1015726915-40217-3899-0000 to executor(45)@67.195.138.60:40217
I1211 11:50:43.621994 3920 sched.cpp:608] Scheduler::statusUpdate took 34178ns
I1211 11:50:43.622262 3899 master.cpp:209] Shutting down master
I1211 11:50:43.622302 3899 master.hpp:427] Removing task 0 with resources cpus(*):1; mem(*):500 on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:43.622436 3924 exec.cpp:335] Executor received status update acknowledgement e70df6fa-a6bb-4838-93fb-cea872c39dcf for task 0 of framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.622448 3925 slave.cpp:1956] master@67.195.138.60:40217 exited
W1211 11:50:43.622463 3925 slave.cpp:1959] Master disconnected! Waiting for a new master to be elected
I1211 11:50:43.622493 3899 master.cpp:252] Removing slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:43.622714 3924 status_update_manager.cpp:390] Received status update acknowledgement (UUID: e70df6fa-a6bb-4838-93fb-cea872c39dcf) for task 0 of framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.622696 3919 hierarchical_allocator_process.hpp:637] Recovered cpus(*):1; mem(*):500 (total allocatable: cpus(*):2; mem(*):1024; disk(*):23038; ports(*):[31000-32000]) on slave 201312111150-1015726915-40217-3899-0 from framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.622787 3924 slave.cpp:1362] Status update manager successfully handled status update acknowledgement (UUID: e70df6fa-a6bb-4838-93fb-cea872c39dcf) for task 0 of framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:43.690603 3925 contender.cpp:172] Now cancelling the membership: 4
2013-12-11 11:50:43,690:3899(0x2ad12c7fa7c0):ZOO_INFO@zookeeper_close@2304: Closing zookeeper sessionId=0x142e17e93840007 to [127.0.0.1:41287]
2013-12-11 11:50:43,691:3899(0x2ad12c7fa7c0):ZOO_INFO@zookeeper_close@2304: Closing zookeeper sessionId=0x142e17e93840006 to [127.0.0.1:41287]
I1211 11:50:43.692482 3925 master.cpp:284] Master started on 67.195.138.60:40217
I1211 11:50:43.692543 3925 master.cpp:298] Master ID: 201312111150-1015726915-40217-3899
I1211 11:50:43.692564 3925 master.cpp:301] Master only allowing authenticated frameworks to register!
2013-12-11 11:50:43,692:3899(0x2ad12d404700):ZOO_INFO@log_env@658: Client environment:zookeeper.version=zookeeper C client 3.3.4
2013-12-11 11:50:43,692:3899(0x2ad12d404700):ZOO_INFO@log_env@662: Client environment:host.name=janus
2013-12-11 11:50:43,692:3899(0x2ad12d404700):ZOO_INFO@log_env@669: Client environment:os.name=Linux
2013-12-11 11:50:43,692:3899(0x2ad12d404700):ZOO_INFO@log_env@670: Client environment:os.arch=3.2.0-57-generic
2013-12-11 11:50:43,692:3899(0x2ad12d404700):ZOO_INFO@log_env@671: Client environment:os.version=#87-Ubuntu SMP Tue Nov 12 21:35:10 UTC 2013
2013-12-11 11:50:43,693:3899(0x2ad12d404700):ZOO_INFO@log_env@679: Client environment:user.name=(null)
I1211 11:50:43.693074 3920 master.cpp:84] No whitelist given. Advertising offers for all slaves
2013-12-11 11:50:43,693:3899(0x2ad12d404700):ZOO_INFO@log_env@687: Client environment:user.home=/home/jenkins
2013-12-11 11:50:43,693:3899(0x2ad12d404700):ZOO_INFO@log_env@699: Client environment:user.dir=<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/ws/build/src>
2013-12-11 11:50:43,693:3899(0x2ad12d404700):ZOO_INFO@zookeeper_init@727: Initiating client connection, host=127.0.0.1:41287 sessionTimeout=10000 watcher=0x2ad12a0ae2d0 sessionId=0 sessionPasswd=<null> context=0x2ad138153ee0 flags=0
I1211 11:50:43.693375 3920 contender.cpp:122] Joining the ZK group with data: 'master@67.195.138.60:40217'
I1211 11:50:43.693426 3925 hierarchical_allocator_process.hpp:302] Initializing hierarchical allocator process with master : master@67.195.138.60:40217
2013-12-11 11:50:43,693:3899(0x2ad2c4f17700):ZOO_INFO@check_events@1585: initiated connection to server [127.0.0.1:41287]
2013-12-11 11:50:43,694:3899(0x2ad12c9ff700):ZOO_INFO@log_env@658: Client environment:zookeeper.version=zookeeper C client 3.3.4
2013-12-11 11:50:43,694:3899(0x2ad12c9ff700):ZOO_INFO@log_env@662: Client environment:host.name=janus
2013-12-11 11:50:43,694:3899(0x2ad12c9ff700):ZOO_INFO@log_env@669: Client environment:os.name=Linux
2013-12-11 11:50:43,694:3899(0x2ad12c9ff700):ZOO_INFO@log_env@670: Client environment:os.arch=3.2.0-57-generic
2013-12-11 11:50:43,694:3899(0x2ad12c9ff700):ZOO_INFO@log_env@671: Client environment:os.version=#87-Ubuntu SMP Tue Nov 12 21:35:10 UTC 2013
2013-12-11 11:50:43,694:3899(0x2ad12c9ff700):ZOO_INFO@log_env@679: Client environment:user.name=(null)
2013-12-11 11:50:43,694:3899(0x2ad12c9ff700):ZOO_INFO@log_env@687: Client environment:user.home=/home/jenkins
2013-12-11 11:50:43,694:3899(0x2ad12c9ff700):ZOO_INFO@log_env@699: Client environment:user.dir=<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/ws/build/src>
2013-12-11 11:50:43,694:3899(0x2ad12c9ff700):ZOO_INFO@zookeeper_init@727: Initiating client connection, host=127.0.0.1:41287 sessionTimeout=10000 watcher=0x2ad12a0ae2d0 sessionId=0 sessionPasswd=<null> context=0x2ad1541703d0 flags=0
2013-12-11 11:50:43,695:3899(0x2ad2e0401700):ZOO_INFO@check_events@1585: initiated connection to server [127.0.0.1:41287]
2013-12-11 11:50:43,769:3899(0x2ad2c4f17700):ZOO_INFO@check_events@1632: session establishment complete on server [127.0.0.1:41287], sessionId=0x142e17e9384000a, negotiated timeout=10000
I1211 11:50:43.769543 3918 group.cpp:280] Group process ((1548)@67.195.138.60:40217) connected to ZooKeeper
I1211 11:50:43.769573 3918 group.cpp:675] Syncing group operations: queue size (joins, cancels, datas) = (0, 0, 0)
I1211 11:50:43.769584 3918 group.cpp:337] Trying to create path '/znode' in ZooKeeper
2013-12-11 11:50:43,769:3899(0x2ad2e0401700):ZOO_INFO@check_events@1632: session establishment complete on server [127.0.0.1:41287], sessionId=0x142e17e9384000b, negotiated timeout=10000
I1211 11:50:43.770555 3921 group.cpp:280] Group process ((1546)@67.195.138.60:40217) connected to ZooKeeper
I1211 11:50:43.770611 3921 group.cpp:675] Syncing group operations: queue size (joins, cancels, datas) = (1, 0, 0)
I1211 11:50:43.770619 3921 group.cpp:337] Trying to create path '/znode' in ZooKeeper
I1211 11:50:43.770715 3923 detector.cpp:116] The current leader (id=4) is lost
I1211 11:50:43.770742 3923 detector.cpp:138] No new leader is elected after election
I1211 11:50:43.770786 3925 detector.cpp:116] The current leader (id=4) is lost
I1211 11:50:43.770794 3925 detector.cpp:138] No new leader is elected after election
I1211 11:50:43.770894 3919 slave.cpp:518] Lost leading master
I1211 11:50:43.770911 3919 slave.cpp:524] Detecting new master
I1211 11:50:43.771263 3923 sched.cpp:201] Scheduler::disconnected took 16955ns
I1211 11:50:43.771280 3923 sched.cpp:223] No master detected
I1211 11:50:43.771294 3923 sched.cpp:229] Detecting new master
I1211 11:50:43.802899 3923 contender.cpp:203] New candidate (id='6', data='master@67.195.138.60:40217') has entered the contest for leadership
I1211 11:50:43.803299 3924 detector.cpp:130] Detected a new leader (id='6')
I1211 11:50:43.803423 3924 group.cpp:562] Trying to get '/znode/0000000006' in ZooKeeper
I1211 11:50:43.803742 3918 detector.cpp:130] Detected a new leader (id='6')
I1211 11:50:43.803822 3919 group.cpp:562] Trying to get '/znode/0000000006' in ZooKeeper
I1211 11:50:43.803887 3920 detector.cpp:130] Detected a new leader (id='6')
I1211 11:50:43.803966 3920 group.cpp:562] Trying to get '/znode/0000000006' in ZooKeeper
I1211 11:50:43.804167 3925 detector.cpp:322] A new leading master (UPID=master@67.195.138.60:40217) is detected
I1211 11:50:43.804255 3924 slave.cpp:497] New master detected at master@67.195.138.60:40217
I1211 11:50:43.804312 3921 status_update_manager.cpp:160] New master detected at master@67.195.138.60:40217
I1211 11:50:43.804388 3924 slave.cpp:524] Detecting new master
I1211 11:50:43.804432 3924 detector.cpp:322] A new leading master (UPID=master@67.195.138.60:40217) is detected
W1211 11:50:43.804435 3919 master.cpp:1381] Ignoring re-register slave message from janus.apache.org since not elected yet
I1211 11:50:43.804484 3918 sched.cpp:207] New master detected at master@67.195.138.60:40217
I1211 11:50:43.804523 3918 sched.cpp:260] Authenticating with master master@67.195.138.60:40217
I1211 11:50:43.804577 3924 detector.cpp:322] A new leading master (UPID=master@67.195.138.60:40217) is detected
I1211 11:50:43.804594 3918 sched.cpp:229] Detecting new master
I1211 11:50:43.804611 3919 authenticatee.hpp:124] Creating new client SASL connection
I1211 11:50:43.804630 3921 master.cpp:746] The newly elected leader is master@67.195.138.60:40217
I1211 11:50:43.804646 3921 master.cpp:750] Elected as the leading master!
I1211 11:50:43.804795 3918 master.cpp:1849] Authenticating framework at scheduler(131)@67.195.138.60:40217
I1211 11:50:43.804893 3919 authenticator.hpp:140] Creating new server SASL connection
I1211 11:50:43.804998 3919 authenticatee.hpp:212] Received SASL authentication mechanisms: CRAM-MD5
I1211 11:50:43.805016 3919 authenticatee.hpp:238] Attempting to authenticate with mechanism 'CRAM-MD5'
I1211 11:50:43.805047 3919 authenticator.hpp:243] Received SASL authentication start
I1211 11:50:43.805110 3919 authenticator.hpp:325] Authentication requires more steps
I1211 11:50:43.805140 3919 authenticatee.hpp:258] Received SASL authentication step
I1211 11:50:43.805179 3919 authenticator.hpp:271] Received SASL authentication step
I1211 11:50:43.805197 3919 auxprop.cpp:81] Request to lookup properties for user: 'test-principal' realm: 'janus.apache.org' server FQDN: 'janus.apache.org' SASL_AUXPROP_VERIFY_AGAINST_HASH: false SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: false
I1211 11:50:43.805204 3919 auxprop.cpp:153] Looking up auxiliary property '*userPassword'
I1211 11:50:43.805218 3919 auxprop.cpp:153] Looking up auxiliary property '*cmusaslsecretCRAM-MD5'
I1211 11:50:43.805228 3919 auxprop.cpp:81] Request to lookup properties for user: 'test-principal' realm: 'janus.apache.org' server FQDN: 'janus.apache.org' SASL_AUXPROP_VERIFY_AGAINST_HASH: false SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: true
I1211 11:50:43.805234 3919 auxprop.cpp:103] Skipping auxiliary property '*userPassword' since SASL_AUXPROP_AUTHZID == true
I1211 11:50:43.805239 3919 auxprop.cpp:103] Skipping auxiliary property '*cmusaslsecretCRAM-MD5' since SASL_AUXPROP_AUTHZID == true
I1211 11:50:43.805250 3919 authenticator.hpp:317] Authentication success
I1211 11:50:43.805312 3923 authenticatee.hpp:298] Authentication success
I1211 11:50:43.805320 3919 master.cpp:1889] Successfully authenticated framework at scheduler(131)@67.195.138.60:40217
I1211 11:50:43.805394 3923 sched.cpp:334] Successfully authenticated with master master@67.195.138.60:40217
2013-12-11 11:50:43,886:3899(0x2ad2c4d16700):ZOO_ERROR@handle_socket_error_msg@1579: Socket [127.0.0.1:33633] zk retcode=-4, errno=111(Connection refused): server refused to accept the client
I1211 11:50:44.372097 3921 monitor.cpp:193] Publishing resource usage for executor 'default' of framework '201312111150-1015726915-40217-3899-0000'
I1211 11:50:44.611578 3923 master.cpp:1455] Attempting to re-register slave 201312111150-1015726915-40217-3899-0 at slave(136)@67.195.138.60:40217 (janus.apache.org)
I1211 11:50:44.611616 3923 master.cpp:2628] Adding slave 201312111150-1015726915-40217-3899-0 at janus.apache.org with cpus(*):2; mem(*):1024; disk(*):23038; ports(*):[31000-32000]
I1211 11:50:44.611734 3923 master.hpp:409] Adding task 0 with resources cpus(*):1; mem(*):500 on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
W1211 11:50:44.611757 3923 master.cpp:2723] Possibly orphaned task 0 of framework 201312111150-1015726915-40217-3899-0000 running on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:44.611768 3925 slave.cpp:592] Re-registered with master master@67.195.138.60:40217
I1211 11:50:44.612068 3925 hierarchical_allocator_process.hpp:445] Added slave 201312111150-1015726915-40217-3899-0 (janus.apache.org) with cpus(*):2; mem(*):1024; disk(*):23038; ports(*):[31000-32000] (and cpus(*):1; mem(*):524; disk(*):23038; ports(*):[31000-32000] available)
I1211 11:50:44.612128 3925 hierarchical_allocator_process.hpp:708] Performed allocation for slave 201312111150-1015726915-40217-3899-0 in 10022ns
I1211 11:50:44.613550 3919 master.cpp:889] Re-registering framework 201312111150-1015726915-40217-3899-0000 at scheduler(131)@67.195.138.60:40217
I1211 11:50:44.613837 3921 sched.cpp:383] Framework registered with 201312111150-1015726915-40217-3899-0000
I1211 11:50:44.613876 3921 sched.cpp:397] Scheduler::registered took 14309ns
I1211 11:50:44.613873 3922 slave.cpp:1303] Updating framework 201312111150-1015726915-40217-3899-0000 pid to scheduler(131)@67.195.138.60:40217
I1211 11:50:44.613927 3920 hierarchical_allocator_process.hpp:332] Added framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:44.613980 3920 hierarchical_allocator_process.hpp:752] Offering cpus(*):1; mem(*):524; disk(*):23038; ports(*):[31000-32000] on slave 201312111150-1015726915-40217-3899-0 to framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:44.614135 3920 hierarchical_allocator_process.hpp:688] Performed allocation for 1 slaves in 188808ns
I1211 11:50:44.614210 3923 master.hpp:437] Adding offer 201312111150-1015726915-40217-3899-0 with resources cpus(*):1; mem(*):524; disk(*):23038; ports(*):[31000-32000] on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:44.614254 3923 master.cpp:1804] Sending 1 offers to framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:44.614434 3922 sched.cpp:517] Scheduler::resourceOffers took 56317ns
I1211 11:50:44.614625 3923 sched.cpp:719] Stopping framework '201312111150-1015726915-40217-3899-0000'
I1211 11:50:44.614647 3899 master.cpp:556] Master terminating
I1211 11:50:44.614671 3899 master.cpp:209] Shutting down master
I1211 11:50:44.614675 3923 slave.cpp:1956] master@67.195.138.60:40217 exited
W1211 11:50:44.614686 3923 slave.cpp:1959] Master disconnected! Waiting for a new master to be elected
I1211 11:50:44.614711 3899 master.hpp:427] Removing task 0 with resources cpus(*):1; mem(*):500 on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:44.614784 3899 master.hpp:447] Removing offer 201312111150-1015726915-40217-3899-0 with resources cpus(*):1; mem(*):524; disk(*):23038; ports(*):[31000-32000] on slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:44.614889 3899 master.cpp:252] Removing slave 201312111150-1015726915-40217-3899-0 (janus.apache.org)
I1211 11:50:44.614914 3922 hierarchical_allocator_process.hpp:637] Recovered cpus(*):1; mem(*):500 (total allocatable: cpus(*):1; mem(*):500) on slave 201312111150-1015726915-40217-3899-0 from framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:44.615298 3899 contender.cpp:172] Now cancelling the membership: 6
2013-12-11 11:50:44,615:3899(0x2ad12c7fa7c0):ZOO_INFO@zookeeper_close@2304: Closing zookeeper sessionId=0x142e17e9384000b to [127.0.0.1:41287]
2013-12-11 11:50:44,616:3899(0x2ad12c7fa7c0):ZOO_INFO@zookeeper_close@2304: Closing zookeeper sessionId=0x142e17e9384000a to [127.0.0.1:41287]
I1211 11:50:44.616623 3922 slave.cpp:391] Slave terminating
I1211 11:50:44.616648 3922 slave.cpp:1133] Asked to shut down framework 201312111150-1015726915-40217-3899-0000 by @0.0.0.0:0
I1211 11:50:44.616662 3922 slave.cpp:1158] Shutting down framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:44.616679 3922 slave.cpp:2422] Shutting down executor 'default' of framework 201312111150-1015726915-40217-3899-0000
I1211 11:50:44.616724 3920 exec.cpp:375] Executor asked to shutdown
I1211 11:50:44.616750 3920 exec.cpp:390] Executor::shutdown took 12803ns
2013-12-11 11:50:44,617:3899(0x2ad12c7fa7c0):ZOO_INFO@zookeeper_close@2304: Closing zookeeper sessionId=0x142e17e93840008 to [127.0.0.1:41287]
2013-12-11 11:50:44,617:3899(0x2ad12c7fa7c0):ZOO_INFO@zookeeper_close@2304: Closing zookeeper sessionId=0x142e17e93840009 to [127.0.0.1:41287]
[ OK ] AllocatorZooKeeperTest/0.SlaveReregistersFirst (1099 ms)
I1211 11:50:44.671198 3899 zookeeper_test_server.cpp:93] Shutdown ZooKeeperTestServer on port 41287
[----------] 2 tests from AllocatorZooKeeperTest/0 (6404 ms total)
[----------] Global test environment tear-down
[==========] 246 tests from 43 test cases ran. (215688 ms total)
[ PASSED ] 245 tests.
[ FAILED ] 1 test, listed below:
[ FAILED ] ExamplesTest.PythonFramework
1 FAILED TEST
YOU HAVE 2 DISABLED TESTS
make[3]: *** [check-local] Error 1
make[3]: Leaving directory `<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/ws/build/src>'
make[2]: *** [check-am] Error 2
make[2]: Leaving directory `<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/ws/build/src>'
make[1]: *** [check] Error 2
make[1]: Leaving directory `<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/ws/build/src>'
make: *** [check-recursive] Error 1
Build step 'Execute shell' marked build as failure
Build failed in Jenkins:
Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME #1793
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/1793/>
------------------------------------------
Started by an SCM change
Building remotely on ubuntu3 in workspace <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Build-Out-Of-Src-Set-JAVA_HOME/ws/>
Fetching changes from the remote Git repository
Fetching upstream changes from https://git-wip-us.apache.org/repos/asf/mesos.git
Checking out Revision 4f99d86febbfc43d5c233961eb7860d766b53d72 (origin/master)
Cleaning workspace
Resetting working tree
FATAL: Command "clean -fdx" returned status code 1:
stdout: Removing build/
stderr: warning: failed to remove build/
hudson.plugins.git.GitException: Command "clean -fdx" returned status code 1:
stdout: Removing build/
stderr: warning: failed to remove build/
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:981)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:961)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:957)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:877)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:887)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.clean(CliGitAPIImpl.java:347)
at hudson.plugins.git.GitAPI.clean(GitAPI.java:251)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.perform(RemoteInvocationHandler.java:299)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:280)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:239)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:328)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:701)
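The "clean -fdx" failure above ("warning: failed to remove build/") is what Git reports when it cannot delete files left in the workspace by a previous run. One plausible cause (an assumption, not confirmed from this log; the real culprit on ubuntu3 could instead be files owned by a different user) is a subdirectory that lost its write bit, which blocks an unprivileged CI user from unlinking its contents. A minimal local sketch of that failure mode and its fix:

```shell
# Simulate a leftover build/ directory that "git clean -fdx" cannot remove:
# a subdirectory without write permission blocks deletion of its contents
# for a non-root user (Jenkins build slaves typically run unprivileged).
mkdir -p build/locked
touch build/locked/file
chmod a-w build/locked

# This mirrors the failure above: removal of build/ fails for a non-root user.
rm -rf build 2>/dev/null

# Fix: restore write permission first, then remove.
chmod -R u+w build 2>/dev/null
rm -rf build
```

On the build slave the equivalent remedy would be to manually clear (or fix ownership/permissions of) the workspace's build/ directory so the next `git clean -fdx` can succeed.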