Posted to builds@mesos.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2017/10/11 17:41:00 UTC

Build failed in Jenkins: Mesos-Buildbot » cmake,gcc,--verbose --disable-libtool-wrappers,GLOG_v=1 MESOS_VERBOSE=1,ubuntu:14.04,(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23) #4307

See <https://builds.apache.org/job/Mesos-Buildbot/BUILDTOOL=cmake,COMPILER=gcc,CONFIGURATION=--verbose%20--disable-libtool-wrappers,ENVIRONMENT=GLOG_v=1%20MESOS_VERBOSE=1,OS=ubuntu%3A14.04,label_exp=(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23)/4307/display/redirect?page=changes>

Changes:

[bbannier] Removed unneeded configure step in mesos-tidy Docker image.

[bbannier] Installed newer cmake version in mesos-tidy Docker image.

------------------------------------------
[...truncated 18.05 MB...]
3: I1011 13:20:23.205293 17261 master.cpp:631] Authorization enabled
3: I1011 13:20:23.205580 17254 whitelist_watcher.cpp:77] No whitelist given
3: I1011 13:20:23.205618 17252 hierarchical.cpp:171] Initialized hierarchical allocator process
3: I1011 13:20:23.209190 17261 master.cpp:2198] Elected as the leading master!
3: I1011 13:20:23.209219 17261 master.cpp:1687] Recovering from registrar
3: I1011 13:20:23.209375 17247 registrar.cpp:347] Recovering registrar
3: I1011 13:20:23.210121 17247 registrar.cpp:391] Successfully fetched the registry (0B) in 689152ns
3: I1011 13:20:23.210301 17247 registrar.cpp:495] Applied 1 operations in 36929ns; attempting to update the registry
3: I1011 13:20:23.211033 17247 registrar.cpp:552] Successfully updated the registry in 674048ns
3: I1011 13:20:23.211151 17247 registrar.cpp:424] Successfully recovered registrar
3: I1011 13:20:23.211518 17246 master.cpp:1791] Recovered 0 agents from the registry (129B); allowing 10mins for agents to re-register
3: I1011 13:20:23.211549 17254 hierarchical.cpp:209] Skipping recovery of hierarchical allocator: nothing to recover
3: W1011 13:20:23.216495 17241 process.cpp:3194] Attempted to spawn already running process files@172.17.0.3:36056
3: I1011 13:20:23.217432 17241 containerizer.cpp:292] Using isolation { environment_secret, posix/cpu, posix/mem, filesystem/posix, network/cni }
3: W1011 13:20:23.218164 17241 backend.cpp:76] Failed to create 'aufs' backend: AufsBackend requires root privileges
3: W1011 13:20:23.218295 17241 backend.cpp:76] Failed to create 'bind' backend: BindBackend requires root privileges
3: I1011 13:20:23.218333 17241 provisioner.cpp:255] Using default backend 'copy'
3: I1011 13:20:23.220546 17241 cluster.cpp:448] Creating default 'local' authorizer
3: I1011 13:20:23.222520 17254 slave.cpp:254] Mesos agent started on (454)@172.17.0.3:36056
3: I1011 13:20:23.222542 17254 slave.cpp:255] Flags at startup: --acls="" --appc_simple_discovery_uri_prefix="http://" --appc_store_dir="/tmp/OversubscriptionTest_QoSCorrectionKill_gamtGd/store/appc" --authenticate_http_readonly="true" --authenticate_http_readwrite="true" --authenticatee="crammd5" --authentication_backoff_factor="1secs" --authorizer="local" --cgroups_cpu_enable_pids_and_tids_count="false" --cgroups_enable_cfs="false" --cgroups_hierarchy="/sys/fs/cgroup" --cgroups_limit_swap="false" --cgroups_root="mesos" --container_disk_watch_interval="15secs" --containerizers="mesos" --credential="/tmp/OversubscriptionTest_QoSCorrectionKill_gamtGd/credential" --default_role="*" --disallow_sharing_agent_pid_namespace="false" --disk_watch_interval="1mins" --docker="docker" --docker_kill_orphans="true" --docker_registry="https://registry-1.docker.io" --docker_remove_delay="6hrs" --docker_socket="/var/run/docker.sock" --docker_stop_timeout="0ns" --docker_store_dir="/tmp/OversubscriptionTest_QoSCorrectionKill_gamtGd/store/docker" --docker_volume_checkpoint_dir="/var/run/mesos/isolators/docker/volume" --enforce_container_disk_quota="false" --executor_registration_timeout="1mins" --executor_reregistration_timeout="2secs" --executor_shutdown_grace_period="5secs" --fetcher_cache_dir="/tmp/OversubscriptionTest_QoSCorrectionKill_gamtGd/fetch" --fetcher_cache_size="2GB" --frameworks_home="" --gc_delay="1weeks" --gc_disk_headroom="0.1" --hadoop_home="" --help="false" --hostname_lookup="true" --http_command_executor="false" --http_credentials="/tmp/OversubscriptionTest_QoSCorrectionKill_gamtGd/http_credentials" --http_heartbeat_interval="30secs" --initialize_driver_logging="true" --isolation="posix/cpu,posix/mem" --launcher="posix" --launcher_dir="/mesos/build/src" --logbufsecs="0" --logging_level="INFO" --max_completed_executors_per_framework="150" --oversubscribed_resources_interval="15secs" --perf_duration="10secs" --perf_interval="1mins" --port="5051" 
--qos_correction_interval_min="0ns" --quiet="false" --recover="reconnect" --recovery_timeout="15mins" --registration_backoff_factor="10ms" --resources="cpus:2;gpus:0;mem:1024;disk:1024;ports:[31000-32000]" --revocable_cpu_low_priority="true" --runtime_dir="/tmp/OversubscriptionTest_QoSCorrectionKill_gamtGd" --sandbox_directory="/mnt/mesos/sandbox" --strict="true" --switch_user="true" --systemd_enable_support="true" --systemd_runtime_directory="/run/systemd/system" --version="false" --work_dir="/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L" --zk_session_timeout="10secs"
3: I1011 13:20:23.223135 17254 credentials.hpp:86] Loading credential for authentication from '/tmp/OversubscriptionTest_QoSCorrectionKill_gamtGd/credential'
3: I1011 13:20:23.223343 17254 slave.cpp:287] Agent using credential for: test-principal
3: I1011 13:20:23.223371 17254 credentials.hpp:37] Loading credentials for authentication from '/tmp/OversubscriptionTest_QoSCorrectionKill_gamtGd/http_credentials'
3: I1011 13:20:23.223678 17254 http.cpp:1045] Creating default 'basic' HTTP authenticator for realm 'mesos-agent-readonly'
3: I1011 13:20:23.223830 17254 http.cpp:1045] Creating default 'basic' HTTP authenticator for realm 'mesos-agent-readwrite'
3: I1011 13:20:23.225489 17254 slave.cpp:585] Agent resources: [{"name":"cpus","scalar":{"value":2.0},"type":"SCALAR"},{"name":"mem","scalar":{"value":1024.0},"type":"SCALAR"},{"name":"disk","scalar":{"value":1024.0},"type":"SCALAR"},{"name":"ports","ranges":{"range":[{"begin":31000,"end":32000}]},"type":"RANGES"}]
3: I1011 13:20:23.225749 17254 slave.cpp:593] Agent attributes: [  ]
3: I1011 13:20:23.225759 17254 slave.cpp:602] Agent hostname: 7005c4d1ac46
3: I1011 13:20:23.225947 17262 status_update_manager.cpp:177] Pausing sending status updates
3: I1011 13:20:23.226078 17262 process.cpp:3929] Handling HTTP event for process 'metrics' with path: '/metrics/snapshot'
3: I1011 13:20:23.227200 17255 http.cpp:851] Authorizing principal 'ANY' to GET the endpoint '/metrics/snapshot'
3: I1011 13:20:23.227705 17249 state.cpp:64] Recovering state from '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/meta'
3: I1011 13:20:23.228071 17264 status_update_manager.cpp:203] Recovering status update manager
3: I1011 13:20:23.228263 17265 containerizer.cpp:648] Recovering containerizer
3: I1011 13:20:23.234815 17257 provisioner.cpp:416] Provisioner recovery complete
3: I1011 13:20:23.235189 17256 slave.cpp:6310] Finished recovery
3: I1011 13:20:23.235790 17256 slave.cpp:6492] Querying resource estimator for oversubscribable resources
3: I1011 13:20:23.236081 17258 status_update_manager.cpp:177] Pausing sending status updates
3: I1011 13:20:23.236177 17256 slave.cpp:993] New master detected at master@172.17.0.3:36056
3: I1011 13:20:23.236331 17256 slave.cpp:1028] Detecting new master
3: I1011 13:20:23.236466 17256 slave.cpp:6506] Received oversubscribable resources {} from the resource estimator
3: I1011 13:20:23.246649 17262 slave.cpp:1055] Authenticating with master master@172.17.0.3:36056
3: I1011 13:20:23.246793 17262 slave.cpp:1064] Using default CRAM-MD5 authenticatee
3: I1011 13:20:23.247236 17265 authenticatee.cpp:121] Creating new client SASL connection
3: I1011 13:20:23.247669 17248 master.cpp:7936] Authenticating slave(454)@172.17.0.3:36056
3: I1011 13:20:23.247790 17257 authenticator.cpp:414] Starting authentication session for crammd5-authenticatee(939)@172.17.0.3:36056
3: I1011 13:20:23.248059 17248 authenticator.cpp:98] Creating new server SASL connection
3: I1011 13:20:23.248287 17258 authenticatee.cpp:213] Received SASL authentication mechanisms: CRAM-MD5
3: I1011 13:20:23.248313 17258 authenticatee.cpp:239] Attempting to authenticate with mechanism 'CRAM-MD5'
3: I1011 13:20:23.248459 17258 authenticator.cpp:204] Received SASL authentication start
3: I1011 13:20:23.248522 17258 authenticator.cpp:326] Authentication requires more steps
3: I1011 13:20:23.248647 17256 authenticatee.cpp:259] Received SASL authentication step
3: W1011 13:20:23.248677 17241 process.cpp:3194] Attempted to spawn already running process version@172.17.0.3:36056
3: I1011 13:20:23.248780 17242 authenticator.cpp:232] Received SASL authentication step
3: I1011 13:20:23.248814 17242 auxprop.cpp:109] Request to lookup properties for user: 'test-principal' realm: '7005c4d1ac46' server FQDN: '7005c4d1ac46' SASL_AUXPROP_VERIFY_AGAINST_HASH: false SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: false 
3: I1011 13:20:23.248831 17242 auxprop.cpp:181] Looking up auxiliary property '*userPassword'
3: I1011 13:20:23.248889 17242 auxprop.cpp:181] Looking up auxiliary property '*cmusaslsecretCRAM-MD5'
3: I1011 13:20:23.248916 17242 auxprop.cpp:109] Request to lookup properties for user: 'test-principal' realm: '7005c4d1ac46' server FQDN: '7005c4d1ac46' SASL_AUXPROP_VERIFY_AGAINST_HASH: false SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: true 
3: I1011 13:20:23.248929 17242 auxprop.cpp:131] Skipping auxiliary property '*userPassword' since SASL_AUXPROP_AUTHZID == true
3: I1011 13:20:23.248936 17242 auxprop.cpp:131] Skipping auxiliary property '*cmusaslsecretCRAM-MD5' since SASL_AUXPROP_AUTHZID == true
3: I1011 13:20:23.248955 17242 authenticator.cpp:318] Authentication success
3: I1011 13:20:23.249214 17242 master.cpp:7966] Successfully authenticated principal 'test-principal' at slave(454)@172.17.0.3:36056
3: I1011 13:20:23.249308 17251 authenticator.cpp:432] Authentication session cleanup for crammd5-authenticatee(939)@172.17.0.3:36056
3: I1011 13:20:23.249346 17256 authenticatee.cpp:299] Authentication success
3: I1011 13:20:23.249619 17264 slave.cpp:1147] Successfully authenticated with master master@172.17.0.3:36056
3: I1011 13:20:23.249948 17264 slave.cpp:1626] Will retry registration in 19.764362ms if necessary
3: I1011 13:20:23.250149 17241 sched.cpp:232] Version: 1.5.0
3: I1011 13:20:23.250288 17261 master.cpp:5801] Received register agent message from slave(454)@172.17.0.3:36056 (7005c4d1ac46)
3: I1011 13:20:23.250447 17261 master.cpp:3838] Authorizing agent with principal 'test-principal'
3: I1011 13:20:23.250993 17260 sched.cpp:336] New master detected at master@172.17.0.3:36056
3: I1011 13:20:23.251093 17260 sched.cpp:396] Authenticating with master master@172.17.0.3:36056
3: I1011 13:20:23.251114 17260 sched.cpp:403] Using default CRAM-MD5 authenticatee
3: I1011 13:20:23.251188 17245 master.cpp:5861] Authorized registration of agent at slave(454)@172.17.0.3:36056 (7005c4d1ac46)
3: I1011 13:20:23.251343 17265 authenticatee.cpp:121] Creating new client SASL connection
3: I1011 13:20:23.251344 17245 master.cpp:5954] Registering agent at slave(454)@172.17.0.3:36056 (7005c4d1ac46) with id 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0
3: I1011 13:20:23.251943 17257 registrar.cpp:495] Applied 1 operations in 108660ns; attempting to update the registry
3: I1011 13:20:23.252770 17257 registrar.cpp:552] Successfully updated the registry in 742912ns
3: I1011 13:20:23.253270 17245 master.cpp:7936] Authenticating scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056
3: I1011 13:20:23.253397 17242 authenticator.cpp:414] Starting authentication session for crammd5-authenticatee(940)@172.17.0.3:36056
3: I1011 13:20:23.253500 17245 master.cpp:6001] Admitted agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46)
3: I1011 13:20:23.253774 17242 authenticator.cpp:98] Creating new server SASL connection
3: I1011 13:20:23.254020 17242 authenticatee.cpp:213] Received SASL authentication mechanisms: CRAM-MD5
3: I1011 13:20:23.254050 17242 authenticatee.cpp:239] Attempting to authenticate with mechanism 'CRAM-MD5'
3: I1011 13:20:23.254169 17242 slave.cpp:4966] Received ping from slave-observer(447)@172.17.0.3:36056
3: I1011 13:20:23.254515 17251 slave.cpp:1193] Registered with master master@172.17.0.3:36056; given agent ID 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0
3: I1011 13:20:23.254312 17245 master.cpp:6032] Registered agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46) with [{"name":"cpus","scalar":{"value":2.0},"type":"SCALAR"},{"name":"mem","scalar":{"value":1024.0},"type":"SCALAR"},{"name":"disk","scalar":{"value":1024.0},"type":"SCALAR"},{"name":"ports","ranges":{"range":[{"begin":31000,"end":32000}]},"type":"RANGES"}]
3: I1011 13:20:23.254751 17245 status_update_manager.cpp:184] Resuming sending status updates
3: I1011 13:20:23.254889 17242 hierarchical.cpp:593] Added agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 (7005c4d1ac46) with cpus:2; mem:1024; disk:1024; ports:[31000-32000] (allocated: {})
3: I1011 13:20:23.254972 17251 slave.cpp:1213] Checkpointing SlaveInfo to '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/meta/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/slave.info'
3: I1011 13:20:23.255275 17242 hierarchical.cpp:1943] No allocations performed
3: I1011 13:20:23.255343 17242 hierarchical.cpp:1486] Performed allocation for 1 agents in 234910ns
3: I1011 13:20:23.255452 17252 authenticator.cpp:204] Received SASL authentication start
3: I1011 13:20:23.255456 17251 slave.cpp:1262] Forwarding total oversubscribed resources {}
3: I1011 13:20:23.255514 17252 authenticator.cpp:326] Authentication requires more steps
3: I1011 13:20:23.255627 17252 authenticatee.cpp:259] Received SASL authentication step
3: I1011 13:20:23.255714 17252 authenticator.cpp:232] Received SASL authentication step
3: I1011 13:20:23.255720 17253 master.cpp:6817] Received update of agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46) with total oversubscribed resources {}
3: I1011 13:20:23.255941 17253 master.cpp:6828] Ignoring update on agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46) as it reports no changes
3: I1011 13:20:23.255738 17252 auxprop.cpp:109] Request to lookup properties for user: 'test-principal' realm: '7005c4d1ac46' server FQDN: '7005c4d1ac46' SASL_AUXPROP_VERIFY_AGAINST_HASH: false SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: false 
3: I1011 13:20:23.255995 17252 auxprop.cpp:181] Looking up auxiliary property '*userPassword'
3: I1011 13:20:23.256026 17252 auxprop.cpp:181] Looking up auxiliary property '*cmusaslsecretCRAM-MD5'
3: I1011 13:20:23.256052 17252 auxprop.cpp:109] Request to lookup properties for user: 'test-principal' realm: '7005c4d1ac46' server FQDN: '7005c4d1ac46' SASL_AUXPROP_VERIFY_AGAINST_HASH: false SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: true 
3: I1011 13:20:23.256062 17252 auxprop.cpp:131] Skipping auxiliary property '*userPassword' since SASL_AUXPROP_AUTHZID == true
3: I1011 13:20:23.256069 17252 auxprop.cpp:131] Skipping auxiliary property '*cmusaslsecretCRAM-MD5' since SASL_AUXPROP_AUTHZID == true
3: I1011 13:20:23.256084 17252 authenticator.cpp:318] Authentication success
3: I1011 13:20:23.256167 17261 authenticatee.cpp:299] Authentication success
3: I1011 13:20:23.256217 17260 master.cpp:7966] Successfully authenticated principal 'test-principal' at scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056
3: I1011 13:20:23.256249 17246 authenticator.cpp:432] Authentication session cleanup for crammd5-authenticatee(940)@172.17.0.3:36056
3: I1011 13:20:23.256409 17252 sched.cpp:502] Successfully authenticated with master master@172.17.0.3:36056
3: I1011 13:20:23.256427 17252 sched.cpp:820] Sending SUBSCRIBE call to master@172.17.0.3:36056
3: I1011 13:20:23.256554 17252 sched.cpp:853] Will retry registration in 5.418964ms if necessary
3: I1011 13:20:23.256703 17257 master.cpp:2929] Received SUBSCRIBE call for framework 'default' at scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056
3: I1011 13:20:23.256803 17257 master.cpp:2263] Authorizing framework principal 'test-principal' to receive offers for roles '{ * }'
3: I1011 13:20:23.257338 17263 master.cpp:3009] Subscribing framework default with checkpointing disabled and capabilities [ RESERVATION_REFINEMENT ]
3: I1011 13:20:23.257947 17263 sched.cpp:747] Framework registered with 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.258024 17263 sched.cpp:761] Scheduler::registered took 57711ns
3: I1011 13:20:23.258280 17256 hierarchical.cpp:303] Added framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.259413 17256 hierarchical.cpp:2033] No inverse offers to send out!
3: I1011 13:20:23.259459 17256 hierarchical.cpp:1486] Performed allocation for 1 agents in 1.04462ms
3: I1011 13:20:23.259892 17261 master.cpp:7766] Sending 1 offers to framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 (default) at scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056
3: I1011 13:20:23.260401 17261 sched.cpp:917] Scheduler::resourceOffers took 105225ns
3: I1011 13:20:23.262184 17257 master.cpp:9389] Removing offer 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-O0
3: I1011 13:20:23.262316 17257 master.cpp:4196] Processing ACCEPT call for offers: [ 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-O0 ] on agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46) for framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 (default) at scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056
3: I1011 13:20:23.262393 17257 master.cpp:3565] Authorizing framework principal 'test-principal' to launch task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70
3: I1011 13:20:23.264472 17245 master.cpp:10135] Adding task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 with resources [{"allocation_info":{"role":"*"},"name":"cpus","scalar":{"value":2.0},"type":"SCALAR"},{"allocation_info":{"role":"*"},"name":"mem","scalar":{"value":1024.0},"type":"SCALAR"},{"allocation_info":{"role":"*"},"name":"disk","scalar":{"value":1024.0},"type":"SCALAR"},{"allocation_info":{"role":"*"},"name":"ports","ranges":{"range":[{"begin":31000,"end":32000}]},"type":"RANGES"}] on agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46)
3: I1011 13:20:23.265147 17245 master.cpp:4879] Launching task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 (default) at scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056 with resources [{"allocation_info":{"role":"*"},"name":"cpus","scalar":{"value":2.0},"type":"SCALAR"},{"allocation_info":{"role":"*"},"name":"mem","scalar":{"value":1024.0},"type":"SCALAR"},{"allocation_info":{"role":"*"},"name":"disk","scalar":{"value":1024.0},"type":"SCALAR"},{"allocation_info":{"role":"*"},"name":"ports","ranges":{"range":[{"begin":31000,"end":32000}]},"type":"RANGES"}] on agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46)
3: I1011 13:20:23.266116 17249 slave.cpp:1747] Got assigned task 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' for framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.266846 17249 slave.cpp:2015] Authorizing task 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' for framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.266894 17249 slave.cpp:6809] Authorizing framework principal 'test-principal' to launch task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70
3: I1011 13:20:23.267125 17264 hierarchical.cpp:887] Updated allocation of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 on agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 from cpus(allocated: *):2; mem(allocated: *):1024; disk(allocated: *):1024; ports(allocated: *):[31000-32000] to cpus(allocated: *):2; mem(allocated: *):1024; disk(allocated: *):1024; ports(allocated: *):[31000-32000]
3: I1011 13:20:23.267662 17243 slave.cpp:2183] Launching task 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' for framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.268232 17243 paths.cpp:605] Trying to chown '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70/runs/ba95c960-053b-494b-8d65-d99ffd849b4a' to user 'mesos'
3: I1011 13:20:23.268468 17243 slave.cpp:7271] Launching executor 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 with resources [{"allocation_info":{"role":"*"},"name":"cpus","scalar":{"value":0.1},"type":"SCALAR"},{"allocation_info":{"role":"*"},"name":"mem","scalar":{"value":32.0},"type":"SCALAR"}] in work directory '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70/runs/ba95c960-053b-494b-8d65-d99ffd849b4a'
3: I1011 13:20:23.269258 17243 slave.cpp:2874] Launching container ba95c960-053b-494b-8d65-d99ffd849b4a for executor 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.269647 17243 slave.cpp:2411] Queued task 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' for executor 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.269656 17265 containerizer.cpp:1122] Starting container ba95c960-053b-494b-8d65-d99ffd849b4a
3: I1011 13:20:23.269729 17243 slave.cpp:944] Successfully attached '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70/runs/ba95c960-053b-494b-8d65-d99ffd849b4a' to virtual path '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70/runs/latest'
3: I1011 13:20:23.269768 17243 slave.cpp:944] Successfully attached '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70/runs/ba95c960-053b-494b-8d65-d99ffd849b4a' to virtual path '/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70/runs/latest'
3: I1011 13:20:23.269798 17243 slave.cpp:944] Successfully attached '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70/runs/ba95c960-053b-494b-8d65-d99ffd849b4a' to virtual path '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70/runs/ba95c960-053b-494b-8d65-d99ffd849b4a'
3: I1011 13:20:23.270222 17265 containerizer.cpp:2751] Transitioning the state of container ba95c960-053b-494b-8d65-d99ffd849b4a from PROVISIONING to PREPARING
3: I1011 13:20:23.274739 17258 containerizer.cpp:1720] Launching 'mesos-containerizer' with flags '--help="false" --launch_info="{"command":{"arguments":["mesos-executor","--launcher_dir=\/mesos\/build\/src"],"shell":false,"value":"\/mesos\/build\/src\/mesos-executor"},"environment":{"variables":[{"name":"LIBPROCESS_PORT","type":"VALUE","value":"0"},{"name":"MESOS_AGENT_ENDPOINT","type":"VALUE","value":"172.17.0.3:36056"},{"name":"MESOS_CHECKPOINT","type":"VALUE","value":"0"},{"name":"MESOS_DIRECTORY","type":"VALUE","value":"\/tmp\/OversubscriptionTest_QoSCorrectionKill_1noM0L\/slaves\/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0\/frameworks\/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000\/executors\/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70\/runs\/ba95c960-053b-494b-8d65-d99ffd849b4a"},{"name":"MESOS_EXECUTOR_ID","type":"VALUE","value":"ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70"},{"name":"MESOS_EXECUTOR_SHUTDOWN_GRACE_PERIOD","type":"VALUE","value":"5secs"},{"name":"MESOS_FRAMEWORK_ID","type":"VALUE","value":"051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000"},{"name":"MESOS_HTTP_COMMAND_EXECUTOR","type":"VALUE","value":"0"},{"name":"MESOS_SLAVE_ID","type":"VALUE","value":"051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0"},{"name":"MESOS_SLAVE_PID","type":"VALUE","value":"slave(454)@172.17.0.3:36056"},{"name":"MESOS_SANDBOX","type":"VALUE","value":"\/tmp\/OversubscriptionTest_QoSCorrectionKill_1noM0L\/slaves\/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0\/frameworks\/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000\/executors\/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70\/runs\/ba95c960-053b-494b-8d65-d99ffd849b4a"}]},"task_environment":{},"user":"mesos","working_directory":"\/tmp\/OversubscriptionTest_QoSCorrectionKill_1noM0L\/slaves\/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0\/frameworks\/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000\/executors\/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70\/runs\/ba95c960-053b-494b-8d65-d99ffd849b4a"}" --pipe_read="7" --pipe_write="8" 
--runtime_directory="/tmp/OversubscriptionTest_QoSCorrectionKill_gamtGd/containers/ba95c960-053b-494b-8d65-d99ffd849b4a" --unshare_namespace_mnt="false"'
3: I1011 13:20:23.277493 17258 launcher.cpp:140] Forked child with pid '22536' for container 'ba95c960-053b-494b-8d65-d99ffd849b4a'
3: I1011 13:20:23.278116 17258 containerizer.cpp:2751] Transitioning the state of container ba95c960-053b-494b-8d65-d99ffd849b4a from PREPARING to ISOLATING
3: I1011 13:20:23.279561 17262 containerizer.cpp:2751] Transitioning the state of container ba95c960-053b-494b-8d65-d99ffd849b4a from ISOLATING to FETCHING
3: I1011 13:20:23.279795 17259 fetcher.cpp:377] Starting to fetch URIs for container: ba95c960-053b-494b-8d65-d99ffd849b4a, directory: /tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70/runs/ba95c960-053b-494b-8d65-d99ffd849b4a
3: I1011 13:20:23.280638 17247 containerizer.cpp:2751] Transitioning the state of container ba95c960-053b-494b-8d65-d99ffd849b4a from FETCHING to RUNNING
3: I1011 13:20:23.511025 22537 exec.cpp:162] Version: 1.5.0
3: I1011 13:20:23.522120 17263 slave.cpp:3941] Got registration for executor 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 from executor(1)@172.17.0.3:40366
3: I1011 13:20:23.525552 17243 slave.cpp:2613] Sending queued task 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' to executor 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 at executor(1)@172.17.0.3:40366
3: I1011 13:20:23.527151 22557 exec.cpp:237] Executor registered on agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0
3: I1011 13:20:23.531131 22540 executor.cpp:171] Received SUBSCRIBED event
3: I1011 13:20:23.532220 22540 executor.cpp:175] Subscribed executor on 7005c4d1ac46
3: I1011 13:20:23.532438 22540 executor.cpp:171] Received LAUNCH event
3: I1011 13:20:23.532606 22540 executor.cpp:633] Starting task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70
3: I1011 13:20:23.555716 22540 executor.cpp:477] Running '/mesos/build/src/mesos-containerizer launch <POSSIBLY-SENSITIVE-DATA>'
3: I1011 13:20:23.559605 22540 executor.cpp:646] Forked command at 22563
3: I1011 13:20:23.566941 17251 slave.cpp:4395] Handling status update TASK_RUNNING (UUID: afc55acf-8ff2-46d7-8f43-10c9983a30ec) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 from executor(1)@172.17.0.3:40366
3: I1011 13:20:23.569560 17245 status_update_manager.cpp:323] Received status update TASK_RUNNING (UUID: afc55acf-8ff2-46d7-8f43-10c9983a30ec) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.569622 17245 status_update_manager.cpp:500] Creating StatusUpdate stream for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.570644 17245 status_update_manager.cpp:377] Forwarding update TASK_RUNNING (UUID: afc55acf-8ff2-46d7-8f43-10c9983a30ec) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 to the agent
3: I1011 13:20:23.571130 17249 slave.cpp:4876] Forwarding the update TASK_RUNNING (UUID: afc55acf-8ff2-46d7-8f43-10c9983a30ec) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 to master@172.17.0.3:36056
3: I1011 13:20:23.571508 17249 slave.cpp:4770] Status update manager successfully handled status update TASK_RUNNING (UUID: afc55acf-8ff2-46d7-8f43-10c9983a30ec) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.571576 17249 slave.cpp:4786] Sending acknowledgement for status update TASK_RUNNING (UUID: afc55acf-8ff2-46d7-8f43-10c9983a30ec) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 to executor(1)@172.17.0.3:40366
3: I1011 13:20:23.571811 17242 master.cpp:6993] Status update TASK_RUNNING (UUID: afc55acf-8ff2-46d7-8f43-10c9983a30ec) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 from agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46)
3: I1011 13:20:23.571912 17242 master.cpp:7055] Forwarding status update TASK_RUNNING (UUID: afc55acf-8ff2-46d7-8f43-10c9983a30ec) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.572166 17242 master.cpp:9157] Updating the state of task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 (latest state: TASK_RUNNING, status update state: TASK_RUNNING)
3: I1011 13:20:23.572654 17261 sched.cpp:1025] Scheduler::statusUpdate took 198671ns
3: I1011 13:20:23.573443 17256 master.cpp:5566] Processing ACKNOWLEDGE call afc55acf-8ff2-46d7-8f43-10c9983a30ec for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 (default) at scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056 on agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0
3: I1011 13:20:23.574753 17255 status_update_manager.cpp:395] Received status update acknowledgement (UUID: afc55acf-8ff2-46d7-8f43-10c9983a30ec) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.575042 17255 slave.cpp:3679] Status update manager successfully handled status update acknowledgement (UUID: afc55acf-8ff2-46d7-8f43-10c9983a30ec) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.579807 17251 slave.cpp:6593] Received 1 QoS corrections
3: I1011 13:20:23.579900 17251 slave.cpp:6657] Killing container 'ba95c960-053b-494b-8d65-d99ffd849b4a' for executor 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 at executor(1)@172.17.0.3:40366 as QoS correction
3: I1011 13:20:23.580194 17260 containerizer.cpp:2205] Destroying container ba95c960-053b-494b-8d65-d99ffd849b4a in RUNNING state
3: I1011 13:20:23.580240 17260 containerizer.cpp:2751] Transitioning the state of container ba95c960-053b-494b-8d65-d99ffd849b4a from RUNNING to DESTROYING
3: I1011 13:20:23.580785 17260 launcher.cpp:156] Asked to destroy container ba95c960-053b-494b-8d65-d99ffd849b4a
3: I1011 13:20:23.596204 17263 slave.cpp:5008] Got exited event for executor(1)@172.17.0.3:40366
3: I1011 13:20:23.615449 17245 containerizer.cpp:2651] Container ba95c960-053b-494b-8d65-d99ffd849b4a has exited
3: I1011 13:20:23.617903 17259 provisioner.cpp:490] Ignoring destroy request for unknown container ba95c960-053b-494b-8d65-d99ffd849b4a
3: I1011 13:20:23.618778 17249 slave.cpp:5408] Executor 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 terminated with signal Killed
3: I1011 13:20:23.618945 17249 slave.cpp:4395] Handling status update TASK_LOST (UUID: 3ed23899-3601-47e0-9f53-b2c982db5d90) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 from @0.0.0.0:0
3: W1011 13:20:23.620029 17250 containerizer.cpp:2015] Ignoring update for unknown container ba95c960-053b-494b-8d65-d99ffd849b4a
3: I1011 13:20:23.620548 17247 status_update_manager.cpp:323] Received status update TASK_LOST (UUID: 3ed23899-3601-47e0-9f53-b2c982db5d90) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.620682 17247 status_update_manager.cpp:377] Forwarding update TASK_LOST (UUID: 3ed23899-3601-47e0-9f53-b2c982db5d90) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 to the agent
3: I1011 13:20:23.620914 17261 slave.cpp:4876] Forwarding the update TASK_LOST (UUID: 3ed23899-3601-47e0-9f53-b2c982db5d90) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 to master@172.17.0.3:36056
3: I1011 13:20:23.621244 17261 slave.cpp:4770] Status update manager successfully handled status update TASK_LOST (UUID: 3ed23899-3601-47e0-9f53-b2c982db5d90) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.621517 17254 master.cpp:6993] Status update TASK_LOST (UUID: 3ed23899-3601-47e0-9f53-b2c982db5d90) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 from agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46)
3: I1011 13:20:23.621606 17254 master.cpp:7055] Forwarding status update TASK_LOST (UUID: 3ed23899-3601-47e0-9f53-b2c982db5d90) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.621804 17254 master.cpp:9157] Updating the state of task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 (latest state: TASK_LOST, status update state: TASK_LOST)
3: I1011 13:20:23.622045 17242 sched.cpp:1025] Scheduler::statusUpdate took 75745ns
3: I1011 13:20:23.622865 17254 master.cpp:5566] Processing ACKNOWLEDGE call 3ed23899-3601-47e0-9f53-b2c982db5d90 for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 (default) at scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056 on agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0
3: I1011 13:20:23.623282 17258 hierarchical.cpp:1161] Recovered cpus(allocated: *):2; mem(allocated: *):1024; disk(allocated: *):1024; ports(allocated: *):[31000-32000] (total: cpus:2; mem:1024; disk:1024; ports:[31000-32000], allocated: {}) on agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 from framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.622941 17254 master.cpp:9251] Removing task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 with resources [{"allocation_info":{"role":"*"},"name":"cpus","scalar":{"value":2.0},"type":"SCALAR"},{"allocation_info":{"role":"*"},"name":"mem","scalar":{"value":1024.0},"type":"SCALAR"},{"allocation_info":{"role":"*"},"name":"disk","scalar":{"value":1024.0},"type":"SCALAR"},{"allocation_info":{"role":"*"},"name":"ports","ranges":{"range":[{"begin":31000,"end":32000}]},"type":"RANGES"}] of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 on agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46)
3: I1011 13:20:23.623934 17243 status_update_manager.cpp:395] Received status update acknowledgement (UUID: 3ed23899-3601-47e0-9f53-b2c982db5d90) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.624042 17243 status_update_manager.cpp:531] Cleaning up status update stream for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.624512 17243 slave.cpp:3679] Status update manager successfully handled status update acknowledgement (UUID: 3ed23899-3601-47e0-9f53-b2c982db5d90) for task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70 of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.624567 17243 slave.cpp:7876] Completing task ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70
3: I1011 13:20:23.624604 17243 slave.cpp:5512] Cleaning up executor 'ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' of framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 at executor(1)@172.17.0.3:40366
3: I1011 13:20:23.624999 17264 gc.cpp:90] Scheduling '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70/runs/ba95c960-053b-494b-8d65-d99ffd849b4a' for gc 6.99999276841482days in the future
3: I1011 13:20:23.625174 17243 slave.cpp:5619] Cleaning up framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.625216 17264 gc.cpp:90] Scheduling '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000/executors/ed94e693-cd81-4ec1-bc7f-1dfebe4e6d70' for gc 6.99999276557926days in the future
3: I1011 13:20:23.625275 17259 status_update_manager.cpp:285] Closing status update streams for framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.625371 17245 process.cpp:3929] Handling HTTP event for process 'metrics' with path: '/metrics/snapshot'
3: I1011 13:20:23.625388 17264 gc.cpp:90] Scheduling '/tmp/OversubscriptionTest_QoSCorrectionKill_1noM0L/slaves/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0/frameworks/051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000' for gc 6.99999276281185days in the future
3: I1011 13:20:23.626643 17247 http.cpp:851] Authorizing principal 'ANY' to GET the endpoint '/metrics/snapshot'
3: I1011 13:20:23.647490 17241 sched.cpp:2005] Asked to stop the driver
3: I1011 13:20:23.647624 17259 sched.cpp:1187] Stopping framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.647954 17264 master.cpp:8447] Processing TEARDOWN call for framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 (default) at scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056
3: I1011 13:20:23.647989 17264 master.cpp:8459] Removing framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 (default) at scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056
3: I1011 13:20:23.648013 17264 master.cpp:3299] Deactivating framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 (default) at scheduler-d5a06dea-d474-4afd-83b2-c2a5bbf14865@172.17.0.3:36056
3: I1011 13:20:23.648149 17250 hierarchical.cpp:412] Deactivated framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.648280 17252 slave.cpp:3211] Asked to shut down framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000 by master@172.17.0.3:36056
3: I1011 13:20:23.648339 17252 slave.cpp:3226] Cannot shut down unknown framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.648720 17252 hierarchical.cpp:355] Removed framework 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-0000
3: I1011 13:20:23.650002 17241 slave.cpp:869] Agent terminating
3: I1011 13:20:23.650295 17247 master.cpp:1303] Agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46) disconnected
3: I1011 13:20:23.650336 17247 master.cpp:3336] Disconnecting agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46)
3: I1011 13:20:23.650411 17247 master.cpp:3355] Deactivating agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 at slave(454)@172.17.0.3:36056 (7005c4d1ac46)
3: I1011 13:20:23.650544 17242 hierarchical.cpp:690] Agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0 deactivated
3: I1011 13:20:23.659127 17241 master.cpp:1145] Master terminating
3: I1011 13:20:23.660092 17257 hierarchical.cpp:626] Removed agent 051d8ebe-fee1-4ff4-8d06-03c31a92cbe1-S0
3: [       OK ] OversubscriptionTest.QoSCorrectionKill (466 ms)
3: [ RUN      ] OversubscriptionTest.QoSCorrectionKillPartitionAware
3: I1011 13:20:23.670779 17241 cluster.cpp:162] Creating default 'local' authorizer
3: I1011 13:20:23.675649 17256 master.cpp:445] Master e3958b28-1718-49d6-9777-72ef49c1f038 (7005c4d1ac46) started on 172.17.0.3:36056
3: I1011 13:20:23.675688 17256 master.cpp:447] Flags at startup: --acls="" --agent_ping_timeout="15secs" --agent_reregister_timeout="10mins" --allocation_interval="1secs" --allocator="HierarchicalDRF" --authenticate_agents="true" --authenticate_frameworks="true" --authenticate_http_frameworks="true" --authenticate_http_readonly="true" --authenticate_http_readwrite="true" --authenticators="crammd5" --authorizers="local" --credentials="/tmp/fJbWzt/credentials" --filter_gpu_resources="true" --framework_sorter="drf" --help="false" --hostname_lookup="true" --http_authenticators="basic" --http_framework_authenticators="basic" --initialize_driver_logging="true" --log_auto_initialize="true" --logbufsecs="0" --logging_level="INFO" --max_agent_ping_timeouts="5" --max_completed_frameworks="50" --max_completed_tasks_per_framework="1000" --max_unreachable_tasks_per_framework="1000" --port="5050" --quiet="false" --recovery_agent_removal_limit="100%" --registry="in_memory" --registry_fetch_timeout="1mins" --registry_gc_interval="15mins" --registry_max_agent_age="2weeks" --registry_max_agent_count="102400" --registry_store_timeout="100secs" --registry_strict="false" --root_submissions="true" --user_sorter="drf" --version="false" --webui_dir="/usr/local/share/mesos/webui" --work_dir="/tmp/fJbWzt/master" --zk_session_timeout="10secs"
3: I1011 13:20:23.676173 17256 master.cpp:496] Master only allowing authenticated frameworks to register
3: I1011 13:20:23.676187 17256 master.cpp:502] Master only allowing authenticated agents to register
3: I1011 13:20:23.676193 17256 master.cpp:508] Master only allowing authenticated HTTP frameworks to register
3: I1011 13:20:23.676201 17256 credentials.hpp:37] Loading credentials for authentication from '/tmp/fJbWzt/credentials'
3: I1011 13:20:23.676698 17256 master.cpp:552] Using default 'crammd5' authenticator
3: I1011 13:20:23.676986 17256 http.cpp:1045] Creating default 'basic' HTTP authenticator for realm 'mesos-master-readonly'
3: I1011 13:20:23.677249 17256 http.cpp:1045] Creating default 'basic' HTTP authenticator for realm 'mesos-master-readwrite'
3: I1011 13:20:23.677386 17256 http.cpp:1045] Creating default 'basic' HTTP authenticator for realm 'mesos-master-scheduler'
3: I1011 13:20:23.677511 17256 master.cpp:631] Authorization enabled
3: I1011 13:20:23.677727 17244 hierarchical.cpp:171] Initialized hierarchical allocator process
3: I1011 13:20:23.677732 17263 whitelist_watcher.cpp:77] No whitelist given
3: I1011 13:20:23.682111 17245 master.cpp:2198] Elected as the leading master!
3: I1011 13:20:23.682166 17245 master.cpp:1687] Recovering from registrar
3: I1011 13:20:23.682371 17259 registrar.cpp:347] Recovering registrar
3: I1011 13:20:23.683120 17259 registrar.cpp:391] Successfully fetched the registry (0B) in 694784ns
3: I1011 13:20:23.683251 17259 registrar.cpp:495] Applied 1 operations in 31870ns; attempting to update the registry
3: I1011 13:20:23.683929 17259 registrar.cpp:552] Successfully updated the registry in 607232ns
3: I1011 13:20:23.684070 17259 registrar.cpp:424] Successfully recovered registrar
3: I1011 13:20:23.684545 17253 master.cpp:1791] Recovered 0 agents from the registry (129B); allowing 10mins for agents to re-register
3: I1011 13:20:23.684620 17261 hierarchical.cpp:209] Skipping recovery of hierarchical allocator: nothing to recover
Build timed out (after 300 minutes). Marking the build as failed.
Build was aborted
++ docker rmi mesos-1507725808-22146
Error response from daemon: conflict: unable to remove repository reference "mesos-1507725808-22146" (must force) - container 7005c4d1ac46 is using its referenced image 65837a73d7c3
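The `docker rmi` conflict above is the usual symptom of trying to remove an image while a container still references it; here container `7005c4d1ac46` was left behind when the 300-minute timeout aborted the build mid-test-run. A typical cleanup sketch (commands assumed to run on the build agent; not part of the original job script):

```shell
# Remove the stale container first (-f kills it if still running),
# then the image removal no longer conflicts.
docker rm -f 7005c4d1ac46
docker rmi mesos-1507725808-22146

# Alternatively, force-remove the image reference directly, as the
# daemon's "(must force)" hint suggests:
# docker rmi -f mesos-1507725808-22146
```

Without such cleanup, aborted builds accumulate stale containers and images on the agent.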


Jenkins build is back to normal : Mesos-Buildbot » cmake,gcc,--verbose --disable-libtool-wrappers,GLOG_v=1 MESOS_VERBOSE=1,ubuntu:14.04,(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23) #4309

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Buildbot/BUILDTOOL=cmake,COMPILER=gcc,CONFIGURATION=--verbose%20--disable-libtool-wrappers,ENVIRONMENT=GLOG_v=1%20MESOS_VERBOSE=1,OS=ubuntu%3A14.04,label_exp=(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23)/4309/display/redirect?page=changes>


Build failed in Jenkins: Mesos-Buildbot » cmake,gcc,--verbose --disable-libtool-wrappers,GLOG_v=1 MESOS_VERBOSE=1,ubuntu:14.04,(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23) #4308

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Buildbot/BUILDTOOL=cmake,COMPILER=gcc,CONFIGURATION=--verbose%20--disable-libtool-wrappers,ENVIRONMENT=GLOG_v=1%20MESOS_VERBOSE=1,OS=ubuntu%3A14.04,label_exp=(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23)/4308/display/redirect>

------------------------------------------
Started by upstream project "Mesos-Buildbot" build number 4308
originally caused by:
 Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on ubuntu-6 (ubuntu trusty) in workspace <https://builds.apache.org/job/Mesos-Buildbot/BUILDTOOL=cmake,COMPILER=gcc,CONFIGURATION=--verbose%20--disable-libtool-wrappers,ENVIRONMENT=GLOG_v=1%20MESOS_VERBOSE=1,OS=ubuntu%3A14.04,label_exp=(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23)/ws/>
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://git-wip-us.apache.org/repos/asf/mesos.git
 > git init <https://builds.apache.org/job/Mesos-Buildbot/BUILDTOOL=cmake,COMPILER=gcc,CONFIGURATION=--verbose%20--disable-libtool-wrappers,ENVIRONMENT=GLOG_v=1%20MESOS_VERBOSE=1,OS=ubuntu%3A14.04,label_exp=(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23)/ws/> # timeout=10
Fetching upstream changes from https://git-wip-us.apache.org/repos/asf/mesos.git
 > git --version # timeout=10
 > git fetch --tags --progress https://git-wip-us.apache.org/repos/asf/mesos.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://git-wip-us.apache.org/repos/asf/mesos.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://git-wip-us.apache.org/repos/asf/mesos.git # timeout=10
Fetching upstream changes from https://git-wip-us.apache.org/repos/asf/mesos.git
 > git fetch --tags --progress https://git-wip-us.apache.org/repos/asf/mesos.git +refs/heads/*:refs/remotes/origin/*
Checking out Revision 0908303142f641c1697547eb7f8e82a205d6c362 (origin/master)
Commit message: "Installed newer cmake version in mesos-tidy Docker image."
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0908303142f641c1697547eb7f8e82a205d6c362
 > git rev-list 0908303142f641c1697547eb7f8e82a205d6c362 # timeout=10
[243aee26] $ /bin/bash -xe /tmp/jenkins8357515071707345942.sh
+ '[' origin/master = origin/1.0.x ']'
+ ./support/jenkins/buildbot.sh
Requirement already satisfied (use --upgrade to upgrade): virtualenv in /usr/lib/python2.7/dist-packages
Cleaning up...
Total errors found: 0
<https://builds.apache.org/job/Mesos-Buildbot/BUILDTOOL=cmake,COMPILER=gcc,CONFIGURATION=--verbose%20--disable-libtool-wrappers,ENVIRONMENT=GLOG_v=1%20MESOS_VERBOSE=1,OS=ubuntu%3A14.04,label_exp=(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23)/ws/src/python/cli_new/.virtualenv/local/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py>:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
<https://builds.apache.org/job/Mesos-Buildbot/BUILDTOOL=cmake,COMPILER=gcc,CONFIGURATION=--verbose%20--disable-libtool-wrappers,ENVIRONMENT=GLOG_v=1%20MESOS_VERBOSE=1,OS=ubuntu%3A14.04,label_exp=(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23)/ws/src/python/cli_new/.virtualenv/local/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py>:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
<https://builds.apache.org/job/Mesos-Buildbot/BUILDTOOL=cmake,COMPILER=gcc,CONFIGURATION=--verbose%20--disable-libtool-wrappers,ENVIRONMENT=GLOG_v=1%20MESOS_VERBOSE=1,OS=ubuntu%3A14.04,label_exp=(ubuntu)&&(!ubuntu-us1)&&(!ubuntu-eu2)&&(!qnode3)&&(!H23)/ws/src/python/cli_new/.virtualenv/local/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py>:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
************* Module apply-reviews
E:169,14: Module 'ssl' has no 'SSLContext' member (no-member)
E:189,19: Unexpected keyword argument 'context' in function call (unexpected-keyword-arg)
Total errors found: 2
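The two pylint errors above (`Module 'ssl' has no 'SSLContext' member`, `Unexpected keyword argument 'context'`) both trace to features introduced in Python 2.7.9: `ssl.SSLContext` and the `context=` keyword of `urlopen()` do not exist on older interpreters such as the one on this Trusty agent (hence also the SNI/InsecurePlatform warnings earlier). A minimal sketch of a feature-guarded pattern, with a hypothetical helper name, not the actual `apply-reviews` code:

```python
import ssl

def make_context():
    """Return a default SSL context where the platform supports it.

    On Python < 2.7.9 the ssl module has neither SSLContext nor
    create_default_context, so we feature-check rather than assume.
    """
    if hasattr(ssl, "SSLContext"):
        return ssl.create_default_context()
    return None  # caller must fall back to a plain urlopen() call
```

A caller would then pass `context=make_context()` only when the result is not `None`, avoiding the unexpected-keyword-arg failure on old platforms.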
Checking 1214 C++ files
Virtualenv not detected... building
Rebuilding virtualenv...
Checking 39 Python files
Build step 'Execute shell' marked build as failure