Posted to builds@mesos.apache.org by Alex Rukletsov <al...@mesosphere.com> on 2017/02/01 02:53:32 UTC

Re: Bug in loop() implementation Was: Build failed in Jenkins: Mesos-Buildbot » autotools,gcc,--verbose,GLOG_v=1 MESOS_VERBOSE=1,centos:7,(docker||Hadoop)&&(!ubuntu-us1)&&(!ubuntu-eu2) #3201

Updated https://issues.apache.org/jira/browse/MESOS-7036 with a "minimal"
test case. I'll think about whether we can find a broader solution for the
described pattern instead of just patching Loop.
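
For anyone following along, the general shape of the problem is an actor
blocking on a future that can only be satisfied by the very actor that is
blocked. Below is a toy, self-contained C++ sketch of that pattern; it is
not the libprocess loop() code, just a hypothetical single-threaded Actor
class illustrating why such a wait can never complete:

  // Toy sketch (NOT libprocess): a single-threaded "actor" that deadlocks
  // by blocking on work that only its own thread can run.
  #include <condition_variable>
  #include <functional>
  #include <future>
  #include <mutex>
  #include <queue>
  #include <thread>

  class Actor {
  public:
    Actor() : worker([this] { run(); }) {}

    ~Actor() {
      {
        std::lock_guard<std::mutex> lock(mutex);
        done = true;
      }
      cv.notify_one();
      worker.join();
    }

    // Enqueue a task; the single worker thread runs tasks in order.
    void dispatch(std::function<void()> task) {
      {
        std::lock_guard<std::mutex> lock(mutex);
        tasks.push(std::move(task));
      }
      cv.notify_one();
    }

  private:
    void run() {
      while (true) {
        std::function<void()> task;
        {
          std::unique_lock<std::mutex> lock(mutex);
          cv.wait(lock, [this] { return done || !tasks.empty(); });
          if (tasks.empty()) return;  // 'done' and nothing left to run.
          task = std::move(tasks.front());
          tasks.pop();
        }
        task();
      }
    }

    std::mutex mutex;
    std::condition_variable cv;
    std::queue<std::function<void()>> tasks;
    bool done = false;
    std::thread worker;
  };

  int main() {
    Actor actor;

    actor.dispatch([&actor] {
      // We are now running on the actor's only thread. Queue more work on
      // the same actor...
      std::promise<void> promise;
      std::future<void> future = promise.get_future();
      actor.dispatch([&promise] { promise.set_value(); });

      // ...and then block on it. The promise can only be fulfilled by the
      // actor's worker thread, which is stuck right here: a deadlock of the
      // "waiting on a process that is currently executing" kind.
      future.wait();
    });

    // The program hangs in ~Actor(): join() never returns because the
    // worker thread is blocked waiting on itself.
    return 0;
  }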

On Wed, Feb 1, 2017 at 2:21 AM, Kevin Klues <kl...@mesosphere.io> wrote:

> AlexR, Gaston, and I sat down this morning and got to the root cause of
> this. It is due to a bug we found in the implementation of loop(). AlexR
> was going to write up a summary and send it out for suggestions on how to
> fix it.
>
> Kevin
>
> On Mon, Jan 30, 2017 at 3:06 PM Greg Mann <gr...@mesosphere.io> wrote:
>
>> Hmmm, I just noticed that I had ignored several of these recently due to
>> the "Build timed out (after 300 minutes). Marking the build as failed."
>> line at the end of the log, thinking that this was an issue on the Jenkins
>> agent. Scrolling up, however, we find:
>>
>> **** DEADLOCK DETECTED! ****
>> You are waiting on process __limiter__(609)@172.17.0.2:52605 that it is
>> currently executing.
>>
>> Searching past ASF CI builds for the string "deadlock detected" shows
>> about one build a day stalling because of this issue. In all cases, it
>> occurs during IO switchboard-related tests, and it's the rate limiter
>> hitting the deadlock. I've filed a ticket to track this:
>> https://issues.apache.org/jira/browse/MESOS-7036
>>
>> Cheers,
>> Greg
>>
>>
>> On Sat, Jan 28, 2017 at 10:06 PM, Apache Jenkins Server <
>> jenkins@builds.apache.org> wrote:
>>
>> See <https://builds.apache.org/job/Mesos-Buildbot/BUILDTOOL=
>> autotools,COMPILER=gcc,CONFIGURATION=--verbose,
>> ENVIRONMENT=GLOG_v=1%20MESOS_VERBOSE=1,OS=centos%3A7,label_
>> exp=(docker%7C%7CHadoop)&&(!ubuntu-us1)&&(!ubuntu-eu2)/3201/changes>
>>
>> Changes:
>>
>> [yujie.jay] Fixed duplicate image layer ids returned by docker store.
>>
>> ------------------------------------------
>> [...truncated 154948 lines...]
>> I0129 01:47:06.737426 31205 master.cpp:3136] Deactivating agent
>> c63921a3-ffc1-4b52-b055-e7d44d4367d1-S0 at slave(587)@172.17.0.2:52605
>> (528a1b6e6459)
>> I0129 01:47:06.737676 31205 hierarchical.cpp:577] Agent
>> c63921a3-ffc1-4b52-b055-e7d44d4367d1-S0 deactivated
>> I0129 01:47:06.744351 31207 master.cpp:1102] Master terminating
>> I0129 01:47:06.745267 31211 hierarchical.cpp:510] Removed agent
>> c63921a3-ffc1-4b52-b055-e7d44d4367d1-S0
>> [       OK ] ContentType/AgentAPITest.LaunchNestedContainerSession/1
>> (607 ms)
>> [ RUN      ] ContentType/AgentAPITest.LaunchNestedContainerSessionWi
>> thTTY/0
>> I0129 01:47:06.754794 31182 cluster.cpp:160] Creating default 'local'
>> authorizer
>> I0129 01:47:06.758325 31208 master.cpp:383] Master
>> 525fac7c-347d-4045-bbe0-8853f70b0864 (528a1b6e6459) started on
>> 172.17.0.2:52605
>> I0129 01:47:06.758352 31208 master.cpp:385] Flags at startup: --acls=""
>> --agent_ping_timeout="15secs" --agent_reregister_timeout="10mins"
>> --allocation_interval="1secs" --allocator="HierarchicalDRF"
>> --authenticate_agents="true" --authenticate_frameworks="true"
>> --authenticate_http_frameworks="true" --authenticate_http_readonly="true"
>> --authenticate_http_readwrite="true" --authenticators="crammd5"
>> --authorizers="local" --credentials="/tmp/uytwMM/credentials"
>> --framework_sorter="drf" --help="false" --hostname_lookup="true"
>> --http_authenticators="basic" --http_framework_authenticators="basic"
>> --initialize_driver_logging="true" --log_auto_initialize="true"
>> --logbufsecs="0" --logging_level="INFO" --max_agent_ping_timeouts="5"
>> --max_completed_frameworks="50" --max_completed_tasks_per_framework="1000"
>> --max_unreachable_tasks_per_framework="1000" --quiet="false"
>> --recovery_agent_removal_limit="100%" --registry="in_memory"
>> --registry_fetch_timeout="1mins" --registry_gc_interval="15mins"
>> --registry_max_agent_age="2weeks" --registry_max_agent_count="102400"
>> --registry_store_timeout="100secs" --registry_strict="false"
>> --root_submissions="true" --user_sorter="drf" --version="false"
>> --webui_dir="/mesos/mesos-1.2.0/_inst/share/mesos/webui"
>> --work_dir="/tmp/uytwMM/master" --zk_session_timeout="10secs"
>> I0129 01:47:06.758709 31208 master.cpp:435] Master only allowing
>> authenticated frameworks to register
>> I0129 01:47:06.758723 31208 master.cpp:449] Master only allowing
>> authenticated agents to register
>> I0129 01:47:06.758728 31208 master.cpp:462] Master only allowing
>> authenticated HTTP frameworks to register
>> I0129 01:47:06.758735 31208 credentials.hpp:37] Loading credentials for
>> authentication from '/tmp/uytwMM/credentials'
>> I0129 01:47:06.758997 31208 master.cpp:507] Using default 'crammd5'
>> authenticator
>> I0129 01:47:06.759146 31208 http.cpp:922] Using default 'basic' HTTP
>> authenticator for realm 'mesos-master-readonly'
>> I0129 01:47:06.759323 31208 http.cpp:922] Using default 'basic' HTTP
>> authenticator for realm 'mesos-master-readwrite'
>> I0129 01:47:06.759435 31208 http.cpp:922] Using default 'basic' HTTP
>> authenticator for realm 'mesos-master-scheduler'
>> I0129 01:47:06.759572 31208 master.cpp:587] Authorization enabled
>> I0129 01:47:06.759949 31211 hierarchical.cpp:151] Initialized
>> hierarchical allocator process
>> I0129 01:47:06.760030 31211 whitelist_watcher.cpp:77] No whitelist given
>> I0129 01:47:06.762802 31211 master.cpp:2123] Elected as the leading
>> master!
>> I0129 01:47:06.762861 31211 master.cpp:1645] Recovering from registrar
>> I0129 01:47:06.762996 31203 registrar.cpp:329] Recovering registrar
>> I0129 01:47:06.763639 31206 registrar.cpp:362] Successfully fetched the
>> registry (0B) in 588032ns
>> I0129 01:47:06.763758 31206 registrar.cpp:461] Applied 1 operations in
>> 22363ns; attempting to update the registry
>> I0129 01:47:06.764998 31206 registrar.cpp:506] Successfully updated the
>> registry in 1.168896ms
>> I0129 01:47:06.765153 31206 registrar.cpp:392] Successfully recovered
>> registrar
>> I0129 01:47:06.765753 31213 master.cpp:1761] Recovered 0 agents from the
>> registry (129B); allowing 10mins for agents to re-register
>> I0129 01:47:06.765774 31211 hierarchical.cpp:178] Skipping recovery of
>> hierarchical allocator: nothing to recover
>> I0129 01:47:06.769099 31182 containerizer.cpp:220] Using isolation:
>> posix/cpu,posix/mem,filesystem/posix,network/cni
>> W0129 01:47:06.769610 31182 backend.cpp:76] Failed to create 'aufs'
>> backend: AufsBackend requires root privileges
>> W0129 01:47:06.769742 31182 backend.cpp:76] Failed to create 'bind'
>> backend: BindBackend requires root privileges
>> I0129 01:47:06.773351 31182 cluster.cpp:446] Creating default 'local'
>> authorizer
>> I0129 01:47:06.775411 31214 slave.cpp:209] Mesos agent started on (588)@
>> 172.17.0.2:52605
>> I0129 01:47:06.775449 31214 slave.cpp:210] Flags at startup: --acls=""
>> --appc_simple_discovery_uri_prefix="http://"
>> --appc_store_dir="/tmp/mesos/store/appc" --authenticate_http_readonly="true"
>> --authenticate_http_readwrite="true" --authenticatee="crammd5"
>> --authentication_backoff_factor="1secs" --authorizer="local"
>> --cgroups_cpu_enable_pids_and_tids_count="false"
>> --cgroups_enable_cfs="false" --cgroups_hierarchy="/sys/fs/cgroup"
>> --cgroups_limit_swap="false" --cgroups_root="mesos" --container_disk_watch_interval="15secs"
>> --containerizers="mesos" --credential="/tmp/ContentType_AgentAPITest_
>> LaunchNestedContainerSessionWithTTY_0_oCWVvz/credential"
>> --default_role="*" --disk_watch_interval="1mins" --docker="docker"
>> --docker_kill_orphans="true" --docker_registry="https://
>> registry-1.docker.io" --docker_remove_delay="6hrs"
>> --docker_socket="/var/run/docker.sock" --docker_stop_timeout="0ns"
>> --docker_store_dir="/tmp/mesos/store/docker" --docker_volume_checkpoint_
>> dir="/var/run/mesos/isolators/docker/volume" --enforce_container_disk_quota="false"
>> --executor_registration_timeout="1mins" --executor_shutdown_grace_period="5secs"
>> --fetcher_cache_dir="/tmp/ContentType_AgentAPITest_
>> LaunchNestedContainerSessionWithTTY_0_oCWVvz/fetch"
>> --fetcher_cache_size="2GB" --frameworks_home="" --gc_delay="1weeks"
>> --gc_disk_headroom="0.1" --hadoop_home="" --help="false"
>> --hostname_lookup="true" --http_authenticators="basic"
>> --http_command_executor="false" --http_credentials="/tmp/
>> ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_oCWVvz/http_credentials" --http_heartbeat_interval="30secs"
>> --image_provisioner_backend="copy" --initialize_driver_logging="true"
>> --isolation="posix/cpu,posix/mem" --launcher="posix"
>> --launcher_dir="/mesos/mesos-1.2.0/_build/src" --logbufsecs="0"
>> --logging_level="INFO" --max_completed_executors_per_framework="150"
>> --oversubscribed_resources_interval="15secs" --perf_duration="10secs"
>> --perf_interval="1mins" --qos_correction_interval_min="0ns"
>> --quiet="false" --recover="reconnect" --recovery_timeout="15mins"
>> --registration_backoff_factor="10ms" --resources="cpus:2;gpus:0;
>> mem:1024;disk:1024;ports:[31000-32000]" --revocable_cpu_low_priority="true"
>> --runtime_dir="/tmp/ContentType_AgentAPITest_
>> LaunchNestedContainerSessionWithTTY_0_oCWVvz" --sandbox_directory="/mnt/mesos/sandbox"
>> --strict="true" --switch_user="true" --systemd_enable_support="true"
>> --systemd_runtime_directory="/run/systemd/system" --version="false"
>> --work_dir="/tmp/ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_6Bxd9F"
>> I0129 01:47:06.776271 31214 credentials.hpp:86] Loading credential for
>> authentication from '/tmp/ContentType_AgentAPITest_
>> LaunchNestedContainerSessionWithTTY_0_oCWVvz/credential'
>> I0129 01:47:06.776535 31214 slave.cpp:352] Agent using credential for:
>> test-principal
>> I0129 01:47:06.776564 31214 credentials.hpp:37] Loading credentials for
>> authentication from '/tmp/ContentType_AgentAPITest_
>> LaunchNestedContainerSessionWithTTY_0_oCWVvz/http_credentials'
>> I0129 01:47:06.776891 31182 sched.cpp:232] Version: 1.2.0
>> I0129 01:47:06.776911 31214 http.cpp:922] Using default 'basic' HTTP
>> authenticator for realm 'mesos-agent-readonly'
>> I0129 01:47:06.777385 31214 http.cpp:922] Using default 'basic' HTTP
>> authenticator for realm 'mesos-agent-readwrite'
>> I0129 01:47:06.777791 31201 sched.cpp:336] New master detected at
>> master@172.17.0.2:52605
>> I0129 01:47:06.777900 31201 sched.cpp:407] Authenticating with master
>> master@172.17.0.2:52605
>> I0129 01:47:06.777925 31201 sched.cpp:414] Using default CRAM-MD5
>> authenticatee
>> I0129 01:47:06.778170 31204 authenticatee.cpp:121] Creating new client
>> SASL connection
>> I0129 01:47:06.778597 31212 master.cpp:7018] Authenticating
>> scheduler-2a6cfd36-0147-4321-b401-67cd97da2acb@172.17.0.2:52605
>> I0129 01:47:06.778734 31204 authenticator.cpp:414] Starting
>> authentication session for crammd5-authenticatee(1145)@172.17.0.2:52605
>> I0129 01:47:06.779062 31212 authenticator.cpp:98] Creating new server
>> SASL connection
>> I0129 01:47:06.779296 31204 authenticatee.cpp:213] Received SASL
>> authentication mechanisms: CRAM-MD5
>> I0129 01:47:06.779405 31204 authenticatee.cpp:239] Attempting to
>> authenticate with mechanism 'CRAM-MD5'
>> I0129 01:47:06.779362 31214 slave.cpp:539] Agent resources: cpus(*):2;
>> mem(*):1024; disk(*):1024; ports(*):[31000-32000]
>> I0129 01:47:06.779520 31214 slave.cpp:547] Agent attributes: [  ]
>> I0129 01:47:06.779531 31204 authenticator.cpp:204] Received SASL
>> authentication start
>> I0129 01:47:06.779657 31214 slave.cpp:552] Agent hostname: 528a1b6e6459
>> I0129 01:47:06.779726 31204 authenticator.cpp:326] Authentication
>> requires more steps
>> I0129 01:47:06.779829 31204 authenticatee.cpp:259] Received SASL
>> authentication step
>> I0129 01:47:06.779862 31212 status_update_manager.cpp:177] Pausing
>> sending status updates
>> I0129 01:47:06.780160 31204 authenticator.cpp:232] Received SASL
>> authentication step
>> I0129 01:47:06.780194 31204 auxprop.cpp:109] Request to lookup properties
>> for user: 'test-principal' realm: '528a1b6e6459' server FQDN:
>> '528a1b6e6459' SASL_AUXPROP_VERIFY_AGAINST_HASH: false
>> SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: false
>> I0129 01:47:06.780218 31204 auxprop.cpp:181] Looking up auxiliary
>> property '*userPassword'
>> I0129 01:47:06.780268 31204 auxprop.cpp:181] Looking up auxiliary
>> property '*cmusaslsecretCRAM-MD5'
>> I0129 01:47:06.780306 31204 auxprop.cpp:109] Request to lookup properties
>> for user: 'test-principal' realm: '528a1b6e6459' server FQDN:
>> '528a1b6e6459' SASL_AUXPROP_VERIFY_AGAINST_HASH: false
>> SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: true
>> I0129 01:47:06.780328 31204 auxprop.cpp:131] Skipping auxiliary property
>> '*userPassword' since SASL_AUXPROP_AUTHZID == true
>> I0129 01:47:06.780344 31204 auxprop.cpp:131] Skipping auxiliary property
>> '*cmusaslsecretCRAM-MD5' since SASL_AUXPROP_AUTHZID == true
>> I0129 01:47:06.780365 31204 authenticator.cpp:318] Authentication success
>> I0129 01:47:06.780539 31201 authenticatee.cpp:299] Authentication success
>> I0129 01:47:06.780690 31205 master.cpp:7048] Successfully authenticated
>> principal 'test-principal' at scheduler-2a6cfd36-0147-4321-
>> b401-67cd97da2acb@172.17.0.2:52605
>> I0129 01:47:06.780844 31201 sched.cpp:513] Successfully authenticated
>> with master master@172.17.0.2:52605
>> I0129 01:47:06.780866 31201 sched.cpp:836] Sending SUBSCRIBE call to
>> master@172.17.0.2:52605
>> I0129 01:47:06.780889 31213 authenticator.cpp:432] Authentication session
>> cleanup for crammd5-authenticatee(1145)@172.17.0.2:52605
>> I0129 01:47:06.780980 31201 sched.cpp:869] Will retry registration in
>> 1.756982731secs if necessary
>> I0129 01:47:06.781146 31205 master.cpp:2736] Received SUBSCRIBE call for
>> framework 'default' at scheduler-2a6cfd36-0147-4321-
>> b401-67cd97da2acb@172.17.0.2:52605
>> I0129 01:47:06.781214 31205 master.cpp:2159] Authorizing framework
>> principal 'test-principal' to receive offers for role '*'
>> I0129 01:47:06.781610 31201 master.cpp:2812] Subscribing framework
>> default with checkpointing disabled and capabilities [  ]
>> I0129 01:47:06.782166 31201 sched.cpp:759] Framework registered with
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:06.782222 31201 sched.cpp:773] Scheduler::registered took
>> 26355ns
>> I0129 01:47:06.782238 31207 hierarchical.cpp:271] Added framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:06.782414 31207 hierarchical.cpp:1675] No allocations
>> performed
>> I0129 01:47:06.782450 31207 hierarchical.cpp:1770] No inverse offers to
>> send out!
>> I0129 01:47:06.782518 31207 hierarchical.cpp:1279] Performed allocation
>> for 0 agents in 159756ns
>> I0129 01:47:06.784039 31210 state.cpp:60] Recovering state from
>> '/tmp/ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_6Bxd9F/meta'
>> I0129 01:47:06.784333 31205 status_update_manager.cpp:203] Recovering
>> status update manager
>> I0129 01:47:06.784631 31202 containerizer.cpp:599] Recovering
>> containerizer
>> I0129 01:47:06.785992 31207 provisioner.cpp:251] Provisioner recovery
>> complete
>> I0129 01:47:06.786329 31205 slave.cpp:5422] Finished recovery
>> I0129 01:47:06.786854 31205 slave.cpp:5596] Querying resource estimator
>> for oversubscribable resources
>> I0129 01:47:06.787104 31209 status_update_manager.cpp:177] Pausing
>> sending status updates
>> I0129 01:47:06.787161 31201 slave.cpp:929] New master detected at
>> master@172.17.0.2:52605
>> I0129 01:47:06.787448 31201 slave.cpp:964] Detecting new master
>> I0129 01:47:06.787632 31201 slave.cpp:5610] Received oversubscribable
>> resources {} from the resource estimator
>> I0129 01:47:06.793741 31212 slave.cpp:991] Authenticating with master
>> master@172.17.0.2:52605
>> I0129 01:47:06.793848 31212 slave.cpp:1002] Using default CRAM-MD5
>> authenticatee
>> I0129 01:47:06.794116 31204 authenticatee.cpp:121] Creating new client
>> SASL connection
>> I0129 01:47:06.794420 31211 master.cpp:7018] Authenticating slave(588)@
>> 172.17.0.2:52605
>> I0129 01:47:06.794562 31206 authenticator.cpp:414] Starting
>> authentication session for crammd5-authenticatee(1146)@172.17.0.2:52605
>> I0129 01:47:06.794765 31214 authenticator.cpp:98] Creating new server
>> SASL connection
>> I0129 01:47:06.795016 31206 authenticatee.cpp:213] Received SASL
>> authentication mechanisms: CRAM-MD5
>> I0129 01:47:06.795047 31206 authenticatee.cpp:239] Attempting to
>> authenticate with mechanism 'CRAM-MD5'
>> I0129 01:47:06.795155 31214 authenticator.cpp:204] Received SASL
>> authentication start
>> I0129 01:47:06.795217 31214 authenticator.cpp:326] Authentication
>> requires more steps
>> I0129 01:47:06.795346 31206 authenticatee.cpp:259] Received SASL
>> authentication step
>> I0129 01:47:06.795459 31206 authenticator.cpp:232] Received SASL
>> authentication step
>> I0129 01:47:06.795496 31206 auxprop.cpp:109] Request to lookup properties
>> for user: 'test-principal' realm: '528a1b6e6459' server FQDN:
>> '528a1b6e6459' SASL_AUXPROP_VERIFY_AGAINST_HASH: false
>> SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: false
>> I0129 01:47:06.795519 31206 auxprop.cpp:181] Looking up auxiliary
>> property '*userPassword'
>> I0129 01:47:06.795562 31206 auxprop.cpp:181] Looking up auxiliary
>> property '*cmusaslsecretCRAM-MD5'
>> I0129 01:47:06.795594 31206 auxprop.cpp:109] Request to lookup properties
>> for user: 'test-principal' realm: '528a1b6e6459' server FQDN:
>> '528a1b6e6459' SASL_AUXPROP_VERIFY_AGAINST_HASH: false
>> SASL_AUXPROP_OVERRIDE: false SASL_AUXPROP_AUTHZID: true
>> I0129 01:47:06.795611 31206 auxprop.cpp:131] Skipping auxiliary property
>> '*userPassword' since SASL_AUXPROP_AUTHZID == true
>> I0129 01:47:06.795624 31206 auxprop.cpp:131] Skipping auxiliary property
>> '*cmusaslsecretCRAM-MD5' since SASL_AUXPROP_AUTHZID == true
>> I0129 01:47:06.795645 31206 authenticator.cpp:318] Authentication success
>> I0129 01:47:06.795729 31214 authenticatee.cpp:299] Authentication success
>> I0129 01:47:06.795822 31206 master.cpp:7048] Successfully authenticated
>> principal 'test-principal' at slave(588)@172.17.0.2:52605
>> I0129 01:47:06.795853 31214 authenticator.cpp:432] Authentication session
>> cleanup for crammd5-authenticatee(1146)@172.17.0.2:52605
>> I0129 01:47:06.796175 31206 slave.cpp:1086] Successfully authenticated
>> with master master@172.17.0.2:52605
>> I0129 01:47:06.796370 31206 slave.cpp:1508] Will retry registration in
>> 13.136239ms if necessary
>> I0129 01:47:06.796613 31214 master.cpp:5303] Registering agent at
>> slave(588)@172.17.0.2:52605 (528a1b6e6459) with id
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0
>> I0129 01:47:06.797128 31209 registrar.cpp:461] Applied 1 operations in
>> 72880ns; attempting to update the registry
>> I0129 01:47:06.799134 31209 registrar.cpp:506] Successfully updated the
>> registry in 1920us
>> I0129 01:47:06.800021 31203 slave.cpp:4286] Received ping from
>> slave-observer(541)@172.17.0.2:52605
>> I0129 01:47:06.800220 31206 master.cpp:5377] Registered agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 at slave(588)@172.17.0.2:52605
>> (528a1b6e6459) with cpus(*):2; mem(*):1024; disk(*):1024;
>> ports(*):[31000-32000]
>> I0129 01:47:06.800516 31205 hierarchical.cpp:478] Added agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 (528a1b6e6459) with cpus(*):2;
>> mem(*):1024; disk(*):1024; ports(*):[31000-32000] (allocated: {})
>> I0129 01:47:06.801389 31205 hierarchical.cpp:1770] No inverse offers to
>> send out!
>> I0129 01:47:06.801465 31205 hierarchical.cpp:1302] Performed allocation
>> for agent 525fac7c-347d-4045-bbe0-8853f70b0864-S0 in 907380ns
>> I0129 01:47:06.801585 31203 slave.cpp:1132] Registered with master
>> master@172.17.0.2:52605; given agent ID 525fac7c-347d-4045-bbe0-
>> 8853f70b0864-S0
>> I0129 01:47:06.801614 31203 fetcher.cpp:90] Clearing fetcher cache
>> I0129 01:47:06.801818 31214 status_update_manager.cpp:184] Resuming
>> sending status updates
>> I0129 01:47:06.801955 31201 master.cpp:6847] Sending 1 offers to
>> framework 525fac7c-347d-4045-bbe0-8853f70b0864-0000 (default) at
>> scheduler-2a6cfd36-0147-4321-b401-67cd97da2acb@172.17.0.2:52605
>> I0129 01:47:06.802053 31203 slave.cpp:1160] Checkpointing SlaveInfo to
>> '/tmp/ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_6Bxd9F/meta/slaves/525fac7c-347d-4045-bbe0-8853f70b0864-S0/
>> slave.info'
>> I0129 01:47:06.802398 31203 slave.cpp:1198] Forwarding total
>> oversubscribed resources {}
>> I0129 01:47:06.802531 31201 sched.cpp:933] Scheduler::resourceOffers took
>> 138849ns
>> I0129 01:47:06.802541 31203 master.cpp:5876] Received update of agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 at slave(588)@172.17.0.2:52605
>> (528a1b6e6459) with total oversubscribed resources {}
>> I0129 01:47:06.803011 31212 hierarchical.cpp:548] Agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 (528a1b6e6459) updated with
>> oversubscribed resources {} (total: cpus(*):2; mem(*):1024; disk(*):1024;
>> ports(*):[31000-32000], allocated: cpus(*):2; mem(*):1024; disk(*):1024;
>> ports(*):[31000-32000])
>> I0129 01:47:06.803258 31212 hierarchical.cpp:1675] No allocations
>> performed
>> I0129 01:47:06.803334 31212 hierarchical.cpp:1770] No inverse offers to
>> send out!
>> I0129 01:47:06.803416 31212 hierarchical.cpp:1302] Performed allocation
>> for agent 525fac7c-347d-4045-bbe0-8853f70b0864-S0 in 306702ns
>> I0129 01:47:06.804782 31201 master.cpp:3730] Processing ACCEPT call for
>> offers: [ 525fac7c-347d-4045-bbe0-8853f70b0864-O0 ] on agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 at slave(588)@172.17.0.2:52605
>> (528a1b6e6459) for framework 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> (default) at scheduler-2a6cfd36-0147-4321-b401-67cd97da2acb@172.17.0.2:
>> 52605
>> I0129 01:47:06.804949 31201 master.cpp:3318] Authorizing framework
>> principal 'test-principal' to launch task b7415522-fc0f-44d8-b4cb-
>> f4105d7c9458
>> I0129 01:47:06.808382 31201 master.cpp:8900] Adding task
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458 with resources cpus(*):2;
>> mem(*):1024; disk(*):1024; ports(*):[31000-32000] on agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 at slave(588)@172.17.0.2:52605
>> (528a1b6e6459)
>> I0129 01:47:06.808854 31201 master.cpp:4380] Launching task
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 (default) at
>> scheduler-2a6cfd36-0147-4321-b401-67cd97da2acb@172.17.0.2:52605 with
>> resources cpus(*):2; mem(*):1024; disk(*):1024; ports(*):[31000-32000] on
>> agent 525fac7c-347d-4045-bbe0-8853f70b0864-S0 at slave(588)@
>> 172.17.0.2:52605 (528a1b6e6459)
>> I0129 01:47:06.809702 31209 slave.cpp:1576] Got assigned task
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' for framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:06.810569 31209 slave.cpp:1736] Launching task
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' for framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:06.811465 31209 paths.cpp:547] Trying to chown
>> '/tmp/ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_6Bxd9F/slaves/525fac7c-347d-4045-bbe0-8853f70b0864-S0/frameworks/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000/executors/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458/runs/0e304f6c-2f13-48a2-9e31-fb31e77836ae'
>> to user 'mesos'
>> I0129 01:47:06.811852 31209 slave.cpp:6331] Launching executor
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 with resources cpus(*):0.1;
>> mem(*):32 in work directory '/tmp/ContentType_AgentAPITest_
>> LaunchNestedContainerSessionWithTTY_0_6Bxd9F/slaves/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0/frameworks/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000/executors/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458/runs/0e304f6c-
>> 2f13-48a2-9e31-fb31e77836ae'
>> I0129 01:47:06.812742 31208 containerizer.cpp:992] Starting container
>> 0e304f6c-2f13-48a2-9e31-fb31e77836ae for executor
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:06.812901 31209 slave.cpp:2058] Queued task
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' for executor
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:06.812989 31209 slave.cpp:882] Successfully attached file
>> '/tmp/ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_6Bxd9F/slaves/525fac7c-347d-4045-bbe0-8853f70b0864-S0/frameworks/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000/executors/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458/runs/0e304f6c-
>> 2f13-48a2-9e31-fb31e77836ae'
>> I0129 01:47:06.817917 31215 containerizer.cpp:1541] Launching
>> 'mesos-containerizer' with flags '--help="false"
>> --launch_info="{"command":{"arguments":["mesos-executor","
>> --launcher_dir=\/mesos\/mesos-1.2.0\/_build\/src"],"shell":
>> false,"value":"\/mesos\/mesos-1.2.0\/_build\/src\/mesos-
>> executor"},"environment":{"variables":[{"name":"
>> LIBPROCESS_PORT","value":"0"},{"name":"MESOS_AGENT_ENDPOINT","value":"
>> 172.17.0.2:52605"},{"name":"MESOS_CHECKPOINT","
>> value":"0"},{"name":"MESOS_DIRECTORY","value":"\/tmp\/
>> ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_6Bxd9F\/slaves\/525fac7c-347d-4045-bbe0-
>> 8853f70b0864-S0\/frameworks\/525fac7c-347d-4045-bbe0-
>> 8853f70b0864-0000\/executors\/b7415522-fc0f-44d8-b4cb-
>> f4105d7c9458\/runs\/0e304f6c-2f13-48a2-9e31-fb31e77836ae"},
>> {"name":"MESOS_EXECUTOR_ID","value":"b7415522-fc0f-44d8-
>> b4cb-f4105d7c9458"},{"name":"MESOS_EXECUTOR_SHUTDOWN_GRACE_
>> PERIOD","value":"5secs"},{"name":"MESOS_FRAMEWORK_ID","
>> value":"525fac7c-347d-4045-bbe0-8853f70b0864-0000"},{"
>> name":"MESOS_HTTP_COMMAND_EXECUTOR","value":"0"},{"name"
>> :"MESOS_SLAVE_ID","value":"525fac7c-347d-4045-bbe0-
>> 8853f70b0864-S0"},{"name":"MESOS_SLAVE_PID","value":"slave(588)@
>> 172.17.0.2:52605"},{"name":"MESOS_SANDBOX","value":"\/tmp\/ContentType_
>> AgentAPITest_LaunchNestedContainerSessionWithTTY_0_6Bxd9F\/slaves\/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0\/frameworks\/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000\/executors\/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458\/runs\/0e304f6c-
>> 2f13-48a2-9e31-fb31e77836ae"}]},"err":{"path":"\/tmp\/
>> ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_6Bxd9F\/slaves\/525fac7c-347d-4045-bbe0-
>> 8853f70b0864-S0\/frameworks\/525fac7c-347d-4045-bbe0-
>> 8853f70b0864-0000\/executors\/b7415522-fc0f-44d8-b4cb-
>> f4105d7c9458\/runs\/0e304f6c-2f13-48a2-9e31-fb31e77836ae\/
>> stderr","type":"PATH"},"out":{"path":"\/tmp\/ContentType_AgentAPITest_
>> LaunchNestedContainerSessionWithTTY_0_6Bxd9F\/slaves\/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0\/frameworks\/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000\/executors\/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458\/runs\/0e304f6c-
>> 2f13-48a2-9e31-fb31e77836ae\/stdout","type":"PATH"},"user":
>> "mesos","working_directory":"\/tmp\/ContentType_AgentAPITest_
>> LaunchNestedContainerSessionWithTTY_0_6Bxd9F\/slaves\/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0\/frameworks\/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000\/executors\/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458\/runs\/0e304f6c-2f13-48a2-9e31-fb31e77836ae"}"
>> --pipe_read="15" --pipe_write="17" --runtime_directory="/tmp/
>> ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_oCWVvz/containers/0e304f6c-2f13-48a2-9e31-fb31e77836ae"
>> --unshare_namespace_mnt="false"'
>> I0129 01:47:06.823396 31215 launcher.cpp:135] Forked child with pid
>> '10932' for container '0e304f6c-2f13-48a2-9e31-fb31e77836ae'
>> I0129 01:47:06.828348 31211 fetcher.cpp:349] Starting to fetch URIs for
>> container: 0e304f6c-2f13-48a2-9e31-fb31e77836ae, directory:
>> /tmp/ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_6Bxd9F/slaves/525fac7c-347d-4045-bbe0-8853f70b0864-S0/frameworks/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000/executors/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458/runs/0e304f6c-
>> 2f13-48a2-9e31-fb31e77836ae
>> I0129 01:47:07.005422 31206 slave.cpp:3325] Got registration for executor
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 from executor(1)@
>> 172.17.0.2:43889
>> I0129 01:47:07.008360 31216 slave.cpp:2271] Sending queued task
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' to executor
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 at executor(1)@172.17.0.2:43889
>> I0129 01:47:07.024600 31210 slave.cpp:3756] Handling status update
>> TASK_RUNNING (UUID: 63796167-fc5a-4bd4-8f16-41e24e015823) for task
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 from executor(1)@
>> 172.17.0.2:43889
>> I0129 01:47:07.027077 31203 status_update_manager.cpp:323] Received
>> status update TASK_RUNNING (UUID: 63796167-fc5a-4bd4-8f16-41e24e015823)
>> for task b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.027180 31203 status_update_manager.cpp:500] Creating
>> StatusUpdate stream for task b7415522-fc0f-44d8-b4cb-f4105d7c9458 of
>> framework 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.027844 31203 status_update_manager.cpp:377] Forwarding
>> update TASK_RUNNING (UUID: 63796167-fc5a-4bd4-8f16-41e24e015823) for
>> task b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 to the agent
>> I0129 01:47:07.028298 31213 slave.cpp:4196] Forwarding the update
>> TASK_RUNNING (UUID: 63796167-fc5a-4bd4-8f16-41e24e015823) for task
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 to master@172.17.0.2:52605
>> I0129 01:47:07.028584 31213 slave.cpp:4090] Status update manager
>> successfully handled status update TASK_RUNNING (UUID:
>> 63796167-fc5a-4bd4-8f16-41e24e015823) for task b7415522-fc0f-44d8-b4cb-f4105d7c9458
>> of framework 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.028638 31213 slave.cpp:4106] Sending acknowledgement for
>> status update TASK_RUNNING (UUID: 63796167-fc5a-4bd4-8f16-41e24e015823)
>> for task b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 to executor(1)@172.17.0.2:43889
>> I0129 01:47:07.028801 31207 master.cpp:6021] Status update TASK_RUNNING
>> (UUID: 63796167-fc5a-4bd4-8f16-41e24e015823) for task
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 from agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 at slave(588)@172.17.0.2:52605
>> (528a1b6e6459)
>> I0129 01:47:07.028880 31207 master.cpp:6089] Forwarding status update
>> TASK_RUNNING (UUID: 63796167-fc5a-4bd4-8f16-41e24e015823) for task
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.029150 31207 master.cpp:8155] Updating the state of task
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 (latest state: TASK_RUNNING,
>> status update state: TASK_RUNNING)
>> I0129 01:47:07.029597 31214 sched.cpp:1041] Scheduler::statusUpdate took
>> 190198ns
>> I0129 01:47:07.030319 31212 master.cpp:5017] Processing ACKNOWLEDGE call
>> 63796167-fc5a-4bd4-8f16-41e24e015823 for task b7415522-fc0f-44d8-b4cb-f4105d7c9458
>> of framework 525fac7c-347d-4045-bbe0-8853f70b0864-0000 (default) at
>> scheduler-2a6cfd36-0147-4321-b401-67cd97da2acb@172.17.0.2:52605 on agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0
>> I0129 01:47:07.030860 31204 status_update_manager.cpp:395] Received
>> status update acknowledgement (UUID: 63796167-fc5a-4bd4-8f16-41e24e015823)
>> for task b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.031313 31201 slave.cpp:3045] Status update manager
>> successfully handled status update acknowledgement (UUID:
>> 63796167-fc5a-4bd4-8f16-41e24e015823) for task b7415522-fc0f-44d8-b4cb-f4105d7c9458
>> of framework 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.034726 31204 process.cpp:3697] Handling HTTP event for
>> process 'slave(588)' with path: '/slave(588)/api/v1'
>> I0129 01:47:07.036304 31202 http.cpp:303] HTTP POST for
>> /slave(588)/api/v1 from 172.17.0.2:58766
>> I0129 01:47:07.036962 31202 http.cpp:464] Processing call
>> LAUNCH_NESTED_CONTAINER_SESSION
>> I0129 01:47:07.037560 31210 containerizer.cpp:1808] Starting nested
>> container 0e304f6c-2f13-48a2-9e31-fb31e77836ae.1d1fcfd9-5aaf-
>> 493b-b237-259eaf882577
>> I0129 01:47:07.038013 31210 containerizer.cpp:1832] Trying to chown
>> '/tmp/ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_6Bxd9F/slaves/525fac7c-347d-4045-bbe0-8853f70b0864-S0/frameworks/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000/executors/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458/runs/0e304f6c-
>> 2f13-48a2-9e31-fb31e77836ae/containers/1d1fcfd9-5aaf-493b-b237-259eaf882577'
>> to user 'mesos'
>> I0129 01:47:07.041610 31202 switchboard.cpp:434] Allocated pseudo
>> terminal '/dev/pts/1' for container 0e304f6c-2f13-48a2-9e31-
>> fb31e77836ae.1d1fcfd9-5aaf-493b-b237-259eaf882577
>> I0129 01:47:07.042568 31202 switchboard.cpp:571] Launching
>> 'mesos-io-switchboard' with flags '--heartbeat_interval="30secs"
>> --help="false" --socket_address="/tmp/mesos-io-switchboard-1e11f86a-125b-4769-97bc-3b2cd0733dcc"
>> --stderr_from_fd="21" --stderr_to_fd="2" --stdin_to_fd="21"
>> --stdout_from_fd="21" --stdout_to_fd="1" --tty="true"
>> --wait_for_connection="true"' for container 0e304f6c-2f13-48a2-9e31-
>> fb31e77836ae.1d1fcfd9-5aaf-493b-b237-259eaf882577
>> I0129 01:47:07.049176 31202 switchboard.cpp:601] Created I/O switchboard
>> server (pid: 10991) listening on socket file '/tmp/mesos-io-switchboard-
>> 1e11f86a-125b-4769-97bc-3b2cd0733dcc' for container
>> 0e304f6c-2f13-48a2-9e31-fb31e77836ae.1d1fcfd9-5aaf-493b-b237-259eaf882577
>> I0129 01:47:07.051885 31205 containerizer.cpp:1541] Launching
>> 'mesos-containerizer' with flags '--help="false"
>> --launch_info="{"command":{"shell":true,"value":"printf output && printf
>> error 1>&2"},"environment":{},"err":{"fd":22,"type":"FD"},"in":{"
>> fd":22,"type":"FD"},"out":{"fd":22,"type":"FD"},"tty_
>> slave_path":"\/dev\/pts\/1","user":"mesos"}" --pipe_read="21"
>> --pipe_write="42" --runtime_directory="/tmp/ContentType_AgentAPITest_
>> LaunchNestedContainerSessionWithTTY_0_oCWVvz/containers/
>> 0e304f6c-2f13-48a2-9e31-fb31e77836ae/containers/1d1fcfd9-5aaf-493b-b237-259eaf882577"
>> --unshare_namespace_mnt="false"'
>> I0129 01:47:07.057840 31205 launcher.cpp:135] Forked child with pid
>> '10995' for container '0e304f6c-2f13-48a2-9e31-
>> fb31e77836ae.1d1fcfd9-5aaf-493b-b237-259eaf882577'
>> I0129 01:47:07.062804 31209 fetcher.cpp:349] Starting to fetch URIs for
>> container: 0e304f6c-2f13-48a2-9e31-fb31e77836ae.1d1fcfd9-5aaf-493b-b237-259eaf882577,
>> directory: /tmp/ContentType_AgentAPITest_LaunchNestedContainerSessionWi
>> thTTY_0_6Bxd9F/slaves/525fac7c-347d-4045-bbe0-8853f70b0864-S0/frameworks/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000/executors/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458/runs/0e304f6c-
>> 2f13-48a2-9e31-fb31e77836ae/containers/1d1fcfd9-5aaf-493b-
>> b237-259eaf882577
>>
>> **** DEADLOCK DETECTED! ****
>> You are waiting on process __limiter__(609)@172.17.0.2:52605 that it is
>> currently executing.
>> I0129 01:47:07.180491 31215 http.cpp:2647] Received EOF attach response
>> for 0e304f6c-2f13-48a2-9e31-fb31e77836ae.1d1fcfd9-5aaf-
>> 493b-b237-259eaf882577
>> I0129 01:47:07.181474 31211 containerizer.cpp:2119] Destroying container
>> 0e304f6c-2f13-48a2-9e31-fb31e77836ae.1d1fcfd9-5aaf-493b-b237-259eaf882577
>> in RUNNING state
>> I0129 01:47:07.182045 31211 launcher.cpp:151] Asked to destroy container
>> 0e304f6c-2f13-48a2-9e31-fb31e77836ae.1d1fcfd9-5aaf-493b-b237-259eaf882577
>> W0129 01:47:07.183346 31215 http.cpp:2662] Launch nested container
>> session connection for container 0e304f6c-2f13-48a2-9e31-
>> fb31e77836ae.1d1fcfd9-5aaf-493b-b237-259eaf882577 closed
>> I0129 01:47:07.187363 31182 sched.cpp:2021] Asked to stop the driver
>> I0129 01:47:07.187564 31214 sched.cpp:1203] Stopping framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.188474 31211 master.cpp:7560] Processing TEARDOWN call for
>> framework 525fac7c-347d-4045-bbe0-8853f70b0864-0000 (default) at
>> scheduler-2a6cfd36-0147-4321-b401-67cd97da2acb@172.17.0.2:52605
>> I0129 01:47:07.188658 31211 master.cpp:7572] Removing framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 (default) at
>> scheduler-2a6cfd36-0147-4321-b401-67cd97da2acb@172.17.0.2:52605
>> I0129 01:47:07.188823 31211 master.cpp:3080] Deactivating framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 (default) at
>> scheduler-2a6cfd36-0147-4321-b401-67cd97da2acb@172.17.0.2:52605
>> I0129 01:47:07.189213 31207 hierarchical.cpp:386] Deactivated framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.189306 31203 slave.cpp:2598] Asked to shut down framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 by master@172.17.0.2:52605
>> I0129 01:47:07.189357 31203 slave.cpp:2623] Shutting down framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.189411 31203 slave.cpp:5005] Shutting down executor
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 at executor(1)@172.17.0.2:43889
>> I0129 01:47:07.189401 31211 master.cpp:8155] Updating the state of task
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458 of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 (latest state: TASK_KILLED,
>> status update state: TASK_KILLED)
>> I0129 01:47:07.190676 31204 containerizer.cpp:2119] Destroying container
>> 0e304f6c-2f13-48a2-9e31-fb31e77836ae in RUNNING state
>> I0129 01:47:07.190798 31207 hierarchical.cpp:1011] Recovered cpus(*):2;
>> mem(*):1024; disk(*):1024; ports(*):[31000-32000] (total: cpus(*):2;
>> mem(*):1024; disk(*):1024; ports(*):[31000-32000], allocated: {}) on agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 from framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.191231 31211 master.cpp:8249] Removing task
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458 with resources cpus(*):2;
>> mem(*):1024; disk(*):1024; ports(*):[31000-32000] of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 on agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 at slave(588)@172.17.0.2:52605
>> (528a1b6e6459)
>> I0129 01:47:07.192201 31211 hierarchical.cpp:337] Removed framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> E0129 01:47:07.234925 31201 switchboard.cpp:889] Unexpected termination
>> of I/O switchboard server: 'IOSwitchboard' exited with status 1 for
>> container 0e304f6c-2f13-48a2-9e31-fb31e77836ae.1d1fcfd9-5aaf-
>> 493b-b237-259eaf882577
>> I0129 01:47:07.235450 31201 containerizer.cpp:2482] Container
>> 0e304f6c-2f13-48a2-9e31-fb31e77836ae.1d1fcfd9-5aaf-493b-b237-259eaf882577
>> has exited
>> I0129 01:47:07.237144 31205 provisioner.cpp:322] Ignoring destroy request
>> for unknown container 0e304f6c-2f13-48a2-9e31-fb31e77836ae.1d1fcfd9-5aaf-
>> 493b-b237-259eaf882577
>> I0129 01:47:07.237336 31210 containerizer.cpp:2398] Checkpointing
>> termination state to nested container's runtime directory '/tmp/ContentType_
>> AgentAPITest_LaunchNestedContainerSessionWithTTY_0_oCWVvz/containers/
>> 0e304f6c-2f13-48a2-9e31-fb31e77836ae/containers/1d1fcfd9-5aaf-493b-b237-
>> 259eaf882577/termination'
>> I0129 01:47:07.238113 31210 launcher.cpp:151] Asked to destroy container
>> 0e304f6c-2f13-48a2-9e31-fb31e77836ae
>> I0129 01:47:07.249295 31216 slave.cpp:4328] Got exited event for
>> executor(1)@172.17.0.2:43889
>> I0129 01:47:07.337250 31202 containerizer.cpp:2482] Container
>> 0e304f6c-2f13-48a2-9e31-fb31e77836ae has exited
>> I0129 01:47:07.340733 31204 provisioner.cpp:322] Ignoring destroy request
>> for unknown container 0e304f6c-2f13-48a2-9e31-fb31e77836ae
>> I0129 01:47:07.342382 31204 slave.cpp:4691] Executor
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 terminated with signal Killed
>> I0129 01:47:07.342463 31204 slave.cpp:4791] Cleaning up executor
>> 'b7415522-fc0f-44d8-b4cb-f4105d7c9458' of framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000 at executor(1)@172.17.0.2:43889
>> I0129 01:47:07.342941 31204 slave.cpp:4879] Cleaning up framework
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.343423 31204 gc.cpp:55] Scheduling '/tmp/ContentType_
>> AgentAPITest_LaunchNestedContainerSessionWithTTY_0_6Bxd9F/slaves/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0/frameworks/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000/executors/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458/runs/0e304f6c-2f13-48a2-9e31-fb31e77836ae'
>> for gc 6.99999603365926days in the future
>> I0129 01:47:07.343621 31204 gc.cpp:55] Scheduling '/tmp/ContentType_
>> AgentAPITest_LaunchNestedContainerSessionWithTTY_0_6Bxd9F/slaves/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0/frameworks/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000/executors/
>> b7415522-fc0f-44d8-b4cb-f4105d7c9458' for gc 6.99999603146074days in the
>> future
>> I0129 01:47:07.343751 31204 gc.cpp:55] Scheduling '/tmp/ContentType_
>> AgentAPITest_LaunchNestedContainerSessionWithTTY_0_6Bxd9F/slaves/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0/frameworks/
>> 525fac7c-347d-4045-bbe0-8853f70b0864-0000' for gc 6.9999960292237days in
>> the future
>> I0129 01:47:07.343873 31204 status_update_manager.cpp:285] Closing status
>> update streams for framework 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> I0129 01:47:07.343926 31204 status_update_manager.cpp:531] Cleaning up
>> status update stream for task b7415522-fc0f-44d8-b4cb-f4105d7c9458 of
>> framework 525fac7c-347d-4045-bbe0-8853f70b0864-0000
>> W0129 01:47:07.345073 31206 containerizer.cpp:2107] Attempted to destroy
>> unknown container 0e304f6c-2f13-48a2-9e31-fb31e77836ae.1d1fcfd9-5aaf-
>> 493b-b237-259eaf882577
>> I0129 01:47:07.346194 31182 slave.cpp:801] Agent terminating
>> I0129 01:47:07.346673 31212 master.cpp:1263] Agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 at slave(588)@172.17.0.2:52605
>> (528a1b6e6459) disconnected
>> I0129 01:47:07.347143 31212 master.cpp:3117] Disconnecting agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 at slave(588)@172.17.0.2:52605
>> (528a1b6e6459)
>> I0129 01:47:07.347982 31212 master.cpp:3136] Deactivating agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 at slave(588)@172.17.0.2:52605
>> (528a1b6e6459)
>> I0129 01:47:07.348242 31216 hierarchical.cpp:577] Agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0 deactivated
>> I0129 01:47:07.353482 31182 master.cpp:1102] Master terminating
>> I0129 01:47:07.354373 31208 hierarchical.cpp:510] Removed agent
>> 525fac7c-347d-4045-bbe0-8853f70b0864-S0
>> [       OK ] ContentType/AgentAPITest.LaunchNestedContainerSessionWithTTY/0
>> (606 ms)
>> [ RUN      ] ContentType/AgentAPITest.LaunchNestedContainerSessionWi
>> thTTY/1
>> I0129 01:47:07.362337 31182 cluster.cpp:160] Creating default 'local'
>> authorizer
>> I0129 01:47:07.365028 31212 master.cpp:383] Master
>> 15845ec6-f7a4-4449-9c90-2314a16d7199 (528a1b6e6459) started on
>> 172.17.0.2:52605
>> I0129 01:47:07.365053 31212 master.cpp:385] Flags at startup: --acls=""
>> --agent_ping_timeout="15secs" --agent_reregister_timeout="10mins"
>> --allocation_interval="1secs" --allocator="HierarchicalDRF"
>> --authenticate_agents="true" --authenticate_frameworks="true"
>> --authenticate_http_frameworks="true" --authenticate_http_readonly="true"
>> --authenticate_http_readwrite="true" --authenticators="crammd5"
>> --authorizers="local" --credentials="/tmp/nrZwOH/credentials"
>> --framework_sorter="drf" --help="false" --hostname_lookup="true"
>> --http_authenticators="basic" --http_framework_authenticators="basic"
>> --initialize_driver_logging="true" --log_auto_initialize="true"
>> --logbufsecs="0" --logging_level="INFO" --max_agent_ping_timeouts="5"
>> --max_completed_frameworks="50" --max_completed_tasks_per_framework="1000"
>> --max_unreachable_tasks_per_framework="1000" --quiet="false"
>> --recovery_agent_removal_limit="100%" --registry="in_memory"
>> --registry_fetch_timeout="1mins" --registry_gc_interval="15mins"
>> --registry_max_agent_age="2weeks" --registry_max_agent_count="102400"
>> --registry_store_timeout="100secs" --registry_strict="false"
>> --root_submissions="true" --user_sorter="drf" --version="false"
>> --webui_dir="/mesos/mesos-1.2.0/_inst/share/mesos/webui"
>> --work_dir="/tmp/nrZwOH/master" --zk_session_timeout="10secs"
>> I0129 01:47:07.365435 31212 master.cpp:435] Master only allowing
>> authenticated frameworks to register
>> I0129 01:47:07.365449 31212 master.cpp:449] Master only allowing
>> authenticated agents to register
>> I0129 01:47:07.365455 31212 master.cpp:462] Master only allowing
>> authenticated HTTP frameworks to register
>> I0129 01:47:07.365463 31212 credentials.hpp:37] Loading credentials for
>> authentication from '/tmp/nrZwOH/credentials'
>> I0129 01:47:07.365718 31212 master.cpp:507] Using default 'crammd5'
>> authenticator
>> I0129 01:47:07.365849 31212 http.cpp:922] Using default 'basic' HTTP
>> authenticator for realm 'mesos-master-readonly'
>> I0129 01:47:07.365962 31212 http.cpp:922] Using default 'basic' HTTP
>> authenticator for realm 'mesos-master-readwrite'
>> I0129 01:47:07.366044 31212 http.cpp:922] Using default 'basic' HTTP
>> authenticator for realm 'mesos-master-scheduler'
>> I0129 01:47:07.366122 31212 master.cpp:587] Authorization enabled
>> I0129 01:47:07.367403 31215 hierarchical.cpp:151] Initialized
>> hierarchical allocator process
>> I0129 01:47:07.367498 31210 whitelist_watcher.cpp:77] No whitelist given
>> I0129 01:47:07.369630 31208 master.cpp:2123] Elected as the leading
>> master!
>> I0129 01:47:07.369664 31208 master.cpp:1645] Recovering from registrar
>> I0129 01:47:07.369947 31208 registrar.cpp:329] Recovering registrar
>> I0129 01:47:07.371074 31208 registrar.cpp:362] Successfully fetched the
>> registry (0B) in 1.083648ms
>> I0129 01:47:07.371201 31208 registrar.cpp:461] Applied 1 operations in
>> 25648ns; attempting to update the registry
>> I0129 01:47:07.372354 31215 registrar.cpp:506] Successfully updated the
>> registry in 1.056768ms
>> I0129 01:47:07.372536 31215 registrar.cpp:392] Successfully recovered
>> registrar
>> I0129 01:47:07.373064 31204 master.cpp:1761] Recovered 0 agents from the
>> registry (129B); allowing 10mins for agents to re-register
>> I0129 01:47:07.373103 31213 hierarchical.cpp:178] Skipping recovery of
>> hierarchical allocator: nothing to recover
>> Build timed out (after 300 minutes). Marking the build as failed.
>> Build was aborted
>> ++ docker rmi mesos-1485652063-29707
>> Error response from daemon: conflict: unable to remove repository
>> reference "mesos-1485652063-29707" (must force) - container 528a1b6e6459 is
>> using its referenced image 26aae01ba559
>>
>>
>>