Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/08/20 10:18:46 UTC

[jira] [Resolved] (SPARK-10131) running spark job in docker by mesos-slave

     [ https://issues.apache.org/jira/browse/SPARK-10131?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-10131.
-------------------------------
    Resolution: Invalid

Do you mind asking this at user@spark.apache.org? As far as I can tell, you're asking for assistance in understanding the problem rather than reporting a specific problem attributable to Spark. Please see https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark.

> running spark job in docker by mesos-slave
> ------------------------------------------
>
>                 Key: SPARK-10131
>                 URL: https://issues.apache.org/jira/browse/SPARK-10131
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.4.1
>         Environment: docker 1.8.1
> mesos 0.23.0
> spark 1.4.1
>            Reporter: Stream Liu
>
> I am trying to run a Spark job in Docker via mesos-slave,
> but I always get this ERROR in mesos-slave:
> E0820 07:46:08.780293     9 slave.cpp:1643] Failed to update resources for container f2aeb5ee-2419-430c-be7d-8276947b909a of executor '20150820-064813-1684252864-5050-1-S0' of framework 20150820-064813-1684252864-5050-1-0004, destroying container: Failed to determine cgroup for the 'cpu' subsystem: Failed to read /proc/13071/cgroup: Failed to open file '/proc/13071/cgroup': No such file or directory
> The target container does start, but it always exits with exit code 137.
> I think this could be caused by cgroups?
> The job works fine without --containerizers=docker.
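
For context, below is a minimal sketch of how a Spark 1.4.x driver is typically pointed at Docker-backed executors on Mesos. The master URL and Docker image name are assumptions for illustration, and the Mesos agent is assumed to have been started with the Docker containerizer enabled (for example, mesos-slave --containerizers=docker,mesos); this is a sketch of the setup being described, not the reporter's exact configuration.

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal sketch of pointing a Spark 1.4.x driver at Docker-backed Mesos
    // executors. The host name and image tag below are hypothetical; the Mesos
    // agent must additionally be running with --containerizers=docker,mesos.
    object DockerOnMesosSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("docker-on-mesos-sketch")
          .setMaster("mesos://mesos-master.example.com:5050")               // hypothetical master URL
          .set("spark.mesos.executor.docker.image", "example/spark:1.4.1")  // hypothetical image name
          .set("spark.executor.memory", "1g")

        val sc = new SparkContext(conf)
        // Trivial job just to exercise the Docker-ized executors.
        println(sc.parallelize(1 to 1000).count())
        sc.stop()
      }
    }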



