Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/10/18 05:14:00 UTC
[jira] [Created] (SPARK-33175) Detect duplicated mountPath and
Dongjoon Hyun created SPARK-33175:
-------------------------------------
Summary: Detect duplicated mountPath and
Key: SPARK-33175
URL: https://issues.apache.org/jira/browse/SPARK-33175
Project: Spark
Issue Type: Sub-task
Components: Kubernetes
Affects Versions: 3.1.0
Reporter: Dongjoon Hyun
If there is a mountPath conflict, the pod is still created and keeps running, repeatedly logging the following error messages.
{code}
$ k get pod -l 'spark-role in (driver,executor)'
NAME    READY   STATUS    RESTARTS   AGE
tpcds   1/1     Running   0          4m20s
{code}
{code}
20/10/18 05:09:26 WARN ExecutorPodsSnapshotsStoreImpl: Exception when notifying snapshot subscriber.
io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: POST at: https://2fba95910afd97c73b9a74acc07e0167.sk1.us-west-2.eks.amazonaws.com/api/v1/namespaces/default/pods. Message: Pod "tpcds-exec-1" is invalid: spec.containers[0].volumeMounts[1].mountPath: Invalid value: "/data1": must be unique. Received status: Status(apiVersion=v1, code=422, details=StatusDetails(causes=[StatusCause(field=spec.containers[0].volumeMounts[1].mountPath, message=Invalid value: "/data1": must be unique, reason=FieldValueInvalid, additionalProperties={})], group=null, kind=Pod, name=tpcds-exec-1, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=Pod "tpcds-exec-1" is invalid: spec.containers[0].volumeMounts[1].mountPath: Invalid value: "/data1": must be unique, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=Invalid, status=Failure, additionalProperties={}).
{code}
We had better fail fast on the Spark side instead of letting the Kubernetes API server repeatedly reject the executor pod.
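As a rough sketch of the proposed client-side check (names and placement are hypothetical, not the actual Spark patch): before building the pod spec, collect the configured mountPath values and fail with a clear error if any path appears more than once, mirroring the uniqueness rule the API server enforces on spec.containers[].volumeMounts[].mountPath.

{code}
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical helper: detect duplicated volume mountPath values
// before submitting the pod, so Spark can fail fast with a clear
// message instead of the Kubernetes 422 FieldValueInvalid loop.
public class MountPathValidator {

    // Returns every mountPath that appears more than once.
    public static List<String> findDuplicates(List<String> mountPaths) {
        return mountPaths.stream()
            .collect(Collectors.groupingBy(p -> p, Collectors.counting()))
            .entrySet().stream()
            .filter(e -> e.getValue() > 1)
            .map(Map.Entry::getKey)
            .collect(Collectors.toList());
    }

    // Throws IllegalArgumentException on any duplicated mountPath.
    public static void validate(List<String> mountPaths) {
        List<String> dups = findDuplicates(mountPaths);
        if (!dups.isEmpty()) {
            throw new IllegalArgumentException(
                "Found duplicated mountPath: " + String.join(", ", dups));
        }
    }
}
{code}

For example, validating {{List.of("/data1", "/data1")}} would raise "Found duplicated mountPath: /data1" at submission time, matching the "/data1": must be unique failure shown in the log above.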
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org