Posted to issues@spark.apache.org by "Thomas Graves (Jira)" <ji...@apache.org> on 2021/09/23 18:31:00 UTC
[jira] [Reopened] (SPARK-35672) Spark fails to launch executors with very large user classpath lists on YARN
[ https://issues.apache.org/jira/browse/SPARK-35672?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Thomas Graves reopened SPARK-35672:
-----------------------------------
> Spark fails to launch executors with very large user classpath lists on YARN
> ----------------------------------------------------------------------------
>
> Key: SPARK-35672
> URL: https://issues.apache.org/jira/browse/SPARK-35672
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, YARN
> Affects Versions: 3.1.2
> Environment: Linux RHEL7
> Spark 3.1.1
> Reporter: Erik Krogen
> Assignee: Erik Krogen
> Priority: Major
> Fix For: 3.2.0, 3.1.3
>
>
> When running Spark on YARN, the {{user-class-path}} argument to {{CoarseGrainedExecutorBackend}} is used to pass a list of user JAR URIs to executor processes. The argument is specified once for each JAR, and the URIs are fully-qualified, so the paths can be quite long. With large user JAR lists (say 1000+), this can result in system-level argument length limits being exceeded, typically manifesting as the error message:
> {code}
> /bin/bash: Argument list too long
> {code}
> A [Google search|https://www.google.com/search?q=spark%20%22%2Fbin%2Fbash%3A%20argument%20list%20too%20long%22&oq=spark%20%22%2Fbin%2Fbash%3A%20argument%20list%20too%20long%22] indicates that this is not a theoretical problem and afflicts real users, including ours. This issue was originally observed on Spark 2.3, but has been confirmed to exist in the master branch as well.
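The failure mode described above can be illustrated with a short sketch that approximates the argv byte budget consumed by per-JAR {{--user-class-path}} arguments and compares it to the kernel's exec() limit. This is a minimal illustration only; the URI pattern, JAR count, and helper name are hypothetical and not taken from Spark's source.

```python
import os

def command_line_length(jar_uris):
    """Approximate the argv bytes contributed by per-JAR
    --user-class-path flag/value pairs (one NUL terminator each)."""
    total = 0
    for uri in jar_uris:
        # each JAR adds two argv entries: the flag and the URI
        total += len("--user-class-path") + 1 + len(uri) + 1
    return total

# Hypothetical fully-qualified URIs of the kind YARN would pass
jars = ["hdfs://namenode:8020/user/app/libs/dependency-%04d.jar" % i
        for i in range(2000)]
needed = command_line_length(jars)
limit = os.sysconf("SC_ARG_MAX")  # kernel limit on exec() argument bytes
print(f"classpath args: {needed} bytes, ARG_MAX: {limit} bytes")
```

With a few thousand long URIs the classpath arguments alone can run into hundreds of kilobytes, which (together with the rest of the launch command and the environment) can exceed the system limit and produce the bash error shown above.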
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org