Posted to issues@spark.apache.org by "Rob Russo (Jira)" <ji...@apache.org> on 2020/10/21 04:57:00 UTC

[jira] [Commented] (SPARK-32675) --py-files option is appended without passing value for it

    [ https://issues.apache.org/jira/browse/SPARK-32675?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17218109#comment-17218109 ] 

Rob Russo commented on SPARK-32675:
-----------------------------------

I see this is targeted for 3.1.0, but is it also going into 3.0.2? This completely breaks the Mesos dispatcher service on 3.0.x for anyone trying to upgrade; we just had to hunt down the same issue ourselves.
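For anyone else hunting this down: the failure comes from the dispatcher appending --py-files even when no value is available, so the token that follows gets consumed as its value and everything after it shifts by one. Here is a minimal, hypothetical Python sketch of that parsing misalignment (the parser and option names are illustrative only, not Spark's actual code):

```python
def parse_args(argv):
    """Toy option parser: value-taking flags consume the next token."""
    opts, positional = {}, []
    it = iter(argv)
    for tok in it:
        if tok in ("--py-files", "--conf", "--class"):
            # A value-taking flag swallows whatever comes next,
            # even if that token was meant to be another option.
            opts.setdefault(tok, []).append(next(it, None))
        else:
            positional.append(tok)
    return opts, positional

# Correct submission: --py-files is simply omitted when there is
# nothing to pass.
good = ["--class", "org.apache.spark.examples.SparkPi",
        "--conf", "spark.driver.memory=512m"]

# Buggy submission: --py-files is appended with no value, so it eats
# the following "--conf", and "spark.driver.memory=512m" falls through
# as a stray positional argument -- which mirrors the
# "Local jar ... spark.driver.memory=512m does not exist" warning in
# the driver log below.
bad = ["--class", "org.apache.spark.examples.SparkPi",
       "--py-files",
       "--conf", "spark.driver.memory=512m"]

print(parse_args(good))
print(parse_args(bad))
```

With the good argument list everything lands in opts; with the bad one, --py-files captures "--conf" and the memory setting leaks into the positional arguments, so the driver never sees its main class correctly.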

> --py-files option is appended without passing value for it
> ----------------------------------------------------------
>
>                 Key: SPARK-32675
>                 URL: https://issues.apache.org/jira/browse/SPARK-32675
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 3.0.0
>            Reporter: Farhan Khan
>            Assignee: Farhan Khan
>            Priority: Major
>             Fix For: 3.1.0
>
>
> A submitted application passes the --py-files option in a hardcoded manner when launched on a Mesos cluster in cluster mode via the REST Submission API, causing a simple Java-based SparkPi job to fail.
> This bug was introduced by SPARK-26466.
> Here is the example job submission:
> {code:bash}
> curl -X POST http://localhost:7077/v1/submissions/create --header "Content-Type:application/json" --data '{
> "action": "CreateSubmissionRequest",
> "appResource": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
> "clientSparkVersion": "3.0.0",
> "appArgs": ["30"],
> "environmentVariables": {},
> "mainClass": "org.apache.spark.examples.SparkPi",
> "sparkProperties": {
>   "spark.jars": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
>   "spark.driver.supervise": "false",
>   "spark.executor.memory": "512m",
>   "spark.driver.memory": "512m",
>   "spark.submit.deployMode": "cluster",
>   "spark.app.name": "SparkPi",
>   "spark.master": "mesos://localhost:5050"
> }}'
> {code}
> Expected Driver log would contain:
> {code:bash}
> 20/08/20 20:19:57 WARN DependencyUtils: Local jar /var/lib/mesos/slaves/e6779377-08ec-4765-9bfc-d27082fbcfa1-S0/frameworks/e6779377-08ec-4765-9bfc-d27082fbcfa1-0000/executors/driver-20200820201954-0002/runs/d9d734e8-a299-4d87-8f33-b134c65c422b/spark.driver.memory=512m does not exist, skipping.
> Error: Failed to load class org.apache.spark.examples.SparkPi.
> 20/08/20 20:19:57 INFO ShutdownHookManager: Shutdown hook called
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org