Posted to issues@spark.apache.org by "Farhan Khan (Jira)" <ji...@apache.org> on 2020/08/21 00:59:00 UTC

[jira] [Created] (SPARK-32675) --py-files option is appended without passing value for it

Farhan Khan created SPARK-32675:
-----------------------------------

             Summary: --py-files option is appended without passing value for it
                 Key: SPARK-32675
                 URL: https://issues.apache.org/jira/browse/SPARK-32675
             Project: Spark
          Issue Type: Bug
          Components: Mesos
    Affects Versions: 3.0.0
            Reporter: Farhan Khan


When an application is submitted to a Mesos cluster in cluster mode through the REST Submission API, the dispatcher appends the --py-files option in a hardcoded manner without passing a value for it. This causes even a simple Java-based SparkPi job to fail.

This bug was introduced by SPARK-26466.
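
To illustrate the failure mode, here is a minimal, self-contained Scala sketch. The object and method names (PyFilesShiftSketch, buildOptions, appendWhenEmpty) are illustrative, not the actual MesosClusterScheduler code, and the mechanism assumed here is that the flag's empty value is dropped when the driver command is rendered as a single shell string, so --py-files consumes the next token and every later argument shifts by one:
{code:scala}
// Sketch of the argument shift; names are illustrative, not Spark source.
object PyFilesShiftSketch {

  // Build the spark-submit options the dispatcher would pass to the driver.
  def buildOptions(pyFiles: Seq[String], appendWhenEmpty: Boolean): Seq[String] = {
    val base = Seq(
      "--master", "mesos://localhost:5050",
      "--class", "org.apache.spark.examples.SparkPi")

    val pyFilesOpt =
      if (pyFiles.nonEmpty) {
        Seq("--py-files", pyFiles.mkString(","))
      } else if (appendWhenEmpty) {
        // Buggy variant: the flag is still added, with an empty value.
        Seq("--py-files", "")
      } else {
        // Fixed variant: no flag at all when there is nothing to pass.
        Seq.empty[String]
      }

    base ++ pyFilesOpt ++ Seq(
      "--conf", "spark.driver.memory=512m",
      "spark-examples_2.12-3.0.0.jar", "30")
  }

  def main(args: Array[String]): Unit = {
    // Rendering the options as one shell string drops the empty value, so
    // "--py-files" swallows the next token ("--conf") and all later
    // arguments shift by one position.
    def render(opts: Seq[String]): String =
      ("spark-submit" +: opts.filter(_.nonEmpty)).mkString(" ")

    println("buggy: " + render(buildOptions(Seq.empty, appendWhenEmpty = true)))
    println("fixed: " + render(buildOptions(Seq.empty, appendWhenEmpty = false)))
  }
}
{code}
Running the sketch prints a "buggy" command in which --py-files swallows --conf, which would leave spark.driver.memory=512m to be parsed as the application jar, consistent with the driver output shown further below.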

Here is the example job submission:
{code:bash}
curl -X POST http://localhost:7077/v1/submissions/create --header "Content-Type:application/json" --data '{
  "action": "CreateSubmissionRequest",
  "appResource": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
  "clientSparkVersion": "3.0.0",
  "appArgs": ["30"],
  "environmentVariables": {},
  "mainClass": "org.apache.spark.examples.SparkPi",
  "sparkProperties": {
    "spark.jars": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
    "spark.driver.supervise": "false",
    "spark.executor.memory": "512m",
    "spark.driver.memory": "512m",
    "spark.submit.deployMode": "cluster",
    "spark.app.name": "SparkPi",
    "spark.master": "mesos://localhost:5050"
  }
}'
{code}
The driver output in the Mesos sandbox then contains the following; because the arguments have shifted, spark.driver.memory=512m is treated as a local jar and the main class can no longer be loaded:
{code}
20/08/20 20:19:57 WARN DependencyUtils: Local jar /var/lib/mesos/slaves/e6779377-08ec-4765-9bfc-d27082fbcfa1-S0/frameworks/e6779377-08ec-4765-9bfc-d27082fbcfa1-0000/executors/driver-20200820201954-0002/runs/d9d734e8-a299-4d87-8f33-b134c65c422b/spark.driver.memory=512m does not exist, skipping.
Error: Failed to load class org.apache.spark.examples.SparkPi.
20/08/20 20:19:57 INFO ShutdownHookManager: Shutdown hook called
{code}


