Posted to issues@spark.apache.org by "Masayoshi TSUZUKI (JIRA)" <ji...@apache.org> on 2015/04/02 08:36:52 UTC

[jira] [Created] (SPARK-6673) spark-shell.cmd can't start even when Spark was built on Windows

Masayoshi TSUZUKI created SPARK-6673:
----------------------------------------

             Summary: spark-shell.cmd can't start even when Spark was built on Windows
                 Key: SPARK-6673
                 URL: https://issues.apache.org/jira/browse/SPARK-6673
             Project: Spark
          Issue Type: Bug
          Components: Windows
    Affects Versions: 1.3.0
            Reporter: Masayoshi TSUZUKI


spark-shell.cmd can't start.

{code}
bin\spark-shell.cmd --master local
{code}
will get
{code}
Failed to find Spark assembly JAR.
You need to build Spark before running this program.
{code}
even when we have built Spark.

This is because the environment variable {{SPARK_SCALA_VERSION}}, which is used in {{spark-class2.cmd}}, is never set.
In the Linux scripts, {{load-spark-env.sh}} sets this value to {{2.10}} or {{2.11}} by default, but there is no equivalent script for Windows.
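
A Windows-side fix could detect the Scala version the same way. The following is only a rough sketch of what such a script might look like; the file name {{load-spark-env.cmd}}, the assembly paths, and the detection logic are assumptions modeled on how {{load-spark-env.sh}} behaves, not the actual patch:
{code}
@echo off
rem Sketch of a hypothetical load-spark-env.cmd (name and paths assumed).
rem If SPARK_SCALA_VERSION is already set, leave it alone; otherwise infer
rem it from which assembly output directory exists after the build.
if not "%SPARK_SCALA_VERSION%"=="" goto :eof

if exist "%SPARK_HOME%\assembly\target\scala-2.11" (
  set SPARK_SCALA_VERSION=2.11
) else (
  set SPARK_SCALA_VERSION=2.10
)
{code}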

As a workaround, executing
{code}
set SPARK_SCALA_VERSION=2.10
{code}
before running spark-shell.cmd lets it start successfully.
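
For example, the full sequence in a command prompt, run from the top of a Spark checkout built against Scala 2.10, would be:
{code}
set SPARK_SCALA_VERSION=2.10
bin\spark-shell.cmd --master local
{code}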



