Posted to issues@spark.apache.org by "Nick Sutcliffe (JIRA)" <ji...@apache.org> on 2018/10/05 16:55:00 UTC

[jira] [Created] (SPARK-25651) spark-shell gets wrong version of spark on windows

Nick Sutcliffe created SPARK-25651:
--------------------------------------

             Summary: spark-shell gets wrong version of spark on windows
                 Key: SPARK-25651
                 URL: https://issues.apache.org/jira/browse/SPARK-25651
             Project: Spark
          Issue Type: Bug
          Components: Spark Shell
    Affects Versions: 2.3.2
         Environment: Windows 10, running spark 2.3.2
            Reporter: Nick Sutcliffe


I have multiple versions of Spark on my computer; in particular, SPARK_HOME points to a Spark 2.0.2 installation.
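For example (the directory names below are illustrative, not from my actual layout):

    rem SPARK_HOME names the older install
    set SPARK_HOME=C:\spark\spark-2.0.2-bin-hadoop2.7

    rem launch the newer install's shell from its own bin directory
    cd /d C:\spark\spark-2.3.2-bin-hadoop2.7\bin
    spark-shell

    rem Expected: the 2.3.2 shell belonging to this bin directory.
    rem Actual: the 2.0.2 install named by SPARK_HOME is used instead.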

If I browse to the bin directory of my Spark 2.3.2 installation and run spark-shell, it incorrectly starts the Spark 2.0.2 installation that SPARK_HOME points to. The cause appears to be in spark-shell2.cmd: earlier releases (verified in Spark 2.0.2 and Spark 2.2.0) set SPARK_HOME from the script's own location, as follows:

`set SPARK_HOME=%~dp0..`

However, this line is no longer present in Spark 2.3.2, so the shell falls back to whatever SPARK_HOME is already set in the environment, and the wrong version of Spark starts.
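For reference, %~dp0 in a batch file expands to the drive and directory of the script being executed, so the old line always resolved SPARK_HOME against the bin directory that was actually invoked. A minimal sketch of that pre-2.3 behavior (the comments are mine, not from the original script):

    rem %~dp0 expands to the directory containing this .cmd file,
    rem e.g. C:\spark\spark-2.3.2-bin-hadoop2.7\bin\, so %~dp0..
    rem is the root of *this* installation. Assigning it here
    rem deliberately overrides any SPARK_HOME from the environment.
    set SPARK_HOME=%~dp0..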



