Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/10/08 05:43:00 UTC

[jira] [Resolved] (SPARK-25651) spark-shell gets wrong version of spark on windows

     [ https://issues.apache.org/jira/browse/SPARK-25651?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-25651.
----------------------------------
    Resolution: Not A Problem

> spark-shell gets wrong version of spark on windows
> --------------------------------------------------
>
>                 Key: SPARK-25651
>                 URL: https://issues.apache.org/jira/browse/SPARK-25651
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.3.2
>         Environment: Windows 10, running spark 2.3.2
>            Reporter: Nick Sutcliffe
>            Priority: Major
>
> I have multiple versions of spark on my computer, with SPARK_HOME set to a spark 2.0.2 installation.
> If I browse to the bin directory of my spark 2.3.2 installation and run spark-shell, it incorrectly uses my spark 2.0.2 installation as SPARK_HOME. Previously, spark-shell2.cmd set SPARK_HOME from the script's own location as follows (verified in spark 2.0.2 and spark 2.2.0):
> `set SPARK_HOME=%~dp0..`
> This line is not present in spark 2.3.2; instead the script calls find-spark-home.cmd, which appears to incorrectly keep the pre-existing SPARK_HOME environment variable when one is already set. A minimal sketch of the two resolution strategies follows below.
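> For illustration only, here is a minimal sketch of the two behaviors (not the actual Spark scripts; the guard condition in the second snippet is an assumption about what find-spark-home.cmd effectively does):
>
>     rem Pre-2.3 behavior: always derive SPARK_HOME from this script's
>     rem own directory (%~dp0 expands to the script's drive and path),
>     rem so the shell matches the installation it was launched from.
>     set SPARK_HOME=%~dp0..
>
>     rem Suspected 2.3.2 behavior (assumption): only derive SPARK_HOME
>     rem when it is unset, so a stale value (here the 2.0.2 install)
>     rem wins over the installation actually being run.
>     if "x%SPARK_HOME%"=="x" set SPARK_HOME=%~dp0..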



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org