Posted to issues@spark.apache.org by "Taras (Jira)" <ji...@apache.org> on 2022/04/06 16:57:00 UTC

[jira] [Commented] (SPARK-38808) Windows, Spark 3.2.1: spark-shell command throwing this error: SparkContext: Error initializing SparkContext

    [ https://issues.apache.org/jira/browse/SPARK-38808?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17518308#comment-17518308 ] 

Taras commented on SPARK-38808:
-------------------------------

An explanation of the bug and one workaround can be found at the link below:

https://stackoverflow.com/questions/69669524/spark-illegal-character-in-path/70159638#70159638

> Windows, Spark 3.2.1: spark-shell command throwing this error: SparkContext: Error initializing SparkContext
> ------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-38808
>                 URL: https://issues.apache.org/jira/browse/SPARK-38808
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 3.2.1
>            Reporter: Taras
>            Priority: Major
>
> Can't start spark-shell on Windows for Spark 3.2.1. Downgrading to Spark 3.1.3 fixed the problem, but I need the Pandas API on Spark, which is only available in 3.2 and later, so downgrading is not a solution for me (a minimal sketch of that dependency follows below the quoted description).
> The bug and the workaround are also described at the link below:
> https://stackoverflow.com/questions/69923603/spark-shell-command-throwing-this-error-sparkcontext-error-initializing-sparkc
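For context on why the reporter is tied to the 3.2 line: the Pandas API on Spark (the former Koalas project) ships as pyspark.pandas starting with Spark 3.2.0 and does not exist in 3.1.x. Below is a minimal sketch of that dependency, assuming a working PySpark >= 3.2 installation with pandas and pyarrow available; the DataFrame contents are made up purely for illustration.

    # Requires PySpark 3.2 or later: pyspark.pandas is not present in 3.1.x,
    # which is why downgrading is not an option for the reporter.
    import pyspark.pandas as ps
    from pyspark.sql import SparkSession

    # Start a local session; on an affected Windows 3.2.1 install this is the
    # step that fails with "SparkContext: Error initializing SparkContext".
    spark = (SparkSession.builder
             .master("local[*]")
             .appName("pandas-on-spark-check")
             .getOrCreate())

    # Illustrative data only: a pandas-on-Spark DataFrame backed by Spark.
    psdf = ps.DataFrame({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})
    print(psdf.describe())

    spark.stop()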



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org