Posted to user@spark.apache.org by Talha Javed <im...@gmail.com> on 2021/05/11 20:21:22 UTC

Installation Error - Please Help!

Hello Team!
Hope you are doing well.

I have downloaded Apache Spark (spark-3.1.1-bin-hadoop2.7), and I have
downloaded the winutils file from GitHub as well.
Python version: 3.9.4
Java version: "1.8.0_291"
Java(TM) SE Runtime Environment (build 1.8.0_291-b10)
Java HotSpot(TM) 64-Bit Server VM (build 25.291-b10, mixed mode)

When I enter the command spark-shell in cmd, it gives me this error:
"'spark-shell' is not recognized as an internal or external command,
operable program or batch file."
I am sharing screenshots of my environment variables. Please help me; I
am stuck now.

I am looking forward to hearing from you.
Thanks & Regards

Re: Installation Error - Please Help!

Posted by Sean Owen <sr...@gmail.com>.
spark-shell is not on your path. Give the full path to it.
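Concretely, that means either invoking the script by its full path or adding Spark's bin directory to PATH. A minimal sketch for cmd, assuming Spark was extracted to C:\spark\spark-3.1.1-bin-hadoop2.7 (a hypothetical location; substitute the actual extraction folder):

```shell
:: Option 1: run spark-shell by its full path
:: (C:\spark\... is a hypothetical install dir, adjust as needed)
C:\spark\spark-3.1.1-bin-hadoop2.7\bin\spark-shell

:: Option 2: set SPARK_HOME and extend PATH for the current cmd session
set SPARK_HOME=C:\spark\spark-3.1.1-bin-hadoop2.7
set PATH=%PATH%;%SPARK_HOME%\bin
spark-shell

:: Verify cmd can now resolve the command
where spark-shell
```

Note that `set` only affects the current cmd window; to make the change permanent, add SPARK_HOME and the bin entry under System Properties > Environment Variables instead.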
