Posted to user@spark.apache.org by emlyn <em...@swiftkey.com> on 2015/10/23 13:12:10 UTC

Re: Cannot start REPL shell since 1.4.0

xjlin0 wrote
> I cannot enter the REPL shell in 1.4.0/1.4.1/1.5.0/1.5.1 (whether pre-built
> with or without Hadoop, or compiled at home with ant or maven). In v1.4.x
> there was no error message; the system prompted nothing. On v1.5.x, once I
> enter $SPARK_HOME/bin/pyspark or spark-shell, I get:
> 
> Error: Could not find or load main class org.apache.spark.launcher.Main

I have the same problem (on OS X Yosemite, with every Spark version since 1.4,
installed both with Homebrew and downloaded manually). I've been trying to
start the pyspark shell, but spark-shell, spark-sql and spark-submit all fail
in the same way. I've narrowed it down to the following line in the
spark-class script:

done < <("$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main
"$@")

(where $RUNNER is "java" and $LAUNCH_CLASSPATH is
"/usr/local/Cellar/apache-spark/1.5.1/libexec/lib/spark-assembly-1.5.1-hadoop2.6.0.jar",
which does exist and does contain the org.apache.spark.launcher.Main class,
despite the message that it can't be found)
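
For reference, the surrounding code in spark-class reads the launcher's
NUL-delimited output into an argument array and then execs it. This is
roughly what the 1.5.x script does (paraphrased from memory, so treat it as
a sketch rather than an exact copy):

CMD=()
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")   # each NUL-terminated token becomes one argv element
done < <("$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@")
exec "${CMD[@]}"  # replace the shell with the command the launcher printed

So if either $LAUNCH_CLASSPATH or the launcher's output is corrupted in any
way, this is where the failure shows up.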

If I run it manually, using:

SPARK_HOME=/usr/local/Cellar/apache-spark/1.5.1/libexec java -cp
/usr/local/Cellar/apache-spark/1.5.1/libexec/lib/spark-assembly-1.5.1-hadoop2.6.0.jar
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit
pyspark-shell-main --name PySparkShell

It runs without that error, and instead prints the following (where "\0"
represents a NUL character):

env\0PYSPARK_SUBMIT_ARGS="--name" "PySparkShell" "pyspark-shell"\0python\0
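
If I'm reading the launcher protocol correctly, that output is the command
spark-class is meant to exec, with arguments separated by NUL bytes, i.e. it
should end up running:

env PYSPARK_SUBMIT_ARGS="--name" "PySparkShell" "pyspark-shell" python

A quick way to inspect the launcher's output is to pipe it through tr to
turn the NULs into newlines, printing one argument per line:

SPARK_HOME=/usr/local/Cellar/apache-spark/1.5.1/libexec java -cp \
/usr/local/Cellar/apache-spark/1.5.1/libexec/lib/spark-assembly-1.5.1-hadoop2.6.0.jar \
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit \
pyspark-shell-main --name PySparkShell | tr '\0' '\n'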

I'm not really sure what to try next; maybe with this extra information
someone will have an idea of what's going wrong and how to fix it.





Re: Cannot start REPL shell since 1.4.0

Posted by Emlyn Corrin <em...@swiftkey.com>.
JAVA_HOME is unset.
I've also tried setting it with:
export JAVA_HOME=$(/usr/libexec/java_home)
which sets it to
"/Library/Java/JavaVirtualMachines/jdk1.8.0_31.jdk/Contents/Home" and I
still get the same problem.

On 23 October 2015 at 14:37, Jonathan Coveney <jc...@gmail.com> wrote:

> Do you have JAVA_HOME set to a Java 7 JDK?



Re: Cannot start REPL shell since 1.4.0

Posted by Jonathan Coveney <jc...@gmail.com>.
Do you have JAVA_HOME set to a Java 7 JDK?


Re: Cannot start REPL shell since 1.4.0

Posted by emlyn <em...@swiftkey.com>.
emlyn wrote
> I have the same problem

In case anyone else hits the same problem: it only occurred under my login,
not under a new, clean user. After some investigation I found that I had
"GREP_OPTIONS='--color=always'" in my environment, which was polluting the
output of grep with colour codes. I changed it to
"GREP_OPTIONS='--color=auto'" and now it works.
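
For anyone wondering why that breaks spark-class: the script uses grep to
pick the assembly jar out of the lib directory, and with --color=always grep
wraps every match in ANSI escape sequences even when writing to a pipe, so
the "filename" it captures is no longer a valid path. A minimal reproduction
(assuming a grep that honours GREP_OPTIONS, as GNU and BSD grep did at the
time):

printf 'spark-assembly-1.5.1-hadoop2.6.0.jar\n' \
  | GREP_OPTIONS='--color=always' grep spark-assembly \
  | od -c | head -2

The od output shows the 033 [ ... m escape bytes embedded around the match;
java then receives a classpath containing those bytes and reports that it
cannot find org.apache.spark.launcher.Main.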


