Posted to user@spark.apache.org by Kal El <pi...@yahoo.com> on 2014/01/17 13:04:10 UTC

cannot run sbt/sbt assembly

Hello,

I have tried to assemble Spark (sbt/sbt assembly) with different versions of Java (OpenJDK, Sun HotSpot) on an ARM v7 Cortex-A15 Samsung Exynos SoC, and I got the following error:

# A fatal error has been detected by the Java Runtime Environment:
#
#  Internal Error (os_linux_zero.cpp:285), pid=3039, tid=50648176
#  fatal error: caught unhandled signal 11
#
# JRE version: 7.0_21-b02
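
A hedged aside for anyone hitting the same crash: the source file named in the Internal Error line, os_linux_zero.cpp, belongs to the Zero (interpreter-only, no-JIT) HotSpot port that many ARM OpenJDK packages ship, so the crash is coming from that port rather than a regular JIT-compiled HotSpot. A quick way to confirm is to grep the error header of the hs_err log; the snippet below recreates the header shown above so it is self-contained (in practice you would grep your real hs_err_pid*.log file):

```shell
# Recreate the crash-log header shown above (stand-in for hs_err_pid*.log).
cat > /tmp/hs_err_head.txt <<'EOF'
#  Internal Error (os_linux_zero.cpp:285), pid=3039, tid=50648176
#  fatal error: caught unhandled signal 11
EOF
# "zero" in the crashing source file indicates the Zero (no-JIT) HotSpot port.
grep -o 'os_linux_zero' /tmp/hs_err_head.txt
```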

I have attached the log from one of the attempts.
Can anyone figure out why this is happening?

P.S.: I am using Scala 2.9.3, and I have used Spark on an x86 machine before, so this is not the first time I am setting up Spark.

Thanks,
Alex

Re: cannot run sbt/sbt assembly

Posted by Nicolas Seyvet <se...@yahoo.com>.
Use Scala 2.9.2. From what I read, 2.9.3 is not supported.
You might also want to try a later JDK, e.g. 7.0_51.
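
A minimal sketch of pinning the build to Scala 2.9.2, assuming a Spark-0.8.x-era checkout where the Scala version is hard-coded in project/SparkBuild.scala (the file path and the exact setting line are assumptions; check your own tree). The snippet edits a stand-in file so it is self-contained:

```shell
# Hypothetical sketch: switch the build's Scala version from 2.9.3 to 2.9.2.
# /tmp/SparkBuild.scala stands in for project/SparkBuild.scala in a checkout;
# the setting's name and location are assumptions about 0.8.x-era builds.
cat > /tmp/SparkBuild.scala <<'EOF'
def scalaVersion = "2.9.3"
EOF
sed -i 's/"2\.9\.3"/"2.9.2"/' /tmp/SparkBuild.scala
grep scalaVersion /tmp/SparkBuild.scala
```

After the sed, the file reads `def scalaVersion = "2.9.2"`; re-running sbt/sbt assembly would then build against 2.9.2.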



On Friday, January 17, 2014 1:07 PM, Kal El <pi...@yahoo.com> wrote: