Posted to user@spark.apache.org by Pankhuri Gupta <pa...@umich.edu> on 2013/11/20 16:26:52 UTC

Problem building and publishing Spark 0.8.0 incubator - java command gets killed

Hi,
	I am new to Spark and Scala. As part of one of my projects, I am trying to build and locally publish spark-0.8.0-incubating on an Amazon EC2 cluster.
	After setting up all the Java classpaths and options, I run one of:
	**	sbt/sbt compile
	**	sbt/sbt assembly
	**	sbt/sbt publish-local
	The command runs for some time (approximately 10 minutes), and then the java command is simply killed; no error message is printed. Here is a small snapshot of the messages:

	[info] Updating {file:/home/ec2-user/spark-0.8.0-incubating/}bagel...
	[info] Resolving cglib#cglib-nodep;2.2.2 …
	[info] Done updating.
	[info] Compiling 258 Scala sources and 16 Java sources to /home/ec2-user/spark-0.8.0-incubating/core/target/scala-2.9.3/classes...
	sbt/sbt: line 30: 21454 Killed java -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=256m $EXTRA_ARGS $SBT_OPTS -jar "$SPARK_HOME"/sbt/sbt-launch-*.jar "$@"

	When I run jstat, I get the following output (S0/S1, E, O, and P are the survivor, eden, old, and permanent generation utilizations in percent; LGCC and GCC are the last and current GC causes):
	Timestamp     S0      S1       E       O      P    YGC    YGCT  FGC    FGCT     GCT  LGCC                       GCC
	    257.3  100.00    0.00   50.66   82.97  99.59  156   5.163   10  12.076  17.239  unknown GCCause            No GC
	    365.2    0.00  100.00   53.13   88.96  99.94  157   7.563   10  12.076  19.639  unknown GCCause            No GC
	    386.6    0.00    0.00    1.97   60.00  99.51  157   7.563   11  25.281  32.844  Permanent Generation Full  No GC
	    407.9    0.00    0.00   29.53   60.00  99.97  157   7.563   11  25.281  32.844  Permanent Generation Full  No GC
	    578.8   64.82    0.00   41.90   77.75  99.68  162  10.896   11  25.281  36.178  unknown GCCause            No GC
	    600.1   64.82    0.00   91.90   77.75  99.72  162  10.896   11  25.281  36.178  unknown GCCause            No GC
	    664.2   77.92   70.32  100.00   99.94  99.71  168  12.451   11  25.281  37.732  unknown GCCause            Allocation Failure
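
	(For reference, I am sampling this roughly as follows; 21454 is the PID of the java process from the log above, and the 10-second interval is just what I happened to use:)

	jstat -gccause -t 21454 10s

	The P column sitting near 100% is what prompted me to raise MaxPermSize, as described below.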

	I changed the memory limits to -XX:MaxPermSize=720m and -XX:ReservedCodeCacheSize=512m, but the problem still persists.
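
	(I pass the new limits through the environment rather than editing the script; the launch line above suggests sbt/sbt picks up SBT_OPTS, so roughly:)

	export SBT_OPTS="-XX:MaxPermSize=720m -XX:ReservedCodeCacheSize=512m"
	sbt/sbt assembly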
	Even so, I cannot figure out why the command is getting killed. Please let me know if there are other checks I should run. I have read through many links via Google and on the Spark site as well, but could not get any insight into this problem.

	I am using the following:
	Java version: 6
	JVM: 1.6.0-openjdk.x86_64
	Scala version: 2.9.3

	Any help would be deeply appreciated.

Thanks,

Re: Problem building and publishing Spark 0.8.0 incubator - java command gets killed

Posted by Pankhuri Gupta <pa...@umich.edu>.
Thanks for the help.
	I will try deploying Spark on a larger instance and then get back.

Best,
Pankhuri

Re: Problem building and publishing Spark 0.8.0 incubator - java command gets killed

Posted by Prashant Sharma <sc...@gmail.com>.
You mean t1.micro? The RAM is less than 1 GB (615 MB) on those instances, so it
will not build. The size you are referring to is probably the storage size,
not the RAM. It is probably not worth trying out Spark on such instances.
However, if you plan to upgrade, choose at least m1.large instances, and then
build on just one node and deploy the build to every other node, roughly as
sketched below.
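
A minimal sketch of the build-once-and-copy idea; the hostnames and paths here are placeholders:

    # build the assembly once on the build node
    sbt/sbt assembly
    # then push the finished tree to each worker
    for host in worker1 worker2 worker3; do
      rsync -az ~/spark-0.8.0-incubating/ ec2-user@$host:~/spark-0.8.0-incubating/
    done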

HTH


-- 
s

Re: Problem building and publishing Spark 0.8.0 incubator - java command gets killed

Posted by Pankhuri Gupta <pa...@umich.edu>.
The instance type is "ti.micro", with size 7.9 GB, of which 4.3 GB is still available.
For running Spark (and later Hadoop), should I use a storage-optimized instance, or can it work on this as well?


Re: Problem building and publishing Spark 0.8.0 incubator - java command gets killed

Posted by Prashant Sharma <sc...@gmail.com>.
What is the instance type? Use an instance with at least 4 GB of RAM; I don't
think it is possible to build on less than that. The other option would be to
use a prebuilt binary.
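
If the java process dies with no error of its own, it is most likely the
kernel's OOM killer reaping it; on most Linux systems you can confirm that
with something like:

    dmesg | grep -i 'killed process'

And if you would rather skip the build entirely, the prebuilt binary can be
fetched and unpacked along these lines (the archive URL is my best guess and
may need adjusting for your Hadoop version):

    wget https://archive.apache.org/dist/incubator/spark/spark-0.8.0-incubating/spark-0.8.0-incubating-bin-hadoop1.tgz
    tar -xzf spark-0.8.0-incubating-bin-hadoop1.tgz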


-- 
s