Posted to user@storm.apache.org by prasad ch <ch...@outlook.com> on 2015/05/07 15:54:51 UTC

java.lang.OutOfMemoryError

HI,

When I run my Storm application, which contains one spout and one bolt, the spout reads data from files while the bolt receives the tuples and writes them to a file (just reading and writing). I ran the example with 2 GB of data, 10 files of 200 MB each, and got the error below:

java.lang.OutOfMemoryError: GC overhead limit exceeded at java.lang.reflect.Method.copy(Method.java:151) at java.lang.reflect.ReflectAccess.copyMethod(ReflectAccess.java:136) at sun.reflect.Reflect

What is the reason for this error? Does any configuration need to change in storm.yaml? By default nimbus.childopts contains -Xmx1024m.
Please help me.
What are the basic properties that need to be configured for large data, like 2 GB or more? Please help me.


Thanks!
prasad.ch

Re: java.lang.OutOfMemoryError

Posted by "Grant Overby (groverby)" <gr...@cisco.com>.
Nothing too obvious jumps out at me.

How long are the lines?
What is worker.childopts set to?

Grant Overby
Software Engineer
Cisco.com<http://www.cisco.com/>
groverby@cisco.com<ma...@cisco.com>
Mobile: 865 724 4910






Think before you print.

This email may contain confidential and privileged material for the sole use of the intended recipient. Any review, use, distribution or disclosure by others is strictly prohibited. If you are not the intended recipient (or authorized to receive for the recipient), please contact the sender by reply email and delete all copies of this message.

Please click here<http://www.cisco.com/web/about/doing_business/legal/cri/index.html> for Company Registration Information.





From: prasad ch <ch...@outlook.com>
Reply-To: "user@storm.apache.org" <us...@storm.apache.org>
Date: Thursday, May 7, 2015 at 10:37 AM
To: storm-users <us...@storm.apache.org>
Subject: RE: java.lang.OutOfMemoryError

Re: java.lang.OutOfMemoryError

Posted by "Grant Overby (groverby)" <gr...@cisco.com>.
I’d advise removing it, unless you have a specific reason for having it in there. Many things (but perhaps not Storm, I don't know) in the Hadoop stack use zero-copy NIO mechanics. -XX:+DisableExplicitGC can break them.
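For context (an illustration added here, not from the thread): in the JDKs of this era, the off-heap memory behind direct NIO buffers is only released when the owning ByteBuffer object is collected, and when direct memory runs short the JDK falls back internally on an explicit System.gc() call, which -XX:+DisableExplicitGC turns into a no-op. A minimal sketch of such a buffer:

```java
import java.nio.ByteBuffer;

public class DirectBufferDemo {
    // Allocates an off-heap (direct) buffer. Its native memory is freed only
    // when the ByteBuffer object itself is garbage-collected; the JDK's
    // fallback for reclaiming it relies on an explicit System.gc() call,
    // which -XX:+DisableExplicitGC disables.
    static ByteBuffer allocate(int bytes) {
        return ByteBuffer.allocateDirect(bytes);
    }

    public static void main(String[] args) {
        ByteBuffer buf = allocate(1024 * 1024); // 1 MiB off the Java heap
        System.out.println("direct=" + buf.isDirect() + " capacity=" + buf.capacity());
    }
}
```

With the flag set, heavy direct-buffer churn can end in "java.lang.OutOfMemoryError: Direct buffer memory" even while the Java heap itself has room.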


Re: java.lang.OutOfMemoryError

Posted by Jeff Maass <JM...@cccis.com>.
After having read some advice, purely as a defensive measure against rogue spout/bolt code. It may not be appropriate if the Storm management code itself does explicit garbage collection; I can't recall having searched through the codebase to find out whether that is the case.


Re: java.lang.OutOfMemoryError

Posted by "Grant Overby (groverby)" <gr...@cisco.com>.
Why -XX:+DisableExplicitGC ?


From: Jeff Maass <JM...@cccis.com>
Reply-To: "user@storm.apache.org" <us...@storm.apache.org>
Date: Thursday, May 7, 2015 at 11:08 AM
To: "user@storm.apache.org" <us...@storm.apache.org>
Subject: Re: java.lang.OutOfMemoryError

-XX:+DisableExplicitGC

Re: java.lang.OutOfMemoryError

Posted by Jeff Maass <JM...@cccis.com>.
http://stackoverflow.com/questions/1393486/error-java-lang-outofmemoryerror-gc-overhead-limit-exceeded
http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html#par_gc.oom
http://www.jvmhost.com/articles/what-is-java-lang-outofmemoryerror-gc-overhead-limit-exceeded
http://javaeesupportpatterns.blogspot.com/2011/08/gc-overhead-limit-exceeded-problem-and.html

This error is thrown specifically by the parallel / concurrent collectors. Being a newbie at JVMs, and having run only very, very simple code in my testing, I came to the conclusion that there might be a garbage collector more appropriate for Storm. I recently moved the workers to G1GC and have seen far fewer problems with nimbus-monitoring-executor and supervisor-monitoring-worker timeouts.

This is what I ended up setting in my storm.yaml:

worker.childopts: >
    -XX:+UseG1GC
    -Xms1024M
    -XX:MaxGCPauseMillis=20000
    -XX:+DisableExplicitGC
    -Dcom.sun.management.jmxremote
    -Dcom.sun.management.jmxremote.ssl=false
    -Dcom.sun.management.jmxremote.authenticate=false
    -Dcom.sun.management.jmxremote.port=2%ID%


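To confirm which collector a worker JVM actually ended up with (a check added for illustration, not part of the original thread), the registered collectors can be listed through the standard management beans:

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.util.ArrayList;
import java.util.List;

public class GcInfo {
    // Returns the names of the garbage collectors registered in this JVM,
    // e.g. "G1 Young Generation" / "G1 Old Generation" when -XX:+UseG1GC is active.
    static List<String> collectorNames() {
        List<String> names = new ArrayList<>();
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            names.add(gc.getName());
        }
        return names;
    }

    public static void main(String[] args) {
        System.out.println("Collectors: " + collectorNames());
    }
}
```

Logging this once from a bolt's prepare() is a cheap way to verify the childopts actually reached the worker process.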


From: prasad ch <ch...@outlook.com>
Reply-To: "user@storm.apache.org" <us...@storm.apache.org>
Date: Thursday, May 7, 2015 at 09:37
To: storm-users <us...@storm.apache.org>
Subject: RE: java.lang.OutOfMemoryError





RE: Unable to Process Tuples In Storm cluster mode
prasad ch
06-05-2015
To: storm-users

HI FYI,



please help me

> > ----------------------------------------------
> > MainApplication .java
> > ......................................
> > public class MainApplication {
> > public static void main(String[] args) throws Exception {
> > Config con = new Config();
> > con.setDebug(true);
> > int i = Integer.parseInt(args[0]);
> > con.put("fileLocation", args[1]);
> > con.put("outFile", args[2]);
> > con.put("ext", ".txt");
> > con.setNumWorkers(Integer.parseInt(args[4]));
> > TopologyBuilder builder = new TopologyBuilder();
> > builder.setSpout("FileREADING-SPOUT", new FileReadingSpout(), i);
> > builder.setBolt("TEST-BOLT", new TestBolt(),
> > i).shuffleGrouping("FileREADING-SPOUT");
> > StormSubmitter.submitTopology(args[3], con,
> > builder.createTopology());
> > }
> > }
> >
> >
> > FileReadingSpout.java
> > ..................................
> >
> >
> > public class FileReadingSpout implements IRichSpout {
> > static int count=0;
> > long windowTime;
> > SpoutOutputCollector collector;
> > TopologyContext context;
> > String fileLocation, ext;
> > @Override
> > public void open(Map conf, TopologyContext context,
> > SpoutOutputCollector collector) {
> > try {
> >
> >
> > System.out.println("********************************************************************");
> > System.out.println(" FileReadingSpout ********* "+new
> > Date()+" acker"+conf.get("TOPOLOGY_MESSAGE_TIMEOUT_SECS"));
> >
> > System.out.println("********************************************************************");
> >
> > this.context = context;
> > this.collector = collector;
> > this.fileLocation = conf.get("fileLocation").toString();
> > this.ext = conf.get("ext").toString();
> > } catch (Exception e) {
> > e.printStackTrace();
> > }
> > }
> >
> > @Override
> > public void close() {
> > }
> >
> > @Override
> > public void activate() {
> > }
> >
> > @Override
> > public void deactivate() {
> > }
> >
> > @Override
> > public void nextTuple() {
> > String line;
> > try {
> > System.out.println(Thread.currentThread().getName() + "in
> > side ReadingSpout ...*" + new Date());
> > String l = getFileNames(fileLocation, ext);
> > if (l.length() > 0) {
> > System.out.println("** list of files count:" + l);
> > File oldName = new File(l);
> > String file = l.replace(".txt", ".Reading");
> > File newName = new File(file);
> > oldName.renameTo(newName);
> > readFiles(file);
> > }
> > } catch (Exception e) {
> > e.printStackTrace();
> > }
> >
> > }
> >
> > @Override
> > public void ack(Object msgId) {
> > System.out.println(count+" In side ack ::* " + msgId);
> > }
> >
> > @Override
> > public void fail(Object msgId) {
> > }
> >
> > @Override
> > public void declareOutputFields(OutputFieldsDeclarer declarer) {
> >
> > declarer.declare(new Fields("line"));
> >
> > }
> >
> > @Override
> > public Map<String, Object> getComponentConfiguration() {
> > return null;
> > }
> >
> > public static long stringToLong(String st) {
> > long result = 0L;
> > try {
> > if (st != null) {
> > SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd
> > HH:mm:ss");
> > Date d = sdf.parse(st);
> > result = d.getTime();
> > }
> > } catch (Exception e) {
> > e.printStackTrace();
> > }
> > return result;
> > }
> >
> > public void readFiles(String fileName) {
> > String data = null;
> > String arr[];
> > BufferedReader reader = null;
> > try {
> > reader = new BufferedReader(new FileReader(fileName));
> >
> > System.out.println("=====================*====================");
> > System.out.println(" current reading fileName is :" +
> > fileName);
> > while ((data = reader.readLine()) != null && data.length()>0) {
> > count++;
> > collector.emit(new Values(data),data);
> > }
> >
> > } catch (IOException e) {
> > System.out.println(" file Not found :" + e);
> > } finally {
> > try {
> > if (reader != null) {
> > reader.close();
> > System.out.println(" reader is closed successfully !");
> > }
> > } catch (IOException e) {
> > System.out.println("reader is not closed properly .." + e);
> > }
> > }
> >
> > }
> >
> > public String getFileNames(String folderpath, String extention) {
> > List<String> fileNames = new ArrayList();
> >
> > File file = new File(folderpath);
> > final String ext = extention;
> > File[] files = file.listFiles(new FilenameFilter() {
> > @Override
> > public boolean accept(File dir, String name) {
> > if (name.toLowerCase().endsWith(ext) &&
> > !name.contains("ReadingComplete")) {
> > return true;
> > } else {
> > return false;
> > }
> > }
> > });
> > Arrays.sort(files,
> > LastModifiedFileComparator.LASTMODIFIED_COMPARATOR);
> > for (File f : files) {
> > return f.getAbsolutePath();
> > }
> > return "";
> > }
> >
> > }
> >
> > TestBolt.java
> > ..........................
> >
> > public class TestBolt implements IRichBolt {
> > OutputCollector collector;
> > BufferedWriter writer;
> > String filePath;
> > @Override
> > public void prepare(Map stormConf, TopologyContext context,
> > OutputCollector collector) {
> > this.collector = collector;
> > try {
> >
> > System.out.println("++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++");
> > System.out.println(stormConf.get("outFile").toString()+"
> > TESTBOLT ********* "+new Date()+"
> > acker"+stormConf.get("topology.environment"));
> >
> > System.out.println("++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++");
> >
> > writer = new BufferedWriter(new
> > FileWriter(stormConf.get("outFile").toString(),true));
> > } catch (Exception e) {
> > e.printStackTrace();
> > }
> > }
> >
> > @Override
> > public void execute(Tuple input) {
> > System.out.println("^^^^^^^^^^^^^^^^^^");
> > System.out.println(" tuple data is :" + input.getString(0));
> > try {
> > collector.ack(input);
> > collector.emit(new Values(input));
> > writer.write(input.getString(0) + " \n");
> > writer.flush();
> >
> > } catch (Exception e) {
> > e.printStackTrace();
> > }
> >
> > }
> > @Override
> > public void cleanup() {
> > System.out.println(" in side clean up");
> > }
> > @Override
> > public void declareOutputFields(OutputFieldsDeclarer declarer) {
> > declarer.declare(new Fields("data"));
> > }
> > @Override
> > public Map<String, Object> getComponentConfiguration() {
> > return null;
> > }
> > }
> >
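A likely contributor to the GC pressure in the spout above is that readFiles() emits every line of a 200 MB file inside a single nextTuple() call, so the usual throttles (e.g. topology.max.spout.pending) never get a chance to apply between emits. A hedged sketch of the common alternative, one line per nextTuple() call (class and method names here are illustrative, not from the original code):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class LineSource {
    private final BufferedReader reader;

    LineSource(String fileName) throws IOException {
        this.reader = new BufferedReader(new FileReader(fileName));
    }

    // Called once per nextTuple(): returns a single line, or null at end of
    // file, so Storm can apply back-pressure between individual emits instead
    // of buffering an entire file's worth of pending tuples.
    String nextLine() throws IOException {
        String line = reader.readLine();
        if (line == null) {
            reader.close();
        }
        return line;
    }

    public static void main(String[] args) throws IOException {
        java.io.File f = java.io.File.createTempFile("lines", ".txt");
        f.deleteOnExit();
        java.nio.file.Files.write(f.toPath(), java.util.Arrays.asList("line-1", "line-2"));
        LineSource src = new LineSource(f.getAbsolutePath());
        for (String l = src.nextLine(); l != null; l = src.nextLine()) {
            System.out.println(l);
        }
    }
}
```

In the spout, nextTuple() would hold one LineSource and emit at most one tuple per invocation.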


Thanks !



RE: java.lang.OutOfMemoryError

Posted by prasad ch <ch...@outlook.com>.

Re: java.lang.OutOfMemoryError

Posted by "Grant Overby (groverby)" <gr...@cisco.com>.
GC Overhead limit exceeded usually indicates you are rapidly instantiating and releasing objects to the extent that the JVM is spending too much time in GC.

You could try increasing the worker heap size, but I’d take a look at your topology first to see if you can calm down the instantiate / release thrashing.
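The "instantiate / release thrashing" above can often be tamed by reusing buffers instead of building fresh objects for every tuple. A small illustration (not Storm-specific; names are made up for this sketch):

```java
public class ChurnDemo {
    // Creating a new StringBuilder for every record produces short-lived
    // garbage; reusing one builder avoids allocating and re-growing a fresh
    // buffer per record (toString() still allocates the result string).
    static String formatReusing(StringBuilder sb, String record) {
        sb.setLength(0);              // reset the builder instead of reallocating it
        sb.append(record).append(" \n");
        return sb.toString();
    }

    public static void main(String[] args) {
        StringBuilder sb = new StringBuilder(64);
        for (int i = 0; i < 3; i++) {
            System.out.print(formatReusing(sb, "record-" + i));
        }
    }
}
```

The same idea applies to per-tuple Values, date formatters, and file readers held in a bolt's fields rather than recreated in execute().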