Posted to user@flink.apache.org by sohi mankotia <so...@gmail.com> on 2017/09/26 10:27:43 UTC
Re: java.lang.OutOfMemoryError: Java heap space at com.google.protobuf.AbstractMessageLite.toByteArray(AbstractMessageLite.java:62)
Hi Stefan ,
Here is main class code :
final String outFile = getOutFileName(backupDir);
final Set<String> keys = getAllRedisKeys(parameters);
final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
env.setParallelism(parallelism);
env.fromCollection(keys)
   .rebalance()
   .flatMap(new RedisRead()).withParameters(parameters.getConfiguration())
   .writeAsText(outFile);
env.execute();
RedisRead -> simply reads the value for each key, one key at a time.
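For context, the one-key-at-a-time pattern in RedisRead can be sketched in plain Java. This is only an illustration, not the actual job code: the Redis client is stubbed out with a Map, and the class and method names here are assumptions.

```java
import java.util.*;
import java.util.function.Consumer;

// Illustrative stand-in for RedisRead: one value is fetched per input key
// and emitted immediately, so nothing is buffered across keys.
public class RedisReadSketch {
    private final Map<String, String> redis; // stub for the Redis client

    public RedisReadSketch(Map<String, String> redis) {
        this.redis = redis;
    }

    // flatMap-style contract: one key in, zero or one record out.
    public void flatMap(String key, Consumer<String> out) {
        String value = redis.get(key); // single GET per key
        if (value != null) {
            out.accept(key + "\t" + value);
        }
    }

    public static void main(String[] args) {
        Map<String, String> stub = new HashMap<>();
        stub.put("user:1", "alice");
        stub.put("user:2", "bob");

        RedisReadSketch reader = new RedisReadSketch(stub);
        List<String> lines = new ArrayList<>();
        for (String key : Arrays.asList("user:1", "user:2", "user:3")) {
            reader.flatMap(key, lines::add);
        }
        System.out.println(lines);
    }
}
```

In the real job the Map would be a Redis client opened in the operator's lifecycle hooks, but the shape of the per-key read is the same.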
Memory per task: 2048 MB
Attaching Flink logs.
Let me know if I can provide anything more.
Thanks and Regards
Sohi
On Tue, Sep 26, 2017 at 2:50 PM, Stefan Richter <s.richter@data-artisans.com> wrote:
> Hi,
>
> could you give us some more information, like the size of your heap space,
> where and how you implemented the Redis access, how you keep the retrieved
> data, and, most importantly, a stack trace or (much better) a log?
>
> Best,
> Stefan
>
> > Am 26.09.2017 um 06:52 schrieb sohimankotia <so...@gmail.com>:
> >
> > Hi,
> >
> > I am getting a Java Heap Space error while running a Flink job (Flink 1.2).
> >
> > Use case: I am fetching all keys from Redis that match a specific pattern,
> > then streaming over those keys, reading the data from Redis for each key,
> > and writing it to a file in HDFS.
> >
> > The job was running fine for a few days but suddenly started giving a Heap
> > Space error.
> >
> > Job parallelism: 9 (3 nodes and 3 slots)
> >
> > Total Redis keys: 1031966 (approx. 33 MB in size, keys only)
> >
> >
> >
> > Logs:
> >
> > --
> > Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
>
>