Posted to users@camel.apache.org by ebinsingh <eb...@VerizonWireless.com> on 2012/04/27 22:07:04 UTC

Aggregating huge files

Hi,

I have a scenario where I need to aggregate small files containing XML
records into large ones containing 50,000 records each.

But I ended up with a Java heap memory issue.


Is there a way to stream the records into a file until I reach 50,000
records, then move on to create another file and start writing to it?

Appreciate your help.

Ebe


--
View this message in context: http://camel.465427.n5.nabble.com/Aggregating-huge-files-tp5671368p5671368.html
Sent from the Camel - Users mailing list archive at Nabble.com.

Re: Aggregating huge files

Posted by Christian Müller <ch...@gmail.com>.
You can use the file producer to write the content into a file with the
"fileExist=Append" option.
Use the Aggregator "only" to determine when you have aggregated all exchanges.
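
For example, here is a rough sketch of that idea. The endpoint URIs, the
file naming scheme, and the assumption of one record per incoming file are
illustrative guesses on my part; only the 50,000-record batch size and the
fileExist=Append option come from this thread:

import java.util.concurrent.atomic.AtomicLong;

import org.apache.camel.Exchange;
import org.apache.camel.builder.RouteBuilder;

public class AppendingBatchRoute extends RouteBuilder {

    // counts incoming records; every 50000 the batch index advances,
    // which rolls the output over to a new file
    private final AtomicLong counter = new AtomicLong();

    @Override
    public void configure() {
        from("file:inbox?delete=true")
            .process(exchange -> {
                long batch = counter.getAndIncrement() / 50000;
                // CamelFileName tells the file producer which file to target
                exchange.getIn().setHeader(Exchange.FILE_NAME,
                        "records-" + batch + ".xml");
            })
            // fileExist=Append writes each record onto the end of the
            // target file, so a full batch is never held in memory
            .to("file:outbox?fileExist=Append");
    }
}

Note this sketch rolls files with a plain counter instead of an Aggregator;
if you do keep the Aggregator, use it only to signal completion as described
above and let the file producer do the appending.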

Best,
Christian

On Fri, Apr 27, 2012 at 10:07 PM, ebinsingh
<ebenezer.singh@verizonwireless.com> wrote:

> Hi,
>
> I have a scenario where I need to aggregate small files containing XML
> records into large ones containing 50,000 records each.
>
> But I ended up with a Java heap memory issue.
>
>
> Is there a way to stream the records into a file until I reach 50,000
> records, then move on to create another file and start writing to it?
>
> Appreciate your help.
>
> Ebe