Posted to users@camel.apache.org by fxthomas <fe...@gmail.com> on 2016/03/01 13:09:08 UTC

camel beanIO component, is it meant for parsing large files?

Hello,

   I was planning to use the camel beanIO component, but I saw something in
its implementation that defeats the purpose of processing large files with
beanIO.

In the readModels(Exchange exchange, InputStream stream) method:

List<Object> results = new ArrayList<>();
Object readObject;
while ((readObject = in.read()) != null) {
    if (readObject instanceof BeanIOHeader) {
        exchange.getOut().getHeaders().putAll(((BeanIOHeader) readObject).getHeaders());
    }
    results.add(readObject);
}
return results;
Here the objects are added to a list and only returned after the whole file has
been read, so with a very big file there could be memory issues. I hope my
understanding is right: in.read() loads only one line at a time, but the list
of read objects ends up holding the whole file in memory.
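
For comparison, here is a rough sketch (not the camel-beanio code; just an
illustration using BeanIO's own StreamFactory/BeanReader API, with a made-up
mapping file name and stream name) of reading such a file record by record in
a Camel Processor, so only one record is held in memory at a time:

import java.io.InputStream;
import java.io.InputStreamReader;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.beanio.BeanReader;
import org.beanio.StreamFactory;

public class StreamingBeanIOProcessor implements Processor {

    private final StreamFactory factory;

    public StreamingBeanIOProcessor() {
        factory = StreamFactory.newInstance();
        factory.load("mapping.xml");   // hypothetical BeanIO mapping file
    }

    @Override
    public void process(Exchange exchange) throws Exception {
        InputStream body = exchange.getIn().getBody(InputStream.class);
        BeanReader in = factory.createReader("records", new InputStreamReader(body));
        try {
            Object record;
            while ((record = in.read()) != null) {
                // handle each record as it is read instead of collecting them in a list
                handle(record);
            }
        } finally {
            in.close();
        }
    }

    private void handle(Object record) {
        // application-specific per-record logic (placeholder)
    }
}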




regards,
Felix T



--
View this message in context: http://camel.465427.n5.nabble.com/camel-beanIO-componenet-is-it-meant-for-parsing-large-files-tp5778470.html
Sent from the Camel - Users mailing list archive at Nabble.com.

Re: camel beanIO component, is it meant for parsing large files?

Posted by fxthomas <fe...@gmail.com>.
Sure, I will take a look.



--
View this message in context: http://camel.465427.n5.nabble.com/camel-beanIO-componenet-is-it-meant-for-parsing-large-files-tp5778470p5778489.html
Sent from the Camel - Users mailing list archive at Nabble.com.

Re: camel beanIO component, is it meant for parsing large files?

Posted by Claus Ibsen <cl...@gmail.com>.
Hi

Yeah, streaming is not supported. I think BeanIO has a special
streaming API for that, if I recall correctly.
You are welcome to dive into BeanIO and see what they have, and then
we can take a look at how to add that into camel-beanio.
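
Until something like that exists in camel-beanio, one possible (untested)
sketch is to wrap a BeanIO BeanReader in an Iterator and feed it to the
splitter in streaming mode, so each record becomes its own exchange. The
class, mapping file and stream name below are made up for illustration:

import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.Iterator;

import org.apache.camel.builder.RouteBuilder;
import org.beanio.BeanReader;
import org.beanio.StreamFactory;

public class BeanIOStreamingRoute extends RouteBuilder {

    // Hypothetical helper: exposes BeanIO records as an Iterator so the
    // splitter can pull them lazily instead of building a full List.
    public static class RecordSource {
        public Iterator<Object> records(InputStream body) {
            StreamFactory factory = StreamFactory.newInstance();
            factory.load("mapping.xml");   // hypothetical mapping file
            BeanReader reader = factory.createReader("records", new InputStreamReader(body));
            return new Iterator<Object>() {
                private Object next = reader.read();

                public boolean hasNext() {
                    return next != null;
                }

                public Object next() {
                    Object current = next;
                    next = reader.read();
                    if (next == null) {
                        reader.close();    // end of input reached
                    }
                    return current;
                }
            };
        }
    }

    @Override
    public void configure() {
        from("file:inbox")
            // stream the split so only one record is in flight at a time
            .split().method(RecordSource.class, "records").streaming()
                .to("log:record");
    }
}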



On Tue, Mar 1, 2016 at 1:09 PM, fxthomas <fe...@gmail.com> wrote:
> Hello,
>
>    I was planning to use the camel beanIO component, but I saw something in
> its implementation that defeats the purpose of processing large files with
> beanIO.
>
> In the readModels(Exchange exchange, InputStream stream) method:
>
> List<Object> results = new ArrayList<>();
> Object readObject;
> while ((readObject = in.read()) != null) {
>     if (readObject instanceof BeanIOHeader) {
>         exchange.getOut().getHeaders().putAll(((BeanIOHeader) readObject).getHeaders());
>     }
>     results.add(readObject);
> }
> return results;
>
> Here the objects are added to a list and only returned after the whole file has
> been read, so with a very big file there could be memory issues. I hope my
> understanding is right: in.read() loads only one line at a time, but the list
> of read objects ends up holding the whole file in memory.
>
>
>
>
> regards,
> Felix T
>
>
>
> --
> View this message in context: http://camel.465427.n5.nabble.com/camel-beanIO-componenet-is-it-meant-for-parsing-large-files-tp5778470.html
> Sent from the Camel - Users mailing list archive at Nabble.com.



-- 
Claus Ibsen
-----------------
http://davsclaus.com @davsclaus
Camel in Action 2: https://www.manning.com/ibsen2