Posted to users@camel.apache.org by "alex.popescu" <al...@urss.ro> on 2016/10/05 15:16:19 UTC
JDBC for big queries brings everything into memory
Hi,
I am trying to use the JDBC component to run some big queries. I set the StreamList
output type and a readSize block, but that does not seem to do the job:
<to uri="jdbc:dataSource?useHeadersAsParameters=false&amp;outputType=StreamList&amp;readSize=10000" />
<split streaming="true">
...
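For completeness, here is a sketch of what the full route might look like in the Spring XML DSL. The `dataSource` endpoint and the table name are assumptions carried over from the snippet above, and note that `&` must be escaped as `&amp;` inside an XML attribute:

```xml
<route>
  <from uri="timer:runOnce?repeatCount=1"/>
  <!-- the jdbc component takes the SQL query from the message body -->
  <setBody>
    <constant>select * from big_table</constant>
  </setBody>
  <to uri="jdbc:dataSource?useHeadersAsParameters=false&amp;outputType=StreamList&amp;readSize=10000"/>
  <!-- stream the returned list so the splitter iterates instead of copying it -->
  <split streaming="true">
    <simple>${body}</simple>
    <to uri="log:row"/>
  </split>
</route>
```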
The code never actually reaches the streaming part; it fails while the query
itself is running.
In my scenario I can see that about 1,000,000 records are loaded into
memory before I get an OOM error.
I would expect JDBC to hold no more than 10,000 records in memory, as
configured by readSize.
Is there some config I am missing, or does it work as designed?
Please advise; any hints are welcome.
--
View this message in context: http://camel.465427.n5.nabble.com/JDBC-for-Big-queries-brings-evrithing-into-memory-tp5788393.html
Sent from the Camel - Users mailing list archive at Nabble.com.