Posted to dev@camel.apache.org by yazan <qe...@gmail.com> on 2013/04/08 16:27:31 UTC

JDBC component handling large results

The JDBC component currently reads all the rows returned by a query in one
shot, which only works for small result sets. I'm trying to modify the
component to support cursor semantics: read only X rows at a time, set
those rows as the body of a new exchange, and forward it, rinse and repeat
until all the rows returned by the query are processed. The JDBC component
only creates a producer that executes a SQL statement it receives in the
body of an exchange. My question is about the best approach to do this
"1 exchange" to "series of exchanges" mapping. My research led me to
MulticastProcessor; it seems that I should make JDBCProducer extend it and
then override createProcessorExchangePairs to create an
Iterable<ProcessorExchangePair> that actually reads rows from the result
set to populate each new exchange. Am I on the right track here?
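[Editor's note: the "read X rows, emit, repeat" loop described above can be
sketched as a lazy batching iterator. This is a hypothetical illustration in
plain Java; it wraps an ordinary Iterator in place of a real JDBC ResultSet,
and the class name is made up for the example.]

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Wraps a row source and yields batches of up to `batchSize` rows,
// pulling rows lazily so the full result never sits in memory at once.
public class BatchingIterator<T> implements Iterator<List<T>> {
    private final Iterator<T> rows;
    private final int batchSize;

    public BatchingIterator(Iterator<T> rows, int batchSize) {
        this.rows = rows;
        this.batchSize = batchSize;
    }

    @Override
    public boolean hasNext() {
        return rows.hasNext();
    }

    @Override
    public List<T> next() {
        // Collect up to batchSize rows; the last batch may be smaller.
        List<T> batch = new ArrayList<>(batchSize);
        while (rows.hasNext() && batch.size() < batchSize) {
            batch.add(rows.next());
        }
        return batch;
    }
}
```

Each call to next() would become the body of one outgoing exchange; with a
real ResultSet the wrapper would advance via resultSet.next() instead and
close the statement once exhausted.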



--
View this message in context: http://camel.465427.n5.nabble.com/JDBC-component-handling-large-results-tp5730509.html
Sent from the Camel Development mailing list archive at Nabble.com.

Re: JDBC component handling large results

Posted by Claus Ibsen <cl...@gmail.com>.
Hi

You should just return something that is iterable; the Camel splitter
EIP can then use that in streaming mode.
Then you can support loading data on the fly, and page in data in
bulk as needed.
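[Editor's note: a minimal sketch of the paging idea behind this advice. The
iterator below pulls one page at a time from a fetch function, which stands
in for re-querying the database; the class and the fetch callback are both
hypothetical names for illustration.]

```java
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.function.IntFunction;

// Iterates single rows while fetching them from the backend one page
// at a time. `fetchPage.apply(offset)` stands in for re-querying the
// database; it returns an empty list when no rows remain.
public class PagingIterator<T> implements Iterator<T> {
    private final IntFunction<List<T>> fetchPage;
    private Iterator<T> currentPage = Collections.emptyIterator();
    private int offset = 0;
    private boolean exhausted = false;

    public PagingIterator(IntFunction<List<T>> fetchPage) {
        this.fetchPage = fetchPage;
    }

    @Override
    public boolean hasNext() {
        // Refill from the next page only when the current one runs dry.
        while (!currentPage.hasNext() && !exhausted) {
            List<T> page = fetchPage.apply(offset);
            offset += page.size();
            exhausted = page.isEmpty();
            currentPage = page.iterator();
        }
        return currentPage.hasNext();
    }

    @Override
    public T next() {
        if (!hasNext()) {
            throw new NoSuchElementException();
        }
        return currentPage.next();
    }
}
```

With such an iterator as the message body, a route using the splitter in
streaming mode, e.g. .split(body()).streaming(), would process one row at a
time without materializing the whole result (endpoint wiring omitted here).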





On Tue, Apr 9, 2013 at 11:26 AM, yazan <qe...@gmail.com> wrote:
> Of course, but I was hoping I could add support for something similar to
> Spring Batch's JdbcCursorItemReader or JdbcPagingItemReader, where the
> framework handles most of the details of reading a large result.
>
>
>
> --
> View this message in context: http://camel.465427.n5.nabble.com/JDBC-component-handling-large-results-tp5730509p5730582.html
> Sent from the Camel Development mailing list archive at Nabble.com.



-- 
Claus Ibsen
-----------------
Red Hat, Inc.
FuseSource is now part of Red Hat
Email: cibsen@redhat.com
Web: http://fusesource.com
Twitter: davsclaus
Blog: http://davsclaus.com
Author of Camel in Action: http://www.manning.com/ibsen

Re: JDBC component handling large results

Posted by yazan <qe...@gmail.com>.
Of course, but I was hoping I could add support for something similar to
Spring Batch's JdbcCursorItemReader or JdbcPagingItemReader, where the
framework handles most of the details of reading a large result.



--
View this message in context: http://camel.465427.n5.nabble.com/JDBC-component-handling-large-results-tp5730509p5730582.html
Sent from the Camel Development mailing list archive at Nabble.com.

Re: JDBC component handling large results

Posted by Christian Müller <ch...@gmail.com>.
You can limit the returned row count in your query. Does this work for you?
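[Editor's note: caller-side limiting can be as simple as paging the query
itself, e.g. with LIMIT/OFFSET. The helper below is purely illustrative —
LIMIT/OFFSET syntax is not portable across all databases, and the class name
is invented for the example.]

```java
// Builds one page of a query using LIMIT/OFFSET paging. Note that
// OFFSET-based paging rescans the skipped rows on each query, so
// keyset (seek) paging is often preferred for very large tables.
public final class PagedQuery {
    private PagedQuery() {
    }

    public static String page(String baseSql, int pageSize, int pageNumber) {
        return baseSql + " LIMIT " + pageSize + " OFFSET " + (pageSize * pageNumber);
    }
}
```

Alternatively, plain JDBC already offers java.sql.Statement#setMaxRows(int)
to cap the rows a single query returns, and setFetchSize(int) to hint how
many rows the driver should fetch per round trip.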

Sent from a mobile device
Am 08.04.2013 16:35 schrieb "yazan" <qe...@gmail.com>:

> The JDBC component currently reads all the rows returned by a query in one
> shot, which only works for small result sets. I'm trying to modify the
> component to support cursor semantics: read only X rows at a time, set
> those rows as the body of a new exchange, and forward it, rinse and repeat
> until all the rows returned by the query are processed. The JDBC component
> only creates a producer that executes a SQL statement it receives in the
> body of an exchange. My question is about the best approach to do this
> "1 exchange" to "series of exchanges" mapping. My research led me to
> MulticastProcessor; it seems that I should make JDBCProducer extend it and
> then override createProcessorExchangePairs to create an
> Iterable<ProcessorExchangePair> that actually reads rows from the result
> set to populate each new exchange. Am I on the right track here?
>
>
>
> --
> View this message in context:
> http://camel.465427.n5.nabble.com/JDBC-component-handling-large-results-tp5730509.html
> Sent from the Camel Development mailing list archive at Nabble.com.
>