Posted to ddlutils-user@db.apache.org by KEL_KS <Ke...@DSIonline.com> on 2011/07/05 21:29:37 UTC
Problems working with large files
Hello;
DdlUtils is a great utility, which we use in one of our applications to
import database records from one database into another. It works great
EXCEPT when one of our database files contains a REALLY LARGE amount of
records, which is often the case, and then DdlUtils simply runs out of
memory. Has anybody else encountered this, and/or come up with a
solution? If I must, I will modify the applicable classes to only read X
number of rows from the table (and process them) instead of grabbing ALL of
the rows in the table into a single ResultSet. I just cannot believe that
this has not been an issue for somebody else; maybe there is a way to handle
large files in DdlUtils and I just don't know about it. If anybody knows
how to get DdlUtils to work correctly with very large files, please E-mail
me at Kent.Lichty@DSIonline.com as I don't really want to have to re-invent
the wheel here. Thanks very much for your assistance.
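
[For readers of the archive: the batching approach described above can be sketched as follows. This is a hypothetical helper, not DdlUtils API; the class and method names are invented for illustration. The assumption is that you can iterate over rows one at a time, e.g. by wrapping a streaming JDBC ResultSet, rather than materializing the whole table in memory.]

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch: process rows in fixed-size batches instead of
// holding the entire result set in memory. In a real JDBC setup the
// Iterator would wrap a ResultSet opened with stmt.setFetchSize(n),
// which hints the driver to stream rows (driver-dependent; the MySQL
// Connector/J driver, for example, requires Integer.MIN_VALUE plus a
// forward-only, read-only statement to actually stream).
public class BatchReader {
    public static <T> int processInBatches(Iterator<T> rows, int batchSize,
                                           Consumer<List<T>> handler) {
        int batches = 0;
        List<T> batch = new ArrayList<>(batchSize);
        while (rows.hasNext()) {
            batch.add(rows.next());
            if (batch.size() == batchSize) {
                handler.accept(batch);      // flush a full batch
                batch = new ArrayList<>(batchSize);
                batches++;
            }
        }
        if (!batch.isEmpty()) {             // flush the trailing partial batch
            handler.accept(batch);
            batches++;
        }
        return batches;
    }
}
```

[Memory use is then bounded by one batch rather than by the table size; the trade-off is that each batch must be written out before the next is read.]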