Posted to derby-user@db.apache.org by vadali <sh...@gmail.com> on 2010/08/02 13:00:41 UTC

backing up the DB

Hello,

I have a few tables (with foreign keys) that I need to save and restore.
My original plan was to save the entire directory of the database on one
machine, then reload it on another machine.

The problem is that I might want to save only a portion of that DB (say, only
rows that are newer than a certain date - I do keep a date for each row).

What's my best method?

I thought about manually copying all the interesting rows from one DB to
another one (let's call it backupdb), but that could be slow.
Another option I thought of would be to copy the entire DB to backupdb and then
remove all the rows I don't need, but when I tried that, the size of the
backupdb directory was even larger than the original...

Do you have any better options?

best regards,
vadali


Re: backing up the DB

Posted by vadali <sh...@gmail.com>.
Hey Dag,
Thank you for your reply :)

I ended up solving it with the bulk import and export procedures:
http://db.apache.org/derby/docs/10.4/tools/ctoolsimport16245.html
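
For the archives, the calls look roughly like this (the table, column, file
and date below are just placeholders, not my real schema):

  -- against the source DB: export only the rows newer than a given date
  CALL SYSCS_UTIL.SYSCS_EXPORT_QUERY(
      'SELECT * FROM APP.MEASUREMENTS WHERE CREATED >= DATE(''2010-08-01'')',
      '/tmp/measurements.del', null, null, null);

  -- against the backup DB (last argument: 0 = append, non-zero = replace)
  CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE(
      'APP', 'MEASUREMENTS', '/tmp/measurements.del', null, null, null, 0);

With foreign keys, the parent tables need to be imported before the child
tables so the constraints are satisfied.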

VTIs seem really interesting too - thank you for pointing them out to me.

Vadali



Dag H. Wanvik wrote:
> 
> vadali <sh...@gmail.com> writes:
> 
>> Hi Kristian, thank you for your reply.
>>
>> The DB can be off when I back it up. I am unfamiliar with any compress
>> routines; any pointers would be helpful.
> 
> Have a look here:
> 
> http://db.apache.org/derby/docs/10.4/adminguide/cadminspace21579.html
> http://db.apache.org/derby/docs/10.6/ref/rrefaltertablecompress.html
> 
>>
>> I don't really mind how the filtering is done (up front or on the backup
>> copy), I just don't want it to take too long.
>>
>> My main goal is to burn the directory onto a CD or DVD, but since it can
>> easily exceed the size of a single disc, I tar it, split it into multiple
>> files of fixed size, and burn each onto a different disc (quite a lengthy
>> process), so that later I can reassemble the files, untar them, load a DB
>> from that folder, and query all I want :)
>>
>> I am not so sure what a VTI is, or how to write a script that will do a
>> sort of database copy.
> 
> VTIs are described here:
> http://db.apache.org/derby/docs/10.6/devguide/cdevspecialtabfuncs.html
> 
> Dag
> 
> 



Re: backing up the DB

Posted by "Dag H. Wanvik" <da...@oracle.com>.
vadali <sh...@gmail.com> writes:

> Hi Kristian, thank you for your reply.
>
> The DB can be off when I back it up. I am unfamiliar with any compress
> routines; any pointers would be helpful.

Have a look here:

http://db.apache.org/derby/docs/10.4/adminguide/cadminspace21579.html
http://db.apache.org/derby/docs/10.6/ref/rrefaltertablecompress.html
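
The calls themselves are one-liners; a quick sketch, with example
schema/table names:

  -- rebuilds the table (and its indexes) to reclaim unused space
  CALL SYSCS_UTIL.SYSCS_COMPRESS_TABLE('APP', 'MEASUREMENTS', 1);

  -- or the in-place variant: purge, defragment, truncate at the end
  CALL SYSCS_UTIL.SYSCS_INPLACE_COMPRESS_TABLE('APP', 'MEASUREMENTS', 1, 1, 1);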

>
> I don't really mind how the filtering is done (up front or on the backup
> copy), I just don't want it to take too long.
>
> My main goal is to burn the directory onto a CD or DVD, but since it can
> easily exceed the size of a single disc, I tar it, split it into multiple
> files of fixed size, and burn each onto a different disc (quite a lengthy
> process), so that later I can reassemble the files, untar them, load a DB
> from that folder, and query all I want :)
>
> I am not so sure what a VTI is, or how to write a script that will do a
> sort of database copy.

VTIs are described here:
http://db.apache.org/derby/docs/10.6/devguide/cdevspecialtabfuncs.html
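
In a nutshell: you write a Java class whose static method returns a
java.sql.ResultSet with the rows you want, and register it as a table
function. A rough sketch (the class, method and column names below are
made up for illustration):

  CREATE FUNCTION recentRows(sinceDate DATE)
  RETURNS TABLE (id INT, created DATE, payload VARCHAR(100))
  LANGUAGE JAVA
  PARAMETER STYLE DERBY_JDBC_RESULT_SET
  READS SQL DATA
  EXTERNAL NAME 'com.example.backup.RecentRowsVTI.read';

  -- then you can query it like a table, e.g.
  SELECT * FROM TABLE (recentRows(DATE('2010-08-01'))) AS t;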

Dag

Re: backing up the DB

Posted by vadali <sh...@gmail.com>.
Hi Kristian, thank you for your reply.

The DB can be off when I back it up. I am unfamiliar with any compress
routines; any pointers would be helpful.

I don't really mind how the filtering is done (up front or on the backup
copy), I just don't want it to take too long.

My main goal is to burn the directory onto a CD or DVD, but since it can
easily exceed the size of a single disc, I tar it, split it into multiple
files of fixed size, and burn each onto a different disc (quite a lengthy
process), so that later I can reassemble the files, untar them, load a DB
from that folder, and query all I want :)

I am not so sure what a VTI is, or how to write a script that will do a
sort of database copy.

Sorry for my inexperience - I am not primarily a DB programmer.

The DB can reach a few GB over time, but usually the user will want to
burn much less (say, only the data from the previous day).

I hope this gives you all the information you asked for,
thank you for your help,
Vadali




Re: backing up the DB

Posted by Kristian Waagan <kr...@oracle.com>.
On Mon, Aug 02, 2010 at 04:00:41AM -0700, vadali wrote:
> 
> Hello,
> 
> I have a few tables (with foreign keys) that I need to save and restore.
> My original plan was to save the entire directory of the database on one
> machine, then reload it on another machine.
> 
> The problem is that I might want to save only a portion of that DB (say, only
> rows that are newer than a certain date - I do keep a date for each row).
> 

Hi Vadali,

What are your requirements for uptime on the primary db?
Do you have to use online backup, or can you block update transactions
for a while?
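
If online backup is acceptable, a live copy of the whole database can be
taken with the backup procedure (the target directory here is just an
example):

  CALL SYSCS_UTIL.SYSCS_BACKUP_DATABASE('/var/backups/derby');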

> What's my best method?
> 
> I thought about manually copying all the interesting rows from one DB to
> another one (let's call it backupdb), but that could be slow.
> Another option I thought of would be to copy the entire DB to backupdb and then
> remove all the rows I don't need, but when I tried that, the size of the
> backupdb directory was even larger than the original...

This is probably because Derby doesn't release unused space to the OS
until it is told to do so. You should run the compress routines on the
tables from which you deleted a large number of rows.
The extra space may also be log records, which are deleted after a
checkpoint or a clean shutdown.
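
If it is the log that is taking the space, forcing a checkpoint (or simply
shutting the database down cleanly) should let Derby remove the old log
files:

  CALL SYSCS_UTIL.SYSCS_CHECKPOINT_DATABASE();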

> 
> Do you have any better options?

It seems to me you have to decide whether you want to filter the data up front
or after you have created the backup.
In the former case you could write your own "database copy" using SQL to
filter the data (or maybe write a VTI); in the latter case you would
have to account for the time spent deleting unwanted data and then
compressing the tables to save space.
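
For the latter case, the backup copy could be trimmed roughly like this,
per table (the table, column and date are only examples; with foreign keys,
delete from the child tables before their parents):

  DELETE FROM APP.MEASUREMENTS WHERE CREATED < DATE('2010-08-01');
  CALL SYSCS_UTIL.SYSCS_COMPRESS_TABLE('APP', 'MEASUREMENTS', 1);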

It might also help people come up with better answers if you tell us
roughly how large you expect your database to be.


Regards,
-- 
Kristian

> 
> best regards,
> vadali