Posted to users@subversion.apache.org by Neil Gower <ne...@digitalextremes.com> on 2004/07/13 23:02:12 UTC

possible large repository problems...

I've been running svn on a Debian (woody) system for a few months now. 
I'm still using the 1.0.0 packages that Colin Watson released, since I 
didn't want to risk destabilizing our production environment with any 
non-essential upgrades.  Which brings me to my first question:

1. Are there any critical fixes in 1.0.5?  (Security is not a big 
concern, this is for intranet use).


Also, we recently hit the 2GB strings-file limit, which breaks the 
hotcopy functionality (as per #1819).  From what I can see in the list 
archives, the workarounds are (both sketched below):

a) svnadmin dump (really slow)
b) lock out all of your users, cross your fingers, and copy the 
repository manually.
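
For reference, here's roughly what I have in mind for each (the paths 
are placeholders, and both assume nothing is committing to the 
repository while they run):

    # (a) full dump, piped through gzip -- slow on a repository this
    #     size, but independent of the BerkeleyDB backend
    svnadmin dump /var/svn/repos | gzip -c > /backups/repos.dump.gz

    # (b) offline copy -- only safe once all access is locked out
    cp -a /var/svn/repos /backups/repos-copy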

Which raises my second question:

2. What is the best (current) practice for backing up large repositories?


And finally, a related question to the above:

3. Does the 2GB file limit affect any other areas of svn?

I ask because clients have suddenly started timing out on one of our 
(large) repositories, using http:// or file://.  Maybe it's something 
else, but I thought I'd check before I start worrying about drive 
failures and such.


Wisdom of the list will be much appreciated.  :-)

Cheers,

-- 
Neil Gower
Brainbox Games





Re: possible large repository problems...

Posted by Ben Collins-Sussman <su...@collab.net>.
On Wed, 2004-07-14 at 17:05, Neil Gower wrote:

> So just to double check, there are no (known) limits on repository size, 
>   other than the hot-backup issue?

There are no known limits on the size of a BDB database.  'svnadmin
hotcopy' only gives you an error because it calls apr_file_copy(), which
will choke on >2GB files unless APR was compiled with largefile support.
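
If you do rebuild APR, something along these lines usually does it on a 
Linux/glibc system (the flags below are the standard glibc largefile 
macros, not APR-specific options, so the exact incantation may differ 
on your platform):

    # rebuild APR with 64-bit file offsets, then rebuild Subversion
    # against the new APR
    cd apr-0.9.x
    CPPFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64" ./configure
    make && make install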





Re: possible large repository problems...

Posted by Neil Gower <ne...@digitalextremes.com>.
kfogel@collab.net wrote:

> Neil Gower <ne...@digitalextremes.com> writes:
> 
>>3. Does the 2GB file limit affect any other areas of svn?
>>
>>I ask because clients have suddenly started timing out on one of our
>>(large) repositories, using http:// or file://.  Maybe it's something
>>else, but I thought I'd check before I start worrying about drive
>>failures and such.
> 
> I don't know.  If they up their 'http-timeout' parameters in their
> ~/.subversion/servers files, does the problem go away?
> 
> Oh wait, you said this is happening even with file:// ???  How can
> there even be a "timeout" error with file:// access, hmmm...

Looks like it was a wedged repository (phew!).  Running "svnadmin 
recover" took a really long time (I left it overnight), which made 
things a bit tense.  :-o  That's with a 6GB+ repository.  Do you expect 
that the changes in 1.1 will improve performance on this sort of thing?  
Maybe switching to the new FSFS back end (at least we won't have to run 
recover any more)?
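
In case it helps anyone else, this is roughly the procedure I followed 
(the service name, paths, and repository owner are just examples from 
our setup):

    # stop everything that can touch the repository (Apache, svnserve,
    # cron jobs)
    /etc/init.d/apache stop

    # run recovery as the user that owns the repository's db/ directory
    svnadmin recover /var/svn/repos

    # double-check ownership before letting clients back in
    chown -R www-data:www-data /var/svn/repos/db
    /etc/init.d/apache restart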

So just to double check, there are no (known) limits on repository 
size, other than the hot-backup issue?

-- 
Neil Gower
Brainbox Games





Re: possible large repository problems...

Posted by kf...@collab.net.
Neil Gower <ne...@digitalextremes.com> writes:
> 1. Are there any critical fixes in 1.0.5?  (Security is not a big
> concern, this is for intranet use).

Nope, it was only a security fix.

(See the CHANGES file for details.)

> Also, we recently hit the 2GB strings-file limit, which breaks the
> hotcopy functionality (as per #1819).  From what I can see in the list
> archives, the workarounds are:
> 
> a) svnadmin dump (really slow)
> b) lock out all of your users, cross your fingers, and copy the
> repository manually.
> 
> Which raises my second question:
> 
> 2. What is the best (current) practice for backing up large repositories?

I think hotcopy is still the most convenient way, if you recompile APR
as described in issue #1819.
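
With a largefile-enabled APR, the nightly backup can then be as simple 
as something like this (the destination path is just an example; the 
--clean-logs option also removes BerkeleyDB logfiles that are no longer 
needed from the source repository):

    # copy the live repository to a dated backup directory
    svnadmin hotcopy --clean-logs /var/svn/repos /backups/repos-`date +%Y%m%d`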

If you can afford to invest a little bit of time (3-5 hours), you
might want to look at contrib/server-side/backup-recipe.sh in the
Subversion tree.  That's a sample shell script demonstrating a
backup/restore mechanism.  You'd have to write "real" scripts based on
the code in there, but the examples are extensively commented.

(If you do this, and your scripts are general enough, please post them
to the dev@ list, by the way.)

> And finally, a related question to the above:
> 
> 3. Does the 2GB file limit affect any other areas of svn?
> 
> I ask because clients have suddenly started timing out on one of our
> (large) repositories, using http:// or file://.  Maybe it's something
> else, but I thought I'd check before I start worrying about drive
> failures and such.

I don't know.  If they up their 'http-timeout' parameters in their
~/.subversion/servers files, does the problem go away?
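
That is, something along these lines in each user's 
~/.subversion/servers (the value is in seconds and is only an example):

    [global]
    # wait up to five minutes for HTTP responses before giving up
    http-timeout = 300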

Oh wait, you said this is happening even with file:// ???  How can
there even be a "timeout" error with file:// access, hmmm...

What are the exact errors people are seeing?

-Karl
