Posted to dev@subversion.apache.org by Philip Martin <pm...@uklinux.net> on 2001/11/06 14:20:39 UTC

Concurrent access and "Delta source ended unexpectedly"

Hi

Looking through the code, subversion appears to lock the working copy
and the database, so working on the assumption that concurrent access
to a repository is supported, I tried testing using multiple clients
each in a separate working copy.

I have a couple of problems, one is that I can provoke deadlock
between the clients, the other is that I can provoke the following
error when trying to commit some changes:

  Sending         wcstress.1617/bar2/foo1

  svn_error: #21006 : <Incomplete data>
    commit failed: wc locks have been removed.

  svn_error: #21006 : <Incomplete data>
    commit failed:  while sending postfix text-deltas.

  svn_error: #21006 : <Incomplete data>
    Delta source ended unexpectedly

at which point the working copy seems to be broken.

My test is a script that invokes the Unix command-line client, and it
works fine when I serialise the repository access. I am using
subversion revision 401, with apr, db and neon in the source tree, a
shared build and ra_local access.

Is it too early in development to expect concurrent access to work?

Is there anything special I need to do to enable concurrent access?

I will of course provide the script and instructions if this turns out
to be more than simply a configuration error on my part.

Philip


---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org

Re: Concurrent access and "Delta source ended unexpectedly"

Posted by Karl Fogel <kf...@newton.ch.collab.net>.
Philip, this is great work, thanks!

Yes, concurrent access should work.  We tested it some before
Subversion went self-hosting, but it's never received the kind of
stress-testing it needs.  Until now. :-)

Can you dig in and find the causes of the two problems you
encountered?  (I just mean at least the proximate causes, although go
as deep as you care to, of course).  The deadlock issue...

> I have a couple of problems, one is that I can provoke deadlock
> between the clients, [...]

... sounds like something goes wrong on the server side (does
/blah/blah/BerkeleyDB.3.3/bin/db_recover help?  Can `svnadmin' and
`svnlook' still use the repository?), causing the clients to hang.
But can't be sure, needs investigation.
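
[For anyone following along, the checks suggested above might look
something like this. A sketch only: "repostress" is the repository
name from Philip's script, the path to db_recover depends on where
Berkeley DB 3.3 is installed, and the database environment is
commonly the db/ subdirectory of the repository (adjust if your
layout differs):]

```shell
# run Berkeley DB recovery against the repository's database
# environment (assumed here to be the db/ subdirectory)
db_recover -v -h repostress/db

# see whether the repository still answers a basic query
svnadmin youngest repostress
```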

> the other is that I can provoke the following
> error when trying to commit some changes:
> 
>   Sending         wcstress.1617/bar2/foo1
> 
>   svn_error: #21006 : <Incomplete data>
>     commit failed: wc locks have been removed.
> 
>   svn_error: #21006 : <Incomplete data>
>     commit failed:  while sending postfix text-deltas.
> 
>   svn_error: #21006 : <Incomplete data>
>     Delta source ended unexpectedly
> 
> at which point the working copy seems to be broken.

Very interesting, never seen this before... How exactly is the wc
broken, can you determine?  I assume "broken" means when you try to do
further commits (or updates?), some error occurs, but what error, and
what's causing it?

-K


Re: Concurrent access and "Delta source ended unexpectedly"

Posted by Philip Martin <pm...@uklinux.net>.
Philip Martin <pm...@uklinux.net> writes:

> I have a couple of problems, one is that I can provoke deadlock
> between the clients, the other is that I can provoke the following
> error when trying to commit some changes:
> 
>   Sending         wcstress.1617/bar2/foo1
> 
>   svn_error: #21006 : <Incomplete data>
>     commit failed: wc locks have been removed.
> 
>   svn_error: #21006 : <Incomplete data>
>     commit failed:  while sending postfix text-deltas.
> 
>   svn_error: #21006 : <Incomplete data>
>     Delta source ended unexpectedly
> 
> at which point the working copy seems to be broken.
> 

This is still very easy to provoke on my system. The script I use to
do it is appended to the end of this message. I have had a look in the
debugger and text_delta.c:apply_delta() is generating this error. The
amount of incomplete data always matches the number of characters that
comprise the modification that is failing to commit, but there is just
so much code behind svn_stream_read() that I haven't managed to track
down the problem.

Here's what I do:

[NOTE: the script will destroy a directory/file called repostress in
the current directory.]

I use three X-terms open on the same directory containing the script,
which is called stress.pl (yes it's Perl, I don't do Python, and it
was originally simply to automate some things I was doing).  In the
first X-term I start the script using:

   $ ./stress.pl -c -s1

When the first commit messages scroll through I start the script in
the second X-term using:

   $ ./stress.pl -s1

Then watch as they both update and commit. Occasionally a merge
conflict occurs:

   svn_error: #21061 : <Merge conflict during commit>
     commit failed: wc locks have been removed.

   svn_error: #21061 : <Merge conflict during commit>
     commit failed: while calling close_edit()

   svn_error: #21061 : <Merge conflict during commit>
     conflict at "/bar1/foo1"

this is expected and gets automatically resolved (it's the sort of
race I was looking to test). After typically 10-20 commits the error
in my previous message occurs. If this affects only one instance of
the script then the next commit by the other instance clears the
problem! If it affects both instances then neither can commit and the
scripts loop forever.

Once the scripts are looping I use the third X-term to touch a file
called "stop" which causes the scripts to exit (this mechanism avoids
any potential interrupt of an svn subprocess). At this stage I can
attempt the commit by hand and it still fails.

About one run in twenty the scripts will deadlock, with each running
an svn commit process.

If the script is executed without the -s option it waits for return to
be pressed before each commit. In this mode I have committed over a
hundred revisions, by ensuring that the instances don't race too
much. One can provoke the race by pressing return quickly a few times
in each X-term.

My machine is a dual processor x86 running linux kernel-2.4.14-pre6
and libc-2.1.3. Subversion is built with apr, db, neon in the source
tree, and it's a shared library build.

I would be interested to know if this problem is reproducible elsewhere.
After all, even if there is a bug in my script, since it is merely
invoking svn it shouldn't be able to cause these problems.

Philip






#!/usr/bin/perl -w

# This script constructs a repository, and populates it with
# files. Then it loops making changes to a subset of the files and
# committing the tree. When two instances are run in parallel
# sometimes the commit will fail with a merge conflict. This is
# expected, and is automatically resolved by updating.

# The files start off containing:
#    A0
#    0
#    A1
#    1
#    A2
#    .
#    .
#    A9
#    9

# Each script has an ID in the range 0-9, and when it modifies a file
# it modifies the line that starts with its ID. Thus scripts with
# different IDs will make changes that can be merged automatically.

# The main loop is then:
#
#   step 1: modify a random selection of files
#
#   step 2: optional sleep or wait for RETURN keypress
#
#   step 3: update the working copy automatically merging out-of-date files
#
#   step 4: try to commit, if not successful go to step 3 otherwise go to step 1

# To allow break-out of potentially infinite loops, the script will
# terminate if it detects the presence of a "stop file", the path to
# which is specified with the -S option (default ./stop). This allows
# the script to be stopped without any danger of interrupting an svn
# sub-process, which experiment shows can cause problems with the
# database locking.

use Getopt::Std;
use File::Find;
use File::Path;
use Cwd;

# Repository check/create
sub init_repo
  {
    my ( $repo, $create ) = @_;
    if ( $create )
      {
	rmtree([$repo]) if -e $repo;
	my $svnadmin_cmd = "svnadmin create $repo";
	system( $svnadmin_cmd) and die "$svnadmin_cmd: failed\n";
      }
    else
      {
	my $svnadmin_cmd = "svnadmin youngest $repo";
	my $revision = readpipe $svnadmin_cmd;
	die "$svnadmin_cmd: failed\n" if not $revision =~ m{^[0-9]};
      }
    $repo = getcwd . "/$repo" if not $repo =~ m[^/];
    return $repo;
  }

# Check-out working copy
sub check_out
  {
    my ( $url ) = @_;
    my $wc_dir = "wcstress.$$";
    mkdir "$wc_dir", 0755 or die "mkdir $wc_dir: $!\n";
    my $svn_cmd = "svn co $url -d $wc_dir";
    system( $svn_cmd ) and die "$svn_cmd: failed\n";
    return $wc_dir;
  }

# Print status, update and commit. The update is to do any required merges.
sub status_update_commit
  {
    my ( $wc_dir, $wait_for_key ) = @_;
    my $svn_cmd = "svn st $wc_dir";
    print "Status:\n";
    system( $svn_cmd ) and die "$svn_cmd: failed\n";
    print "Press return to update/commit\n" if $wait_for_key;
    read STDIN, $wait_for_key, 1 if $wait_for_key;
    print "Updating:\n";
    $svn_cmd = "svn up $wc_dir";
    system( $svn_cmd ) and die "$svn_cmd: failed\n";
    print "Committing:\n";
    my $now_time = localtime;
    $svn_cmd = "svn ci $wc_dir -m '$now_time'";
    return system( $svn_cmd );
  }

# Get a list of all versioned files in the working copy
{
  my @get_list_of_files_helper_array;
  sub GetListOfFilesHelper
    {
      $File::Find::prune = 1 if $File::Find::name =~ m[/\.svn];
      return if $File::Find::prune or -d;
      push @get_list_of_files_helper_array, $File::Find::name;
    }
  sub GetListOfFiles
    {
      my ( $wc_dir ) = @_;
      @get_list_of_files_helper_array = ();
      find( \&GetListOfFilesHelper, $wc_dir);
      return @get_list_of_files_helper_array;
    }
}

# Populate a working copy
sub populate
  {
    my ( $dir, $dir_width, $file_width, $depth ) = @_;
    return if not $depth--;

    for $nfile ( 1..$file_width )
      {
	my $filename = "$dir/foo$nfile";
	open( FOO, ">$filename" ) or die "open $filename: $!\n";

	for $line ( 0..9 )
	  {
	    print FOO "A$line\n$line\n" or die "write to $filename: $!\n";
	  }
	close FOO or die "close $filename: $!\n";

	my $svn_cmd = "svn add $filename";
	system( $svn_cmd ) and die "$svn_cmd: failed\n";
      }

    if ( $depth )
      {
	for $ndir ( 1..$dir_width )
	  {
	    my $dirname = "$dir/bar$ndir";
	    mkdir "$dirname", 0755 or die "mkdir $dirname: $!\n";

	    my $svn_cmd = "svn add $dirname";
	    system( $svn_cmd ) and die "$svn_cmd: failed\n";

	    populate( "$dirname", $dir_width, $file_width, $depth );
	  }
      }
  }

# Modify a versioned file in the working copy
sub ModFile
  {
    my ( $filename, $mod_number, $id ) = @_;

    # Read file into memory replacing the line that starts with our ID
    open( FOO, "<$filename" ) or die "open $filename: $!\n";
    @lines = map { s[(^$id.*)][$1,$mod_number]; $_ } <FOO>;
    close FOO or die "close $filename: $!\n";

    # Write the memory back to the file
    open( FOO, ">$filename" ) or die "open $filename: $!\n";
    print FOO or die "print $filename: $!\n" foreach @lines;
    close FOO or die "close $filename: $!\n";
  }

sub ParseCommandLine
  {
    my %cmd_opts;

    # defaults
    $cmd_opts{'D'} = 2;            # number of subdirs per dir
    $cmd_opts{'F'} = 2;            # number of files per dir
    $cmd_opts{'N'} = 2;            # depth
    $cmd_opts{'R'} = "repostress"; # repository name
    $cmd_opts{'S'} = "stop";       # path of file to stop the script
    $cmd_opts{'U'} = "none";       # URL
    $cmd_opts{'c'} = 0;            # create repository
    $cmd_opts{'i'} = 0;            # ID
    $cmd_opts{'s'} = -1;           # sleep interval
    $cmd_opts{'n'} = 200;          # sets of changes
    $cmd_opts{'x'} = 4;            # files to modify

    getopts( 'ci:n:s:x:D:F:N:R:U:', \%cmd_opts )
      or die "
usage: stress [-c] [-i num] [-n num] [-s secs] [-x num]
              [-D num] [-F num] [-N num] [-R path] [-S path] [-U url]
where
  -c cause repository creation
  -i the ID
  -n the number of sets of changes to commit
  -s the sleep delay (-1 wait for key, 0 none)
  -x the number of files to modify
  -D the number of sub-directories per directory in the tree
  -F the number of files per directory in the tree
  -N the depth of the tree
  -R the path to the repository
  -S the path to the file whose presence stops this script
  -U the URL to the repository (file:///<-R path> by default)
";

    # default ID if not set
    $cmd_opts{'i'} = 1 + 5 * $cmd_opts{'c'} if not $cmd_opts{'i'};

    return %cmd_opts;
  }

############################################################################
# Main

srand 123456789;

my %cmd_opts = ParseCommandLine();

my $repo = init_repo $cmd_opts{'R'}, $cmd_opts{'c'};

# Make URL from path if URL not explicitly specified
$cmd_opts{'U'} = "file://$repo" if $cmd_opts{'U'} eq "none";

my $wc_dir = check_out $cmd_opts{'U'};

if ( $cmd_opts{'c'} )
  {
    populate $wc_dir, $cmd_opts{'D'}, $cmd_opts{'F'}, $cmd_opts{'N'};
    status_update_commit $wc_dir, 0 and die "populate checkin failed\n";
  }

my @wc_files = GetListOfFiles $wc_dir;
die "not enough files in repository\n" if $#wc_files < $cmd_opts{'x'};

my $wait_for_key = $cmd_opts{'s'} < 0;

my $stop_file = $cmd_opts{'S'};

for $mod_number ( 1..$cmd_opts{'n'} )
  {
    my @chosen;
    for ( 1..$cmd_opts{'x'} )
      {
	# Extract random file from list and modify it
	my $mod_file = splice @wc_files, int rand @wc_files, 1;
	ModFile $mod_file, $mod_number, $cmd_opts{'i'};
	push @chosen, $mod_file;
      }
    # Reinstate list of files, the order doesn't matter
    push @wc_files, @chosen;

    # Loop committing until successful or the stop file is created
    1 while not -e $stop_file and status_update_commit $wc_dir, $wait_for_key;
    sleep $cmd_opts{'s'} if $cmd_opts{'s'} > 0;

    last if -e $stop_file;
  }
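
[Editorial footnote on the merge mechanics: the per-ID edit that
ModFile performs can be seen in isolation. Here sed stands in for the
Perl substitution s[(^$id.*)][$1,$mod_number]; a hypothetical client
with ID 3 recording modification 7 appends ",7" to the line beginning
with "3", leaving the "5" line free for a client with ID 5:]

```shell
# append ",7" to the line that starts with "3", as a client with ID 3
# would on its 7th modification; other IDs' lines are left untouched
printf 'A3\n3\nA5\n5\n' | sed 's/^3.*/&,7/'
```

[Because concurrent clients edit disjoint lines, the update in step 3
of the main loop can normally merge their changes without conflict.]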
