Posted to users@subversion.apache.org by Mohsin Abbas <mo...@gmail.com> on 2014/09/26 00:34:54 UTC

SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Good Day Team,

I am using a Subversion 1.8.9 server on Linux and the TortoiseSVN
client on Windows. When I try to commit data larger than 2 GB, the
commit fails. I searched various websites, and they suggest the
following solution:

1: Set LimitRequestBody to 0 in the server-side httpd.conf file.

But the Apache web server (Apache 2.2) allows at most 2 GB of data
in an HTTP request. So how can I commit files larger than 2 GB? How
can I raise Apache's request limit above 2 GB? Or is it that we
cannot commit more than 2 GB at all because svn does not allow it?
Please shed some light on this issue; it is urgent.


Regards
Mohsin Abbas

Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Nico Kadel-Garcia <nk...@gmail.com>.
On Thu, Sep 25, 2014 at 6:34 PM, Mohsin Abbas <mo...@gmail.com> wrote:
> Good Day Team,
>
> I am using a Subversion 1.8.9 server on Linux and the TortoiseSVN
> client on Windows. When I try to commit data larger than 2 GB, the
> commit fails. I searched various websites, and they suggest the
> following solution:
>
> 1: Set LimitRequestBody to 0 in the server-side httpd.conf file.
>
> But the Apache web server (Apache 2.2) allows at most 2 GB of data
> in an HTTP request. So how can I commit files larger than 2 GB? How
> can I raise Apache's request limit above 2 GB? Or is it that we
> cannot commit more than 2 GB at all because svn does not allow it?
> Please shed some light on this issue; it is urgent.

For Pete's sake, it's like asking how to put a pool ball in your
mouth. Just Don't Do That.

If you have to use files that large, switch to svn+ssh:// or svn://.
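
If you already have a working copy, you don't even need a fresh
checkout to move off http; a sketch, assuming the repository lives at
/var/svn/repo on the server (URLs and paths are illustrative):

  svn relocate http://server/svn/repo svn+ssh://user@server/var/svn/repo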

Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin Abbas <mo...@gmail.com>.
Thanks, Philip Martin, for your reply.

I'll share the error.log portion shortly. Meanwhile I have a question:
I am using Apache 2.2.24 (DAV module via http) with svn 1.8.9, and the
Apache web server has a 2 GB request-body limit, so how can you say
that more than 2 GB of data can be committed in svn? This is confusing
to me; please explain.

Regards
Mohsin

On 9/26/14, Philip Martin <ph...@wandisco.com> wrote:
> Mohsin Abbas <mo...@gmail.com> writes:
>
>> I am using a Subversion 1.8.9 server on Linux and the TortoiseSVN
>> client on Windows. When I try to commit data larger than 2 GB, the
>> commit fails. I searched various websites, and they suggest the
>> following solution:
>>
>> 1: Set LimitRequestBody to 0 in the server-side httpd.conf file.
>>
>> But the Apache web server (Apache 2.2) allows at most 2 GB of data
>> in an HTTP request. So how can I commit files larger than 2 GB? How
>> can I raise Apache's request limit above 2 GB? Or is it that we
>> cannot commit more than 2 GB at all because svn does not allow it?
>> Please shed some light on this issue; it is urgent.
>
> Subversion supports commits greater than 2GB over HTTP.  There are many
> things that could cause it to fail, so you need to provide more
> information.
>
> Which HTTP request is failing?
>
> The commit will use temporary disk space; do you have enough on the
> client and the server?
>
> What error message does the client give?
>
> Look at the server logs: what error message does the server give?
>
> If you are using a proxy, then look at the proxy logs: what error
> does the proxy give?
>
> --
> Philip Martin | Subversion Committer
> WANdisco // *Non-Stop Data*
>

Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin Abbas <mo...@gmail.com>.
Thanks, Branko Čibej, for your feedback.

I have checked: we are not using any proxy or load balancer for our svn,
but a firewall is involved in our scenario. The firewall does not limit
how much data a user can transfer over the network. You said something
between the client and server is closing the connection; could that be
Apache, and could this be happening due to an Apache request timeout? I
have searched the internet regarding the Apache request-body parameter,
whose maximum is 2 GB, meaning Apache can carry at most a 2 GB request.
Please correct me if I am wrong; maybe the svn commit is failing because
of this limit? Should I consider the Apache request timeout parameter in
httpd.conf? Please suggest.


Regards
Mohsin


On Sun, Sep 28, 2014 at 8:59 AM, Branko Čibej <br...@wandisco.com> wrote:

>  On 27.09.2014 09:24, Mohsin Abbas wrote:
>
>    Hi Philip Martin,
>
>  Here is the TortoiseSVN client and Apache error log portion:
>
> *Tortoise SVN Client Error :*
>
> Error: Commit failed (details follow):
> Error: PUT of
> Error:
> '/svn/test/!svn/wrk/6f036c59-c3a8-5947-91dd-bd0f1eeccd84/SomeFolder/brands.zip':
>
> Error: Could not send request body: An existing connection was forcibly
> closed by the
> Error: remote host.
> Error: (http://x.x.com)
>
> *Apache error.log portion :*
>
> [Sat Sep 27 11:21:44 2014] [error] [client x.x.x.x] mod_dav_svn
> close_stream: error closing write stream  [500, #185004]
> [Sat Sep 27 11:21:44 2014] [error] [client x.x.x.x] Unexpected end of
> svndiff input  [500, #185004]
>
>
> From the above, I'm almost certain that this is not Subversion's fault.
> Look at the error messages: the client thinks that the server closed the
> connection, but the server thinks that the client closed the connection; so
> the only reasonable explanation is that something *between* the client
> and server is in fact closing the connection.
>
> This could be a proxy, or a load balancer, or even a packet
> inspector/firewall (see http://markmail.org/message/tazxq3jexaopcdum for
> example) that's forcibly closing the connection.
>
> You'll have to talk to the server/site administrators to determine what's
> happening. There's not much anyone else can do without knowing how not
> just the server but the whole data centre is configured.
>
> -- Brane
>
>

Re: AW: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin <mo...@gmail.com>.
Hi,

After upgrading my TortoiseSVN client to version 1.8.7 on my Windows
machine, I was able to commit a folder of 4.7 GB :) There is no issue in
the TortoiseSVN client or the SVN server. Thanks to all of you for your
support.


Regards
Mohsin Abbas




Re: AW: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Les Mikesell <le...@gmail.com>.
On Wed, Oct 1, 2014 at 11:27 AM, Mohsin <mo...@gmail.com> wrote:
> Thanks,
>
>>Thanks - it would have been clearer if you had shown that svn command
>>line instead of the one that used the file:// protocol.
>
> Ignore my file:// protocol post; that was in another context. I am using
> the HTTP protocol for repository access.
>
>>First note that the svn command line client is available for Windows
>>too, and it might be worth verifying that it can succeed in exactly
>>the same circumstances where Tortoise fails.  As others have noted,
>>this mail list doesn't have much to do with the Tortoise client, so
>>you probably won't get the best advice about this problem here.  But
>>make sure you are using the latest Tortoise - if the issue is really
>>in the neon libraries, it looks like neon has been dropped in the 1.8
>>release:
>
> How can I use the svn command line on Windows? Can you advise me on
> this?

The Tortoise installer should offer to include a command line client
too, but it will probably be built with the same library as the GUI.
There are several other builds linked from:
https://subversion.apache.org/packages.html#windows
plus one from the Cygwin environment (which, being a more Linux-like
environment, may confuse your line endings if you aren't careful).
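
Once any of those is installed, it's worth re-running the exact same
commit from a plain command prompt, so you know whether the GUI is the
variable. A sketch (the path is illustrative):

  C:\> svn --version
  C:\> svn commit -m "large commit test" C:\path\to\working-copy

If the command line succeeds where the GUI fails, you've narrowed it
down to the tortoise build.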

> I am using an older TortoiseSVN client (1.6 or 1.7); maybe this older
> version is why it could not commit the larger data. I'll upgrade my
> TortoiseSVN client to the latest version and try the commit again;
> maybe with the latest version I will be able to commit the data on my
> Windows machine too. I'll update you shortly.

As a general recommendation: I usually try to update free software
before fighting bugs that might already be fixed.

-- 
   Les Mikesell
     lesmikesell@gmail.com

Re: AW: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin <mo...@gmail.com>.
Thanks,

>Thanks - it would have been clearer if you had shown that svn command
>line instead of the one that used the file:// protocol.

Ignore my file:// protocol post; that was in another context. I am using
the HTTP protocol for repository access.

>First note that the svn command line client is available for Windows
>too, and it might be worth verifying that it can succeed in exactly
>the same circumstances where Tortoise fails.  As others have noted,
>this mail list doesn't have much to do with the Tortoise client, so
>you probably won't get the best advice about this problem here.  But
>make sure you are using the latest Tortoise - if the issue is really
>in the neon libraries, it looks like neon has been dropped in the 1.8
>release:

How can I use the svn command line on Windows? Can you advise me on
this?

I am using an older TortoiseSVN client (1.6 or 1.7); maybe this older
version is why it could not commit the larger data. I'll upgrade my
TortoiseSVN client to the latest version and try the commit again;
maybe with the latest version I will be able to commit the data on my
Windows machine too. I'll update you shortly.


Mohsin




Re: AW: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Les Mikesell <le...@gmail.com>.
On Wed, Oct 1, 2014 at 10:23 AM, Mohsin <mo...@gmail.com> wrote:
> Dear,
>
>
> That's what I've been saying for the last two days: I was able to commit
> the data using the svn command line on the Linux server, but I ran into
> the issue with the TortoiseSVN client on my Windows machine.

Thanks - it would have been clearer if you had shown that svn command
line instead of the one that used the file:// protocol.

> That clearly shows the issue is with
> TortoiseSVN; I know that. But the most important thing, which I am
> eager to hear from you, is how to resolve this issue. What should I do
> to figure out this issue with the TortoiseSVN client?

First note that the svn command line client is available for Windows
too, and it might be worth verifying that it can succeed in exactly
the same circumstances where Tortoise fails.  As others have noted,
this mail list doesn't have much to do with the Tortoise client, so
you probably won't get the best advice about this problem here.  But
make sure you are using the latest Tortoise - if the issue is really
in the neon libraries, it looks like neon has been dropped in the 1.8
release:

http://subversion.apache.org/docs/release-notes/1.8.html#neon-deleted

-- 
   Les Mikesell
     lesmikesell@gmail.com

Re: AW: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin <mo...@gmail.com>.
Dear,


That's what I've been saying for the last two days: I was able to commit
the data using the svn command line on the Linux server, but I ran into
the issue with the TortoiseSVN client on my Windows machine. That clearly
shows the issue is with TortoiseSVN; I know that. But the most important
thing, which I am eager to hear from you, is how to resolve this issue.
What should I do to figure out this issue with the TortoiseSVN client?


~Mohsin




Re: AW: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Les Mikesell <le...@gmail.com>.
On Wed, Oct 1, 2014 at 7:45 AM, Mohsin <mo...@gmail.com> wrote:
> Hi,
>
> We are using the HTTP protocol for repository access in the browser, e.g.
>
> http://server/some/path
>

I'm not sure it is clear from this thread whether you have succeeded
in committing >2 GB with a command line svn client using the http
protocol (your import example showed file://).  If the command line
client still shows the problem over http, then the issue may be with
Apache on the server side.  However, if the command line svn works
with an http URL but Tortoise fails, then the issue is obviously with
the Tortoise libraries on the client side.

-- 
   Les Mikesell
      lesmikesell@gmail.com

Re: AW: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin <mo...@gmail.com>.
Hi,

We are using the HTTP protocol for repository access in the browser, e.g.

http://server/some/path


Mohsin   




AW: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Markus Schaber <m....@codesys.com>.
Hi, Mohsin,

From: Mohsin [mailto:mohsinchandia@gmail.com]

> How can I check which scheme we are using on the Linux server? Can you share a command so that I can tell you?

Check the start of the URLs you're using to access the repository:

http://server/some/path    -> http access; Apache configuration may be the problem.
https://server/some/path   -> http access with SSL protection; Apache configuration may be the problem.

svn://server/some/path...  -> svn protocol, not affected.
file:///some/path/...      -> direct file access, not affected.
svn+ssh://server/some/path -> svn protocol tunneled through ssh, not affected.

svn+XXX://server/some/path -> svn protocol tunneled through some custom proxy, unlikely to be affected unless your tunnel configuration imposes a size limit on the total amount of tunneled data.
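
If you already have a working copy checked out, svn itself will tell
you which scheme it uses; a sketch (the path is illustrative):

  $ svn info /path/to/working-copy | grep '^URL'
  URL: http://server/some/path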


Best regards

Markus Schaber



Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin <mo...@gmail.com>.
Hi,

How can I check which scheme we are using on the Linux server? Can you
share a command so that I can tell you?


Mohsin




Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Andreas Stieger <an...@gmx.de>.
> On 1 Oct 2014, at 10:15, Mohsin <mo...@gmail.com> wrote:
> 
> For accessing the repo in a browser we are using the HTTP protocol,
> while on the Linux server I think (not sure) we are using the file scheme.

If you are not sure, then go check. I gave interpretations of what the difference means. If it fails on http but works on file, you will know where to look.

Check your httpd for timeout-related settings and bugs, as previously indicated by others.
Note that the 2G serf bug fix linked to twice is not available in a public TortoiseSVN release.
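
A side-by-side test makes the comparison concrete; a sketch (paths and
URLs are illustrative):

  # direct file access, bypasses httpd entirely
  svn import -m "file test" /tmp/bigdata file:///var/svn/repo/file-test
  # the same data through httpd
  svn import -m "http test" /tmp/bigdata http://server/svn/repo/http-test

If only the second one fails, httpd or something in front of it is the
place to look.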

Andreas

Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin <mo...@gmail.com>.
Hi,

For accessing the repo in a browser we are using the HTTP protocol,
while on the Linux server I think (not sure) we are using the file
scheme. Here is the import command we used on the Linux server to
import data into the newly created repository.

FYI

svn import -m "First Import" /tmp/Test/ file:///X/Y/
Adding         /tmp/Test/Trunk
Adding         /tmp/Test/Branches
Adding         /tmp/Test/Tags

Is this the file scheme we are using? Please advise.


Regards
Mohsin




Re: [serf-dev] serf errors on responses bigger than 4GB

Posted by Mark Phippard <ma...@gmail.com>.
On Wed, Oct 1, 2014 at 11:36 AM, Philip Martin <ph...@wandisco.com>
wrote:

> Mark Phippard <ma...@gmail.com> writes:
>
> > On Wed, Oct 1, 2014 at 10:03 AM, Philip Martin
> > <philip.martin@wandisco.com> wrote:
> >>
> >> I can trigger the decompression error on a 5GB REPORT by setting
> >> http-bulk-updates=yes on the client side.
> >>
> >>
> > This does not really answer the question.
> >
> > Was your REPORT 5GB because it had a single file > 4GB or because it had
> > tens of thousands of small files?  Mike's question is about the latter.
> >
> > Does Serf only fail when decompressing a single large file, or also
> > if the entire REPORT response happens to be > 4 GB?  The latter
> > probably would be a much more common problem to run into if it can
> > happen.
>
> I don't think it makes a difference; serf will generate the error in
> both cases.
>
> Serf is decompressing the HTTP body of the REPORT.  At the Subversion
> level the body is an XML <S:update-report> but as far as serf is
> concerned it is just a block of data that has to be decompressed.  Serf
> doesn't look to see whether the uncompressed data really is XML so it
> certainly doesn't care whether there is one <S:txdelta> or many.
>


 Understood.


-- 
Thanks

Mark Phippard
http://markphip.blogspot.com/

Re: [serf-dev] serf errors on responses bigger than 4GB

Posted by Philip Martin <ph...@wandisco.com>.
Mark Phippard <ma...@gmail.com> writes:

> On Wed, Oct 1, 2014 at 10:03 AM, Philip Martin <ph...@wandisco.com>
> wrote:
>>
>> I can trigger the decompression error on a 5GB REPORT by setting
>> http-bulk-updates=yes on the client side.
>>
>>
> This does not really answer the question.
>
> Was your REPORT 5GB because it had a single file > 4GB or because it had
> tens of thousands of small files?  Mike's question is about the latter.
>
> Does Serf only fail when decompressing a single large file, or also if the
> entire REPORT response happens to be > 4 GB?  The latter probably would be
> a much more common problem to run into if it can happen.

I don't think it makes a difference; serf will generate the error in
both cases.

Serf is decompressing the HTTP body of the REPORT.  At the Subversion
level the body is an XML <S:update-report> but as far as serf is
concerned it is just a block of data that has to be decompressed.  Serf
doesn't look to see whether the uncompressed data really is XML so it
certainly doesn't care whether there is one <S:txdelta> or many.

-- 
Philip Martin | Subversion Committer
WANdisco // *Non-Stop Data*

Re: [serf-dev] serf errors on responses bigger than 4GB

Posted by Mark Phippard <ma...@gmail.com>.
On Wed, Oct 1, 2014 at 10:03 AM, Philip Martin <ph...@wandisco.com>
wrote:

> "C. Michael Pilato" <cm...@gmail.com> writes:
>
> > The log message for r2419 mentions "files" larger than 4Gb, and leads me
> > to believe that this problem only affects GETs.  But here, Philip avoids
> > the term "files" and talks about the "compressed size".  Does the bug
> > fixed in r2419 manifest on any response > 4GB, such as a bulk-mode
> > REPORT carrying a whole Subversion tree that's larger than 4GB?
>
> I can trigger the decompression error on a 5GB REPORT by setting
> http-bulk-updates=yes on the client side.
>
>
This does not really answer the question.

Was your REPORT 5GB because it had a single file > 4GB or because it had
tens of thousands of small files?  Mike's question is about the latter.

Does Serf only fail when decompressing a single large file, or also if the
entire REPORT response happens to be > 4 GB?  The latter probably would be
a much more common problem to run into if it can happen.

-- 
Thanks

Mark Phippard
http://markphip.blogspot.com/

Re: [serf-dev] serf errors on responses bigger than 4GB

Posted by "C. Michael Pilato" <cm...@gmail.com>.
On 10/01/2014 10:16 AM, Philip Martin wrote:
> Philip Martin <ph...@wandisco.com> writes:
>
>> "C. Michael Pilato" <cm...@gmail.com> writes:
>>
>>> The log message for r2419 mentions "files" larger than 4Gb, and leads me
>>> to believe that this problem only affects GETs.  But here, Philip avoids
>>> the term "files" and talks about the "compressed size".  Does the bug
>>> fixed in r2419 manifest on any response > 4GB, such as a bulk-mode
>>> REPORT carrying a whole Subversion tree that's larger than 4GB?
>> I can trigger the decompression error on a 5GB REPORT by setting
>> http-bulk-updates=yes on the client side.
> When the error is produced for a REPORT, the client has successfully
> produced a working copy containing the large file in the correct place.
> I suppose this means that the error triggers after reading/parsing the
> XML that makes up the report.
>
> When the error occurs on a GET, the large file has also been successfully
> downloaded but is left as a temporary file in .svn/tmp and not installed
> in the pristine store or the working copy.
>

Thanks, Philip!

Re: [serf-dev] serf errors on responses bigger than 4GB

Posted by Philip Martin <ph...@codematters.co.uk>.
Philip Martin <ph...@wandisco.com> writes:

> "C. Michael Pilato" <cm...@gmail.com> writes:
>
>> The log message for r2419 mentions "files" larger than 4Gb, and leads me
>> to believe that this problem only affects GETs.  But here, Philip avoids
>> the term "files" and talks about the "compressed size".  Does the bug
>> fixed in r2419 manifest on any response > 4GB, such as a bulk-mode
>> REPORT carrying a whole Subversion tree that's larger than 4GB?
>
> I can trigger the decompression error on a 5GB REPORT by setting
> http-bulk-updates=yes on the client side.

When the error is produced for a REPORT, the client has successfully
produced a working copy containing the large file in the correct place.
I suppose this means that the error triggers after reading/parsing the
XML that makes up the report.

When the error occurs on a GET, the large file has also been successfully
downloaded but is left as a temporary file in .svn/tmp and not installed
in the pristine store or the working copy.

-- 
Philip

Re: [serf-dev] serf errors on responses bigger than 4GB

Posted by Philip Martin <ph...@wandisco.com>.
"C. Michael Pilato" <cm...@gmail.com> writes:

> The log message for r2419 mentions "files" larger than 4Gb, and leads me
> to believe that this problem only affects GETs.  But here, Philip avoids
> the term "files" and talks about the "compressed size".  Does the bug
> fixed in r2419 manifest on any response > 4GB, such as a bulk-mode
> REPORT carrying a whole Subversion tree that's larger than 4GB?

I can trigger the decompression error on a 5GB REPORT by setting
http-bulk-updates=yes on the client side.
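
For anyone reproducing this: the option lives in the client's runtime
configuration. A sketch of the relevant lines in ~/.subversion/servers
(the [global] section applies to all server groups):

  [global]
  http-bulk-updates = yes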

-- 
Philip Martin | Subversion Committer
WANdisco // *Non-Stop Data*

Re: [serf-dev] serf errors on responses bigger than 4GB

Posted by "C. Michael Pilato" <cm...@gmail.com>.
On 10/01/2014 06:48 AM, Philip Martin wrote:
> Andreas Stieger <an...@gmx.de> writes:
>
>> I
>> will once again point to the serf issues below and httpd/network config.
>> https://code.google.com/p/serf/issues/detail?id=152
>> https://code.google.com/p/serf/source/detail?r=2419
> Andreas identified a bug in serf that causes decompression to fail when
> the compressed size is bigger than 4GB. This bug has been fixed on trunk
> but not in any release.  This bug does not affect commit but does affect
> checkout/update.

The log message for r2419 mentions "files" larger than 4Gb, and leads me
to believe that this problem only affects GETs.  But here, Philip avoids
the term "files" and talks about the "compressed size".  Does the bug
fixed in r2419 manifest on any response > 4GB, such as a bulk-mode
REPORT carrying a whole Subversion tree that's larger than 4GB?

Re: [serf-dev] serf errors on responses bigger than 4GB

Posted by Lieven Govaerts <lg...@mobsol.be>.
Hi,

On Wed, Oct 1, 2014 at 12:48 PM, Philip Martin
<ph...@wandisco.com> wrote:
> Andreas Stieger <an...@gmx.de> writes:
>
>> I
>> will once again point to the serf issues below and httpd/network config.
>> https://code.google.com/p/serf/issues/detail?id=152
>> https://code.google.com/p/serf/source/detail?r=2419
>
> Andreas identified a bug in serf that causes decompression to fail when
> the compressed size is bigger than 4GB. This bug has been fixed on trunk
> but not in any release.  This bug does not affect commit but does affect
> checkout/update.
>
> In my testing a commit of a 5GB /dev/urandom file over HTTP using serf
> 1.3.x works with compression both disabled and enabled.  A checkout over
> HTTP using serf 1.3.x fails:
>
>   svn: E120104: ra_serf: An error occurred during decompression
>
> I also tried the checkout with compression disabled by the client and
> saw the error:
>
>   svn: E120106: ra_serf: The server sent a truncated HTTP response body.
>
> but this turned out to be the known mod_deflate memory leak causing the
> server to abort.  With compression disabled on the server the
> uncompressed checkout works.
>
> Doing a search I see users reporting both the above serf errors.  The
> way to fix the decompression error is to disable compression.  This can
> be done on the client if the server is a recent 2.4, as that is not
> affected by the mod_deflate bug.  If the server is older, then a client
> disabling compression will probably cause the truncated error, and the
> fix is to disable mod_deflate on the server or to revert to a 1.7/neon
> client.
>
> I merged r2419 to my 1.3.x build and it fixes the compressed checkout.
> Are there any plans for a serf release that includes this fix?

I've learned from earlier releases that (most) packagers won't upgrade
serf unless there's an svn release.

As a result, I plan a serf (patch) release right before an svn (patch)
release, but not earlier.

regards,

Lieven

> -
> Philip Martin | Subversion Committer
> WANdisco // *Non-Stop Data*
>

serf errors on responses bigger than 4GB

Posted by Philip Martin <ph...@wandisco.com>.
Andreas Stieger <an...@gmx.de> writes:

> I
> will once again point to the serf issues below and httpd/network config.
> https://code.google.com/p/serf/issues/detail?id=152
> https://code.google.com/p/serf/source/detail?r=2419

Andreas identified a bug in serf that causes decompression to fail when
the compressed size is bigger than 4GB. This bug has been fixed on trunk
but not in any release.  This bug does not affect commit but does affect
checkout/update.

In my testing a commit of a 5GB /dev/urandom file over HTTP using serf
1.3.x works with compression both disabled and enabled.  A checkout over
HTTP using serf 1.3.x fails:

  svn: E120104: ra_serf: An error occurred during decompression

I also tried the checkout with compression disabled by the client and
saw the error:

  svn: E120106: ra_serf: The server sent a truncated HTTP response body.

but this turned out to be the known mod_deflate memory leak causing the
server to abort.  With compression disabled on the server the
uncompressed checkout works.

Doing a search I see users reporting both the above serf errors.  The
way to fix the decompression error is to disable compression.  This can
be done on the client if the server is a recent 2.4, as that is not
affected by the mod_deflate bug.  If the server is older, then a client
disabling compression will probably cause the truncated error, and the
fix is to disable mod_deflate on the server or to revert to a 1.7/neon
client.
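
Disabling it on the client is a one-line change in the runtime
configuration; a sketch (~/.subversion/servers on Unix-like systems):

  [global]
  http-compression = no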

I merged r2419 to my 1.3.x build and it fixes the compressed checkout.
Are there any plans for a serf release that includes this fix?

--
Philip Martin | Subversion Committer
WANdisco // *Non-Stop Data*

Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Andreas Stieger <an...@gmx.de>.
Hi,

On 30/09/14 19:24, Mohsin Abbas wrote:
> Just for your information, we have SVN installed on a Linux machine,
> and I committed a folder of 4.7 GB on the Linux server (I am the SVN
> administrator). I checked out that particular repository on the Linux
> server and committed that folder successfully.

Which URL scheme were you using locally on GNU/Linux? If it was file, I
will once again point to the serf issues below and to httpd/network config.
https://code.google.com/p/serf/issues/detail?id=152
https://code.google.com/p/serf/source/detail?r=2419

Andreas


Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin Abbas <mo...@gmail.com>.
Brane,

Just for your information, we have SVN installed on a Linux machine, and
I committed a folder of 4.7 GB on the Linux server (I am the SVN
administrator). I checked out that particular repository on the Linux
server and committed that folder successfully. But all our users have
Windows machines, so they use TortoiseSVN as the client for svn checkout,
commit, and update. I use a Windows machine and TortoiseSVN too for svn
operations like commit and update. The point is that I can commit on the
Linux server as an administrator, but how are Windows users supposed to
commit on their machines when such a heavy commit is not possible for a
normal user with the TortoiseSVN client? Perhaps we should consider some
other tool that provides this facility. Are there other svn clients for
Windows that would work with our SVN server?

Surely, I will report a bug for this issue.


Regards
Mohsin

On Mon, Sep 29, 2014 at 6:13 PM, Branko Čibej <br...@wandisco.com> wrote:

>  On 29.09.2014 19:45, Mohsin Abbas wrote:
>
>  Hi,
>
> I have successfully committed a folder of 4.7 GB to the SVN repository
> using the svn CLI (svn commit command). But when I try to commit the
> folder using the TortoiseSVN client, the commit fails. That suggests
> there is a limitation in the TortoiseSVN client for commits. Can anyone
> confirm this? There is no issue in svn at all, but maybe in the client.
>
>
> The last sentence is both confusing and important: "the svn client" is the
> command-line tool you get when installing a stock Subversion package, and
> that's what this project supports. TortoiseSVN is *not* supported by us,
> it's a separate project; I don't think more than a couple of Subversion
> committers have ever seen the TSVN source code.
>
> So it turns out we've been going back and forth about this issue for 4
> days, because we did not know a very important piece of information: you
> did not use Subversion to perform the commit, you used TortoiseSVN ...
>
> Given that we do know now that the problem is most likely in TSVN, I
> suggest you report it there; I suggest you start by reading
>
>     http://tortoisesvn.net/reportbug.html
>
> which describes the steps you need to take to create a bug report that's
> most likely to help resolve the issue.
>
> -- Brane
>
>

Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Branko Čibej <br...@wandisco.com>.
On 29.09.2014 19:45, Mohsin Abbas wrote:
> Hi,
>
> I have successfully committed a folder of 4.7 GB to the SVN repository
> using the svn CLI (svn commit command). But when I try to commit the
> folder using the TortoiseSVN client, the commit fails. That suggests
> there is a limitation in the TortoiseSVN client for commits. Can anyone
> confirm this? There is no issue in svn at all, but maybe in the client.

The last sentence is both confusing and important: "the svn client" is
the command-line tool you get when installing a stock Subversion
package, and that's what this project supports. TortoiseSVN is *not*
supported by us, it's a separate project; I don't think more than a
couple of Subversion committers have ever seen the TSVN source code.

So it turns out we've been going back and forth about this issue for 4
days, because we did not know a very important piece of information: you
did not use Subversion to perform the commit, you used TortoiseSVN ...

Given that we do know now that the problem is most likely in TSVN, I
suggest you report it there; I suggest you start by reading

    http://tortoisesvn.net/reportbug.html

which describes the steps you need to take to create a bug report that's
most likely to help resolve the issue.

-- Brane


Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin Abbas <mo...@gmail.com>.
Hi,

I have successfully committed a folder of 4.7 GB to the SVN repository
using the svn CLI (svn commit command). But when I try to commit the
folder using the TortoiseSVN client, the commit fails. That suggests
there is a limitation in the TortoiseSVN client for commits. Can anyone
confirm this? There is no issue in svn at all, but maybe in the client.

Regards
Mohsin

On Mon, Sep 29, 2014 at 10:01 AM, Philip Martin <ph...@wandisco.com>
wrote:

> Branko Čibej <br...@wandisco.com> writes:
>
> >> [Sat Sep 27 11:21:44 2014] [error] [client x.x.x.x] mod_dav_svn
> >> close_stream: error closing write stream  [500, #185004]
> >> [Sat Sep 27 11:21:44 2014] [error] [client x.x.x.x] Unexpected end of
> >> svndiff input  [500, #185004]
> >
> > From the above, I'm almost certain that this is not Subversion's fault.
> > Look at the error messages: the client thinks that the server closed the
> > connection, but the server thinks that the client closed the connection;
> > so the only reasonable explanation is that something *between* the
> > client and server is in fact closing the connection.
>
> Error 185004 is SVN_ERR_SVNDIFF_UNEXPECTED_END and that could be caused
> by some proxy/router closing the connection.  But it could also be
> caused by a Subversion bug that causes the client to write invalid
> svndiff data.  It's not trivial to distinguish one from the other; we
> would need access to all the commit data to decode the svndiff.  Another
> way would be to attempt the commit from the server itself over HTTP to
> localhost; if this works, the problem is most likely the network.
>
> --
> Philip Martin | Subversion Committer
> WANdisco // *Non-Stop Data*
>

Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Philip Martin <ph...@wandisco.com>.
Branko Čibej <br...@wandisco.com> writes:

>> [Sat Sep 27 11:21:44 2014] [error] [client x.x.x.x] mod_dav_svn
>> close_stream: error closing write stream  [500, #185004]
>> [Sat Sep 27 11:21:44 2014] [error] [client x.x.x.x] Unexpected end of
>> svndiff input  [500, #185004]
>
> From the above, I'm almost certain that this is not Subversion's fault.
> Look at the error messages: the client thinks that the server closed the
> connection, but the server thinks that the client closed the connection;
> so the only reasonable explanation is that something *between* the
> client and server is in fact closing the connection.

Error 185004 is SVN_ERR_SVNDIFF_UNEXPECTED_END and that could be caused
by some proxy/router closing the connection.  But it could also be
caused by a Subversion bug that causes the client to write invalid
svndiff data.  It's not trivial to distinguish one from the other; we
would need access to all the commit data to decode the svndiff.  Another
way would be to attempt the commit from the server itself over HTTP to
localhost; if this works, the problem is most likely the network.
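
A sketch of that localhost test, run on the server itself (paths and
the repository URL are illustrative):

  svn checkout http://localhost/svn/test /tmp/wc
  cp /path/to/large-file.zip /tmp/wc/
  svn add /tmp/wc/large-file.zip
  svn commit -m "large commit test over local HTTP" /tmp/wc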

-- 
Philip Martin | Subversion Committer
WANdisco // *Non-Stop Data*

Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Branko Čibej <br...@wandisco.com>.
On 27.09.2014 09:24, Mohsin Abbas wrote:
> Hi Philip Martin,
>
> Here is the TortoiseSVN client and Apache error log portion:
>
> *Tortoise SVN Client Error :*
>
> Error: Commit failed (details follow):
> Error: PUT of
> Error:
> '/svn/test/!svn/wrk/6f036c59-c3a8-5947-91dd-bd0f1eeccd84/SomeFolder/brands.zip':
>
> Error: Could not send request body: An existing connection was
> forcibly closed by the
> Error: remote host.
> Error: (http://x.x.com)
>
> *Apache error.log portion :*
>
> [Sat Sep 27 11:21:44 2014] [error] [client x.x.x.x] mod_dav_svn
> close_stream: error closing write stream  [500, #185004]
> [Sat Sep 27 11:21:44 2014] [error] [client x.x.x.x] Unexpected end of
> svndiff input  [500, #185004]

From the above, I'm almost certain that this is not Subversion's fault.
Look at the error messages: the client thinks that the server closed the
connection, but the server thinks that the client closed the connection;
so the only reasonable explanation is that something *between* the
client and server is in fact closing the connection.

This could be a proxy, or a load balancer, or even a packet
inspector/firewall (see http://markmail.org/message/tazxq3jexaopcdum for
example) that's forcibly closing the connection.
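
One way to see which side really closes the TCP connection is to run a
packet capture at both ends while the commit is in progress; a sketch
(the hostname is illustrative):

  tcpdump -i any -w commit.pcap host svn.example.com and port 80

If each capture shows a FIN or RST that the local machine did not
send, something in the middle injected it.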

You'll have to talk to the server/site administrators to determine
what's happening. There's not much anyone else can do without knowing
how not just the server but the whole data centre is configured.

-- Brane


Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Mohsin Abbas <mo...@gmail.com>.
Hi Philip Martin,

Here is the TortoiseSVN client and Apache error log portion:

*Tortoise SVN Client Error :*

Error: Commit failed (details follow):
Error: PUT of
Error:
'/svn/test/!svn/wrk/6f036c59-c3a8-5947-91dd-bd0f1eeccd84/SomeFolder/brands.zip':

Error: Could not send request body: An existing connection was forcibly
closed by the
Error: remote host.
Error: (http://x.x.com)

*Apache error.log portion :*

[Sat Sep 27 11:21:44 2014] [error] [client x.x.x.x] mod_dav_svn
close_stream: error closing write stream  [500, #185004]
[Sat Sep 27 11:21:44 2014] [error] [client x.x.x.x] Unexpected end of
svndiff input  [500, #185004]

Please let me know how I can figure out this issue.

FYI: the folder I am committing is 4.73 GB.


Regards
Mohsin



On Fri, Sep 26, 2014 at 9:04 AM, Philip Martin <ph...@wandisco.com>
wrote:

> Mohsin Abbas <mo...@gmail.com> writes:
>
> > I am using a Subversion 1.8.9 server on Linux and the TortoiseSVN
> > client on Windows. When I try to commit data larger than 2 GB, the
> > commit fails. I searched various websites, and they suggest the
> > following solution:
> >
> > 1: Set LimitRequestBody to 0 in the server-side httpd.conf file.
> >
> > But the Apache web server (Apache 2.2) allows at most 2 GB of data
> > in an HTTP request. So how can I commit files larger than 2 GB? How
> > can I raise Apache's request limit above 2 GB? Or is it that we
> > cannot commit more than 2 GB at all because svn does not allow it?
> > Please shed some light on this issue; it is urgent.
>
> Subversion supports commits greater than 2GB over HTTP.  There are many
> things that could cause it to fail, so you need to provide more
> information.
>
> Which HTTP request is failing?
>
> The commit will use temporary disk space; do you have enough on the
> client and the server?
>
> What error message does the client give?
>
> Look at the server logs: what error message does the server give?
>
> If you are using a proxy, then look at the proxy logs: what error
> does the proxy give?
>
> --
> Philip Martin | Subversion Committer
> WANdisco // *Non-Stop Data*
>

Re: SVN Commit Failed For Data larger Than 2 GB [How To Resolve]

Posted by Philip Martin <ph...@wandisco.com>.
Mohsin Abbas <mo...@gmail.com> writes:

> I am using a Subversion 1.8.9 server on Linux and the TortoiseSVN
> client on Windows. When I try to commit data larger than 2 GB, the
> commit fails. I searched various websites, and they suggest the
> following solution:
>
> 1: Set LimitRequestBody to 0 in the server-side httpd.conf file.
>
> But the Apache web server (Apache 2.2) allows at most 2 GB of data
> in an HTTP request. So how can I commit files larger than 2 GB? How
> can I raise Apache's request limit above 2 GB? Or is it that we
> cannot commit more than 2 GB at all because svn does not allow it?
> Please shed some light on this issue; it is urgent.

Subversion supports commits greater than 2GB over HTTP.  There are many
things that could cause it to fail, so you need to provide more
information.
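
For reference, LimitRequestBody 0 means "unlimited" in httpd, so even
a 2.2 server need not cap requests at 2GB. A minimal httpd.conf sketch
(the paths are illustrative, not your actual configuration):

  <Location /svn>
    DAV svn
    SVNParentPath /var/svn
    # 0 removes the request-body cap entirely
    LimitRequestBody 0
  </Location>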

Which HTTP request is failing?

The commit will use temporary disk space; do you have enough on the
client and the server?

What error message does the client give?

Look at the server logs: what error message does the server give?

If you are using a proxy, then look at the proxy logs: what error
does the proxy give?

-- 
Philip Martin | Subversion Committer
WANdisco // *Non-Stop Data*