Posted to users@httpd.apache.org by Randy Grafton <rg...@indatacorp.com> on 2004/01/29 22:22:50 UTC

[users@httpd] Facilitating File Downloads...

We want to make some files available for download through http. We reference
the files with full paths for the href value in an anchor tag. The files
range in size up to 150MB. We seem to be having a problem with the larger
files becoming corrupted. The large files are self-extracting installers for
demos of our products. The corruption of these files results in the
self-extractor/installer files crashing on the client's computer after
download from our Apache 2.0.48 server (running on FreeBSD 4.8). Is there
something I should configure in my httpd.conf file to accommodate http
downloads? Should I look to an alternative such as storing the file in a
MySQL database and handing it out from there with PHP scripts?
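
For reference, the links are just plain anchors along these lines (the host
and file name are placeholders):

    <a href="http://apache.server/dir/file.exe">Download the demo installer</a>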

Thanks for the suggestions,
-Randy


---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org


RE: [users@httpd] Facilitating File Downloads...

Posted by Joshua Slive <jo...@slive.ca>.
On Thu, 29 Jan 2004, Randy Grafton wrote:
> The file is usually good for several weeks and then we'll get a call from a
> customer complaining that their install didn't work after repeated
> downloads. We'll go grab the file off of the server through ssh/sftp and
> sure enough the file is bad. At this point we'll make a new copy on the
> server and wait for the next customer to call.
>

Whoa... You're saying the file is corrupted on the server?  There is no way
that Apache is doing that.  Apache will *never* write to a file that it is
serving, unless you have some CGI/PHP/Perl code that is doing that.

Your problem very likely has absolutely nothing to do with the web server
itself.  Check your hardware/OS config/etc.

Joshua.

---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org


Re: [users@httpd] Facilitating File Downloads...

Posted by Cameron Schaus <ca...@schaus.ca>.
On Fri, Jan 30, 2004 at 11:10:41AM -0700, Randy Grafton wrote:
> Thanks for the tips. I actually stopped listening to my IIS-only guys and
> grabbed the 'corrupted' file myself. I got 2 copies of the file off of the
> FreeBSD/Apache server, one by sftp and the other by http. I ran diff
> --minimal file.good file.bad. The files are the same. In fact, the sftp file
> worked without issue. The http download of the same file did have a problem,
> but when I renamed it and ran it again, it worked. Any ideas?

Is the file you are making available for download a .tar.gz file, by
chance?  Some browsers will save a .tar.gz file with a .tar extension
without actually having decompressed it, which creates confusion.
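
A quick way to see what the browser actually saved (the file name here is
just an example):

    file downloaded-demo.tar
    gzip -t downloaded-demo.tar && echo "still gzip-compressed"

file(1) reports the actual format regardless of the extension, and gzip -t
only succeeds if the data really is still gzip-compressed.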

But without knowing exactly what you did (how you downloaded it, what you
renamed it from and to, and how you ran it), I can only guess.

Cam


---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org


Re: [users@httpd] Facilitating File Downloads...

Posted by Brian Dessent <br...@dessent.net>.
Randy Grafton wrote:
> 
> Thanks for the tips. I actually stopped listening to my IIS-only guys and
> grabbed the 'corrupted' file myself. I got 2 copies of the file off of the
> FreeBSD/Apache server, one by sftp and the other by http. I ran diff
> --minimal file.good file.bad. The files are the same. In fact, the sftp file
> worked without issue. The http download of the same file did have a problem,
> but when I renamed it and ran it again, it worked. Any ideas?

Check the Content-Type that the server is setting for the download. 
Hopefully, it's "application/octet-stream" or possibly "text/plain". 
You can use a number of techniques to check this... My preference is
"curl -I http://your.url.here" which will show you the server's response
headers.  The most definitive way is to run Ethereal and capture the
packets when you do the actual download from your regular browser. 
(Hint: right click on one of the relevant packets and choose 'Follow TCP
Stream' and you'll get a good look at the client-server conversation,
including the 'Content-Type:' header.)
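
For example, with the URL filled in (the host and file name below are just
placeholders), something like

    curl -I http://apache.server/dir/file.exe

will print the response headers, and the interesting lines should look
roughly like this (the values will of course differ on your server):

    HTTP/1.1 200 OK
    Content-Length: 157286400
    Content-Type: application/octet-stream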

If the server is sending the file with an odd content-type, then the
browser may do weird things.  The first thing you need to determine is
whether the HTTP-downloaded file is exactly the same, byte-wise.  I'd advise
you to use the md5sum tool to generate an MD5 hash of the file as
downloaded by your browser, and compare that with a known-good copy.
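
That comparison is a one-liner (the file names are placeholders):

    md5sum file-from-browser.exe file-known-good.exe

If the two hashes printed are identical, the bytes are identical.  (On
FreeBSD the bundled tool is md5 rather than md5sum; either one will do.)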

If the file contents are exactly the same but the browser is naming it
something odd, then that's another issue.  But usually, those types of
problems only occur when you're doing a file download from a script
(such as an attachment on a PHP message board).  If this is a plain old
HTTP URL to the file itself and there's no scripting or anything, then
the browser should be saving it with the same name as the filename in
the URL.  Assuming, of course, that the content-type is sane.
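
If the type does turn out to be something strange for the .exe files, a
couple of lines in httpd.conf will pin it down.  A minimal sketch, assuming
the installers live under a /downloads directory (adjust the path to your
layout):

    <Directory "/usr/local/www/downloads">
        AddType application/octet-stream .exe
    </Directory>

AddType just maps the .exe extension to application/octet-stream for files
served from that directory.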

Brian

---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org


RE: [users@httpd] Facilitating File Downloads...

Posted by Randy Grafton <rg...@indatacorp.com>.
Thanks for the tips. I actually stopped listening to my IIS-only guys and
grabbed the 'corrupted' file myself. I got 2 copies of the file off of the
FreeBSD/Apache server, one by sftp and the other by http. I ran diff
--minimal file.good file.bad. The files are the same. In fact, the sftp file
worked without issue. The http download of the same file did have a problem,
but when I renamed it and ran it again, it worked. Any ideas?
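
For the record, the exact comparison was:

    diff --minimal file.good file.bad

(cmp file.good file.bad would be an equivalent byte-level check; it prints
nothing at all when the files are identical.)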

Thank You,
-Randy

-----Original Message-----
From: Don [mailto:dnrlinux@san.rr.com] 
Sent: Thursday, January 29, 2004 4:38 PM
To: users@httpd.apache.org
Subject: RE: [users@httpd] Facilitating File Downloads...


I'm not an ASP guy myself, so I can't answer that part of your question...

It's awfully strange that the file goes bad after a few weeks. From what
you describe, it doesn't sound like an http/ftp problem at all. It sounds
like you have some problem with the original file and the server it lives
on. Could somebody be modifying it?

You say you "make a new copy and put it on the server" and then it's
fine.... That means the original file changed somehow. I think you need to
track down where/when the original file "goes bad". Sounds like something
has write access to that file that shouldn't.

Or, perhaps you have some "disk maintenance program" that's corrupting the
file? When you discover the file is bad, do you run a "diff" on it to see
what's changed from your "new copy"?

Maybe set up a copy on another (local) server and use rsync or something to
get notified when the file changes....

Don



-----Original Message-----
From: Randy Grafton [mailto:rgrafton@indatacorp.com]
Sent: Thursday, January 29, 2004 1:49 PM
To: users@httpd.apache.org
Subject: RE: [users@httpd] Facilitating File Downloads...


We actually do have ftp access to these files. The http effort was to reach
those customers behind firewalls with ftp blocked. I guess I should also
give a little more info about the setup here: the website is on Win2k with
IIS; it is an ASP-generated page with links to files stored on our
FreeBSD/Apache server. The links are performed with: Response.Redirect
"http://apache.server/dir/file.exe"

I'm not familiar with ASP; should we be doing the <a
href='http://apache.server/dir/file.exe'>Get File</a> instead?

The file is usually good for several weeks and then we'll get a call from a
customer complaining that their install didn't work after repeated
downloads. We'll go grab the file off of the server through ssh/sftp and
sure enough the file is bad. At this point we'll make a new copy on the
server and wait for the next customer to call.

150MB isn't that big, relatively speaking. I saw some posts about similar
problems (more speed-related), and the admins downgraded to the current
1.3.x release. I've not tried that yet. I assume there are major differences
in file handling between the two? I also don't want to overlook the
httpd.conf file. It is pretty plain vanilla (no optimizations that I know of
for this specific use). This server is dedicated to ftp and http downloads
of files only.

Thank You,
-Randy

-----Original Message-----
From: Don [mailto:dnrlinux@san.rr.com]
Sent: Thursday, January 29, 2004 2:25 PM
To: users@httpd.apache.org
Subject: RE: [users@httpd] Facilitating File Downloads...


Does the file have to be available via http? Can you make the link on the
web page an ftp url?

<a href="ftp://blah.blah...">Click here to download by ftp</a>

Of course your ftp has to support PASV or else Mozilla browsers will choke
on it. (Mozilla 1.6 bug 232088)



-----Original Message-----
From: Randy Grafton [mailto:rgrafton@indatacorp.com]
Sent: Thursday, January 29, 2004 1:23 PM
To: users@httpd.apache.org
Subject: [users@httpd] Facilitating File Downloads...


We want to make some files available for download through http. We reference
the files with full paths for the href value in an anchor tag. The files
range in size up to 150MB. We seem to be having a problem with the larger
files becoming corrupted. The large files are self-extracting installers for
demos of our products. The corruption of these files results in the
self-extractor/installer files crashing on the client's computer after
download from our Apache 2.0.48 server (running on FreeBSD 4.8). Is there
something I should configure in my httpd.conf file to accommodate http
downloads? Should I look to an alternative such as storing the file in a
MySQL database and handing it out from there with PHP scripts?

Thanks for the suggestions,
-Randy


---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org


RE: [users@httpd] Facilitating File Downloads...

Posted by Don <dn...@san.rr.com>.
I'm not an ASP guy myself, so I can't answer that part of your question...

It's awfully strange that the file goes bad after a few weeks. From what
you describe, it doesn't sound like an http/ftp problem at all. It sounds
like you have some problem with the original file and the server it lives
on. Could somebody be modifying it?

You say you "make a new copy and put it on the server" and then it's
fine.... That means the original file changed somehow. I think you need to
track down where/when the original file "goes bad". Sounds like something
has write access to that file that shouldn't.
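
A quick look at the file's permissions and timestamps should tell you
whether that's plausible (the path here is made up):

    ls -l /usr/local/www/downloads/file.exe

If the modification time changes between the upload of a known-good copy and
the first bad download, something really is rewriting the file.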

Or, perhaps you have some "disk maintenance program" that's corrupting the
file? When you discover the file is bad, do you run a "diff" on it to see
what's changed from your "new copy"?

Maybe set up a copy on another (local) server and use rsync or something to
get notified when the file changes....
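
Even a small cron job that compares a checksum against a saved baseline
would do it.  A rough sketch (the paths, file name, and mail address are
made up; FreeBSD ships md5 rather than md5sum):

    #!/bin/sh
    # recompute the checksum of the download and compare to a stored baseline
    FILE=/usr/local/www/downloads/file.exe
    BASELINE=/var/tmp/file.exe.md5
    CURRENT=`md5 "$FILE" | awk '{print $NF}'`
    if [ ! -f "$BASELINE" ]; then
        # first run: record the current checksum as the baseline
        echo "$CURRENT" > "$BASELINE"
    elif [ "$CURRENT" != "`cat $BASELINE`" ]; then
        echo "$FILE checksum changed" | mail -s "download file changed" you@example.com
    fi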

Don



-----Original Message-----
From: Randy Grafton [mailto:rgrafton@indatacorp.com]
Sent: Thursday, January 29, 2004 1:49 PM
To: users@httpd.apache.org
Subject: RE: [users@httpd] Facilitating File Downloads...


We actually do have ftp access to these files. The http effort was to reach
those customers behind firewalls with ftp blocked. I guess I should also
give a little more info about the setup here: the website is on Win2k with
IIS; it is an ASP-generated page with links to files stored on our
FreeBSD/Apache server. The links are performed with: Response.Redirect
"http://apache.server/dir/file.exe"

I'm not familiar with ASP; should we be doing the <a
href='http://apache.server/dir/file.exe'>Get File</a> instead?

The file is usually good for several weeks and then we'll get a call from a
customer complaining that their install didn't work after repeated
downloads. We'll go grab the file off of the server through ssh/sftp and
sure enough the file is bad. At this point we'll make a new copy on the
server and wait for the next customer to call.

150MB isn't that big, relatively speaking. I saw some posts about similar
problems (more speed-related), and the admins downgraded to the current
1.3.x release. I've not tried that yet. I assume there are major differences
in file handling between the two? I also don't want to overlook the
httpd.conf file. It is pretty plain vanilla (no optimizations that I know of
for this specific use). This server is dedicated to ftp and http downloads
of files only.

Thank You,
-Randy

-----Original Message-----
From: Don [mailto:dnrlinux@san.rr.com]
Sent: Thursday, January 29, 2004 2:25 PM
To: users@httpd.apache.org
Subject: RE: [users@httpd] Facilitating File Downloads...


Does the file have to be available via http? Can you make the link on the
web page an ftp url?

<a href="ftp://blah.blah...">Click here to download by ftp</a>

Of course your ftp has to support PASV or else Mozilla browsers will choke
on it. (Mozilla 1.6 bug 232088)



-----Original Message-----
From: Randy Grafton [mailto:rgrafton@indatacorp.com]
Sent: Thursday, January 29, 2004 1:23 PM
To: users@httpd.apache.org
Subject: [users@httpd] Facilitating File Downloads...


We want to make some files available for download through http. We reference
the files with full paths for the href value in an anchor tag. The files
range in size up to 150MB. We seem to be having a problem with the larger
files becoming corrupted. The large files are self-extracting installers for
demos of our products. The corruption of these files results in the
self-extractor/installer files crashing on the client's computer after
download from our Apache 2.0.48 server (running on FreeBSD 4.8). Is there
something I should configure in my httpd.conf file to accommodate http
downloads? Should I look to an alternative such as storing the file in a
MySQL database and handing it out from there with PHP scripts?

Thanks for the suggestions,
-Randy


---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org


RE: [users@httpd] Facilitating File Downloads...

Posted by Randy Grafton <rg...@indatacorp.com>.
We actually do have ftp access to these files. The http effort was to reach
those customers behind firewalls with ftp blocked. I guess I should also
give a little more info about the setup here: the website is on Win2k with
IIS; it is an ASP-generated page with links to files stored on our
FreeBSD/Apache server. The links are performed with: Response.Redirect
"http://apache.server/dir/file.exe"

I'm not familiar with ASP; should we be doing the <a
href='http://apache.server/dir/file.exe'>Get File</a> instead?

The file is usually good for several weeks and then we'll get a call from a
customer complaining that their install didn't work after repeated
downloads. We'll go grab the file off of the server through ssh/sftp and
sure enough the file is bad. At this point we'll make a new copy on the
server and wait for the next customer to call.

150MB isn't that big, relatively speaking. I saw some posts about similar
problems (more speed-related), and the admins downgraded to the current
1.3.x release. I've not tried that yet. I assume there are major differences
in file handling between the two? I also don't want to overlook the
httpd.conf file. It is pretty plain vanilla (no optimizations that I know of
for this specific use). This server is dedicated to ftp and http downloads
of files only.

Thank You,
-Randy

-----Original Message-----
From: Don [mailto:dnrlinux@san.rr.com] 
Sent: Thursday, January 29, 2004 2:25 PM
To: users@httpd.apache.org
Subject: RE: [users@httpd] Facilitating File Downloads...


Does the file have to be available via http? Can you make the link on the
web page an ftp url?

<a href="ftp://blah.blah...">Click here to download by ftp</a>

Of course your ftp has to support PASV or else Mozilla browsers will choke
on it. (Mozilla 1.6 bug 232088)



-----Original Message-----
From: Randy Grafton [mailto:rgrafton@indatacorp.com]
Sent: Thursday, January 29, 2004 1:23 PM
To: users@httpd.apache.org
Subject: [users@httpd] Facilitating File Downloads...


We want to make some files available for download through http. We reference
the files with full paths for the href value in an anchor tag. The files
range in size up to 150MB. We seem to be having a problem with the larger
files becoming corrupted. The large files are self-extracting installers for
demos of our products. The corruption of these files results in the
self-extractor/installer files crashing on the client's computer after
download from our Apache 2.0.48 server (running on FreeBSD 4.8). Is there
something I should configure in my httpd.conf file to accommodate http
downloads? Should I look to an alternative such as storing the file in a
MySQL database and handing it out from there with PHP scripts?

Thanks for the suggestions,
-Randy


---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org


RE: [users@httpd] Facilitating File Downloads...

Posted by Don <dn...@san.rr.com>.
Does the file have to be available via http? Can you make the link on the
web page an ftp url?

<a href="ftp://blah.blah...">Click here to download by ftp</a>

Of course your ftp has to support PASV or else Mozilla browsers will choke
on it. (Mozilla 1.6 bug 232088)
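
A quick way to confirm that passive transfers work from the outside is curl,
which uses PASV for ftp URLs by default (the host and path below are
placeholders):

    curl -O ftp://apache.server/dir/file.exe

If that retrieves the file, passive mode is fine.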



-----Original Message-----
From: Randy Grafton [mailto:rgrafton@indatacorp.com]
Sent: Thursday, January 29, 2004 1:23 PM
To: users@httpd.apache.org
Subject: [users@httpd] Facilitating File Downloads...


We want to make some files available for download through http. We reference
the files with full paths for the href value in an anchor tag. The files
range in size up to 150MB. We seem to be having a problem with the larger
files becoming corrupted. The large files are self-extracting installers for
demos of our products. The corruption of these files results in the
self-extractor/installer files crashing on the client's computer after
download from our Apache 2.0.48 server (running on FreeBSD 4.8). Is there
something I should configure in my httpd.conf file to accommodate http
downloads? Should I look to an alternative such as storing the file in a
MySQL database and handing it out from there with PHP scripts?

Thanks for the suggestions,
-Randy


---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org