[H-GEN] Apache httpd.conf

David Jericho david.jericho at aarnet.edu.au
Thu Jul 7 01:25:43 EDT 2005


David Harrison wrote:

> You can recompile Apache v1.3.x with -D_LARGEFILE_SOURCE
> -D_FILE_OFFSET_BITS=64 and it will let you serve the files, but the
> Content-Length will be unknown (at least it was the last version of
> 1.3.x I tried - not sure if this has been fixed in recent versions).

It is possible, but you can break the occasional module.

> If you're using Apache v2.x, I'm pretty sure that as of the latest
> version it properly supports files of > ~2.1GB, as long as you compile with the
> above directives (for some reason my notes say with Apache v2 you need
> to export them as environment variables before starting the compile - eg,
> export CPPFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64"

David is right, but you still stand to break the odd module, or deviate
from your package-managed system.

There are a few solutions to this. One is to redirect requests for
large files to a customised HTTP server built to handle them (be it
Apache or something else).

You could always do a recompile, and manage Apache and all your modules
outside of your current distribution.
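For the recompile route, a rough sketch of the build steps follows. The
prefix and source directory are examples only; adjust them (and any
./configure options) to taste:

```shell
# Set the large-file flags before configuring, as noted above.
# Apache 2.x picks them up from the environment at configure time.
export CPPFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64"

# Example build, installed outside the distribution's package paths:
cd httpd-2.x.x
./configure --prefix=/usr/local/apache2
make
make install
```

Remember that any third-party modules must be rebuilt with the same
flags, or you risk the module breakage mentioned above.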

Other options include splitting the files into sizes smaller than 2^31-1
bytes, possibly compressing the files, or using ftp/rsync/bittorrent[1].
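If you go the splitting route, the stock split(1) and cat(1) tools are
enough; the filename below is just an example:

```shell
# Split a large file into 1 GiB pieces, safely under the 2^31-1
# byte limit, with a predictable suffix for each piece:
split -b 1024m bigfile.iso bigfile.iso.part-

# On the client side, reassemble the pieces in order:
cat bigfile.iso.part-* > bigfile.iso
```

It's worth publishing an md5sum of the original alongside the pieces so
clients can verify the reassembled file.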

Now that you have solved that problem, you still haven't solved the
client issue. Apache tends to take the approach of "protect stupid
clients from themselves".

Unfortunately, wget, Firefox, and many others can fall into this category.

[1] Nothing dictates that you open your torrent to the world, and
nothing says bittorrent can't be used for internal file distribution.
In fact, it can be nicely dynamic.

-- 
David Jericho
Systems Administrator, AARNet
Phone:     +61 7 3317 9576
Mobile:    +61 4 2302 7185





More information about the General mailing list