
Serving large files with PHP

So I am trying to serve large files via a PHP script. The files are not in a web-accessible directory, so this is the best way I can figure to provide access to them.

The only way I could think of off the bat to serve a file is by loading it into memory (fopen, fread, etc.), setting the header to the proper MIME type, and then just echoing the entire contents of the file.

The problem with this is that I have to load each ~700 MB file into memory all at once and keep the entire thing there until the download is finished. It would be nice if I could stream the parts I need as they are being downloaded.

Any ideas?


Answer

You don’t need to read the whole thing into memory – just loop, reading the file in, say, 32 KB chunks and sending each chunk as output. A minimal sketch of that loop, reusing the same mybigfile.zip example as below (the chunk size is arbitrary; anything in that ballpark works):
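$name = 'mybigfile.zip';
$fp = fopen($name, 'rb');

// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));

// read and echo 32 KB at a time so memory use stays flat
while (!feof($fp)) {
    echo fread($fp, 32 * 1024);
    flush(); // push each chunk out to the client as we go
}
fclose($fp);
exit;

Better yet, use fpassthru(), which does much the same thing for you: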

$name = 'mybigfile.zip';
$fp = fopen($name, 'rb');

// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));

// dump the file and stop the script
fpassthru($fp);
exit;

You can do it in even fewer lines with readfile(), which doesn’t need the fopen() call at all:

$name = 'mybigfile.zip';

// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));

// dump the file and stop the script
readfile($name);
exit;

If you want to get even cuter, you can support HTTP range requests (the Range request header), which let clients ask for a particular byte range of your file. This is particularly useful for serving PDF files to Adobe Acrobat, which requests just the chunks of the file it needs to render the current page. It’s a bit involved, but see this for an example.
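To give a flavour of what’s involved, here is a minimal sketch of single-range support, again assuming mybigfile.zip. It handles only simple bytes=start-end ranges; a production version would also need suffix ranges (bytes=-500), multiple ranges, If-Range handling, and so on:

$name = 'mybigfile.zip';
$size = filesize($name);
$start = 0;
$end = $size - 1;

if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int) $m[1];
    if ($m[2] !== '') {
        $end = (int) $m[2];
    }
    if ($start > $end || $end >= $size) {
        // requested range doesn't fit inside the file
        header('HTTP/1.1 416 Requested Range Not Satisfiable');
        header("Content-Range: bytes */$size");
        exit;
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

// advertise range support and send the (possibly partial) body
header('Accept-Ranges: bytes');
header('Content-Type: application/zip');
header('Content-Length: ' . ($end - $start + 1));

$fp = fopen($name, 'rb');
fseek($fp, $start);
$left = $end - $start + 1;
while ($left > 0 && !feof($fp)) {
    $chunk = fread($fp, min(32 * 1024, $left));
    echo $chunk;
    $left -= strlen($chunk);
}
fclose($fp);
exit;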
