
How to copy very large files from URL to server via PHP?

I use the following code to copy/download files from an external server (any server, via a URL) to my hosted web server (Dreamhost shared hosting with default settings).

<!DOCTYPE html>
<html>
<head>
    <title></title>
</head>
<body>
<form method="post" action="copy.php">
    <input type="submit" value="click" name="submit">
</form>
</body>
</html>
<!-- copy.php file contents -->
<?php
function chunked_copy() {
    # 1 MB at a time, adjustable.
    $buffer_size = 1048576;
    $ret = 0;
    # Open the remote file for reading and the local file for writing, both in binary mode.
    $fin = fopen("http://www.example.com/file.zip", "rb");
    $fout = fopen("file.zip", "wb");
    while (!feof($fin)) {
        $ret += fwrite($fout, fread($fin, $buffer_size));
    }
    fclose($fin);
    fclose($fout);
    return $ret; # return number of bytes written
}

if (isset($_POST['submit'])) {
    chunked_copy();
}
?>

However, the function stops running once about 2.5GB of the file has downloaded (sometimes 2.3GB, sometimes 2.7GB, etc.). This happens every time I execute the function. Smaller files (<2GB) rarely exhibit this problem. I believe nothing is wrong with the source, as I separately downloaded the same file flawlessly onto my home PC.

Can someone please explain this problem and suggest a remedy? I am very new to programming.

Also,

file_put_contents("Tmpfile.zip", fopen("http://example.com/file.zip", 'r')); 

exhibits the same symptoms.


Answer

I think the problem might be the 30-second execution time limit that many servers enforce on PHP scripts.

PHP scripts run via cron or from the shell won't have that limit, so perhaps you could do the copy that way.
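For example, the download could live in a standalone script invoked from the command line or scheduled with cron. This is only a minimal sketch: the file names, URL, and cron schedule shown in the comments are placeholders, and it relies on the fact that the PHP CLI SAPI defaults to no execution time limit.

<?php
# download.php - run from the shell with:  php download.php
# (or from cron, e.g. "0 3 * * * php /home/user/download.php" - path and schedule are examples)

# The CLI SAPI defaults to max_execution_time = 0 (unlimited),
# so the script can run for as long as the transfer takes.
$src = "http://www.example.com/file.zip"; # example URL
$dst = "file.zip";

$fin  = fopen($src, "rb");
$fout = fopen($dst, "wb");

$bytes = 0;
while (!feof($fin)) {
    $bytes += fwrite($fout, fread($fin, 1048576)); # copy in 1 MB chunks
}

fclose($fin);
fclose($fout);

echo "Wrote $bytes bytes\n";
?>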

Alternatively, you could add set_time_limit([desired time in seconds]) to the start of your code (passing 0 removes the limit entirely).
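For instance, assuming the copy happens in the copy.php posted above, a sketch might look like this; note that some shared hosts may not allow the limit to be raised at all:

<?php
# copy.php - add this before any long-running work begins
set_time_limit(0);   # 0 means "no time limit"; a generous number of seconds also works

# ... chunked_copy() definition and the $_POST['submit'] check from the question follow here ...
?>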
