
How to copy very large files from URL to server via PHP?

I use the following code to copy/download files from an external server (any server, via a URL) to my hosted web server (DreamHost shared hosting, default settings).

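The original code block was not preserved on this page; what follows is a minimal sketch of the typical approach, assuming PHP's built-in copy() with a remote URL (the URL and paths are placeholders):

```php
<?php
// Hypothetical reconstruction of the kind of snippet in question:
// stream-copy a remote file straight to disk. copy() accepts a URL
// when allow_url_fopen is enabled, as it is on most shared hosts.
copy('http://example.com/large-file.zip', '/home/user/large-file.zip');
```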

However, the function stops running once roughly 2.5 GB of the file has downloaded (sometimes 2.3 GB, sometimes 2.7 GB, and so on). This happens every time I execute the function. Smaller files (under 2 GB) rarely exhibit the problem. I believe nothing is wrong with the source, as I downloaded the same file flawlessly onto my home PC.

Can someone please explain this problem and suggest a remedy? I am very new to programming.

Also, the alternative snippet below exhibits similar symptoms.
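Again, the original block was not preserved; this is a sketch of a common alternative, assuming the whole file is buffered in memory with file_get_contents() (URL and paths are placeholders):

```php
<?php
// Hypothetical variant: read the entire remote file into memory,
// then write it out. For multi-GB files this also risks hitting
// memory_limit, since file_get_contents() holds the whole download in RAM.
$data = file_get_contents('http://example.com/large-file.zip');
file_put_contents('/home/user/large-file.zip', $data);
```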


Answer

I think the problem might be the 30-second execution time-out (PHP's default max_execution_time) enforced on many servers running PHP scripts.

PHP scripts run via cron or from the shell won't hit that limit (the CLI defaults to no time limit), so perhaps you could run the download that way instead.
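For example, a hypothetical CLI invocation and crontab entry (the script path and schedule are placeholders):

```
# Run the download script once from the shell with the PHP CLI,
# where max_execution_time defaults to 0 (no limit):
php /home/user/download.php

# Or schedule it via cron (crontab -e); e.g. daily at 03:00:
0 3 * * * /usr/bin/php /home/user/download.php
```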

Alternatively, you could call set_time_limit([desired time]) at the start of your code.
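For example, a minimal sketch (0 removes the limit entirely; the URL and paths are placeholders):

```php
<?php
// Remove the script execution time limit; 0 means "no limit".
// Note: some shared hosts disable set_time_limit(), so this may not
// work everywhere.
set_time_limit(0);

// ...then perform the long-running copy as before.
copy('http://example.com/large-file.zip', '/home/user/large-file.zip');
```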

User contributions licensed under: CC BY-SA