I am downloading a CSV file from another server as a data feed from a vendor. I am using curl to get the contents of the file and saving that into a variable called $contents.

I can get to that part just fine, but I tried exploding by \r and \n to get an array of lines, and it fails with an ‘out of memory’ error. If I echo strlen($contents), it’s about 30.5 million characters.

I need to manipulate the values and insert them into a database. What do I need to do to avoid memory allocation errors?
Answer
PHP is choking because it’s running out of memory. Instead of having curl populate a PHP variable with the contents of the file, use the CURLOPT_FILE option to write the response straight to disk.
// Pseudo, untested code to give you the idea
$fp = fopen('path/to/save/file', 'w');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Then, once the file is saved, instead of using the file or file_get_contents functions (which would load the entire file into memory, killing PHP again), use fopen and fgets to read the file one line at a time.
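Since the feed is CSV, fgetcsv is a natural fit on top of fopen: it reads and parses one row per call, so only a single line is ever held in memory regardless of file size. A minimal sketch of that loop, where the file path, column names, and sample data are made up for illustration (in real use you would point it at the file curl saved, and run a prepared INSERT inside the loop):

```php
<?php
// Hypothetical sample: write a tiny CSV so this sketch runs standalone.
// In practice $path would be the file curl saved with CURLOPT_FILE.
$path = tempnam(sys_get_temp_dir(), 'feed');
file_put_contents($path, "sku,price\nA100,9.99\nB200,19.50\n");

$fh = fopen($path, 'r');
$header = fgetcsv($fh);            // first row: column names
$rows = 0;

// fgetcsv returns one parsed row per call, false at end of file,
// so memory use stays flat no matter how large the feed is.
while (($fields = fgetcsv($fh)) !== false) {
    $row = array_combine($header, $fields);
    // Here you would execute your database INSERT for $row,
    // ideally via a prepared statement (e.g. PDO).
    $rows++;
}
fclose($fh);
unlink($path);

echo $rows; // prints 2
```

Processing row by row like this also makes it easy to batch the database inserts (say, committing a transaction every few thousand rows) without ever approaching PHP’s memory limit.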