Do you know any solution for unpacking a large .gz file in PHP (>200 MB compressed, >4 GB uncompressed, maybe containing members of 1-2 GB each)? A way to decode the .gz part by part is needed. The code gzdecode(@file_get_contents($file)) fails with: PHP Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate …), of course. I cannot increase the PHP memory limit.
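One way to keep memory bounded is zlib's streaming API, gzopen()/gzread(), which returns a fixed-size chunk of decompressed data per call instead of inflating the whole file at once. A minimal sketch, with placeholder file names:

<?php
// Stream-decompress a large .gz chunk by chunk; peak memory stays
// around the chunk size no matter how big the uncompressed data is.
// Both paths are placeholders for illustration.
$src  = 'huge-archive.gz';
$dest = 'huge-archive.unpacked';

$in  = gzopen($src, 'rb');
$out = fopen($dest, 'wb');
if ($in === false || $out === false) {
    die("cannot open input or output\n");
}

while (!gzeof($in)) {
    // gzread() returns up to 1 MiB of decompressed data per call.
    $chunk = gzread($in, 1048576);
    if ($chunk === false) {
        break;
    }
    fwrite($out, $chunk);
}

gzclose($in);
fclose($out);

The same effect can be had with PHP's compress.zlib:// stream wrapper plus stream_copy_to_stream(), which also copies in fixed-size buffers rather than loading everything into memory.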
Tag: out-of-memory
Large amount of data: what is the best way to iterate over it without exhausting memory?
I am using Laravel 6, and therefore its Eloquent Collection classes. I have a "lot" of data to process: roughly 5,000 rows, and fetching them generates a Collection of 5,000 models. Each of those models has maybe 20 attributes that need to be read. Is there a fast way to do this? I currently have an array of the
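Assuming the goal is to avoid hydrating all 5,000 models at once, Eloquent's chunk() and cursor() are the usual tools. A sketch, where the Order model and the process() helper are hypothetical stand-ins:

<?php
use App\Order; // hypothetical model; the question doesn't name one

// chunk() hydrates 500 models per query, so only that many Eloquent
// objects live in memory at any time.
Order::query()->chunk(500, function ($orders) {
    foreach ($orders as $order) {
        process($order->getAttributes()); // read the ~20 attributes
    }
});

// cursor() goes further: a generator hydrates one model per row, so a
// single model is in memory at the cost of one long-lived query.
foreach (Order::cursor() as $order) {
    process($order->getAttributes());
}

If only the raw attribute values are needed, skipping model hydration entirely with the query builder (DB::table(...)) is typically faster still, since no Eloquent objects are built at all.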
Have PHP dump heap on OutOfMemory exception
I am currently debugging a script that constantly runs into OutOfMemory exceptions. It runs as a cronjob and usually runs fine, but when the cronjob hasn't run for a while (for whatever reason), the script has to handle too many elements that have queued up and runs into an OutOfMemory exception. From examining the code I was not able
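PHP has no built-in heap dump on memory exhaustion; the engine raises a fatal error rather than a catchable exception. A common workaround is a shutdown handler that detects the fatal error and logs peak usage and location. A sketch of that pattern:

<?php
// Pre-allocate a small reserve so the handler has memory to work with
// once the limit has already been exhausted.
$reserve = str_repeat('x', 262144);

register_shutdown_function(function () use (&$reserve) {
    $reserve = null; // release the reserve before doing anything else
    $err = error_get_last();
    if ($err !== null
        && strpos($err['message'], 'Allowed memory size') === 0) {
        error_log(sprintf(
            'OOM at %s:%d, peak usage %d bytes',
            $err['file'],
            $err['line'],
            memory_get_peak_usage(true)
        ));
    }
});

For an actual heap dump, the php-meminfo extension can serialize the live items in memory to JSON; calling its dump function periodically inside the queue loop would show what accumulates between cron runs.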