I’m using Laravel 6, and therefore Eloquent Collection classes.
So I have a “lot” of data to process: roughly 5000 rows, and fetching them generates a Collection of 5000 models. Each of those models has maybe 20 attributes that need to be read.
Is there a fast way to do this? I currently have an array of the attributes I want to read, and then loops set up like this:
```php
fopen()...
foreach ($models as $model) {
    $row = [];
    foreach ($this->attributes as $attr) {
        $row[] = data_get($model, $attr);
    }
    fputcsv($fh, $row);
}
fclose()...
```
$models is a Laravel Collection created by EloquentModel::find($ids), where $ids is an array of integers (5000 ids from the database). $this refers to the class containing the foreach loops; there is nothing else in that class besides the function holding the code above and the attributes property, which is just an array of strings.
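To make that concrete, the whole class looks roughly like this (the class name, method name, and file path are placeholders, and EloquentModel stands in for the actual model class, as above):

```php
class CsvExporter
{
    // Dot-notation attribute paths to export; about 20 strings in total.
    protected $attributes = [
        'id',
        'name',
        // ...
    ];

    public function export(array $ids)
    {
        // ~5000 models hydrated into one Collection at once.
        $models = EloquentModel::find($ids);

        $fh = fopen('export.csv', 'w'); // placeholder path

        foreach ($models as $model) {
            $row = [];
            foreach ($this->attributes as $attr) {
                $row[] = data_get($model, $attr);
            }
            fputcsv($fh, $row);
        }

        fclose($fh);
    }
}
```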
With 5000 rows, each looping over 20 attributes, this takes a very long time to process, and in every case it actually throws a FatalErrorException: Allowed memory size of 134217728 bytes exhausted.
So what is the fastest way to retrieve the set of attributes for each row? Personally, I can’t think of anything faster than this nested loop.
Additionally, since fputcsv() writes each line to the file and the $row variable is overwritten on each iteration, why am I still getting "Allowed memory size exhausted"?
Would a LazyCollection be the solution here?
Thanks!
Answer
This processes the models in chunks of 200 at a time, so only 200 models are hydrated at once, which saves a lot of memory. Note that $fh has to be passed into the closure with use ($fh) so fputcsv() can see it.
```php
Model::whereIn('id', $ids)->chunk(200, function ($models) use ($fh) {
    foreach ($models as $model) {
        $row = [];
        foreach ($this->attributes as $attr) {
            $row[] = data_get($model, $attr);
        }
        fputcsv($fh, $row);
    }
});
```
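As for the LazyCollection question: in Laravel 6 you could also stream the rows with cursor(), which returns a LazyCollection and hydrates only one model at a time instead of 200 per chunk. A minimal sketch under the same assumptions as the question ($ids, $fh, and the attributes array come from the surrounding class; the file path is a placeholder):

```php
$fh = fopen('export.csv', 'w'); // placeholder path

Model::whereIn('id', $ids)
    ->cursor() // runs a single query and yields a LazyCollection of models
    ->each(function ($model) use ($fh) {
        $row = [];
        foreach ($this->attributes as $attr) {
            $row[] = data_get($model, $attr);
        }
        fputcsv($fh, $row);
    });

fclose($fh);
```

The difference: chunk() issues several queries of 200 rows each, while cursor() issues one query and keeps only the current model in memory. Either way, the full 5000-model Collection is never built, which is what was exhausting the memory limit.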