While this may seem like the right thing to do, I suspect my code is the problem.
I’m running my Laravel 6.x app in a Docker container. When running the code below, I get
Allowed memory size of ** bytes exhausted (tried to allocate 8192 bytes)
No matter how high I set the memory_limit, I get the same error (with the new limit). So I want to review my code:
```php
// I'm running a seeder.
$arr = [1, 2, 3, 4, 5, .....];

// Get all users and update a column:
$users = User::all();

// Loop and update (we have thousands)
foreach ($users as $user) {
    $index = array_rand($arr);
    $user->someColumn = $arr[$index];
    $user->save();
}

// Another loop for another model, same as above.....
```
This is causing the “allowed memory” issue. Is there any better way to achieve this?
Answer
Instead of fetching all users at once, fetch them in chunks:
```php
App\User::chunk(10, function ($users) use ($arr) {
    foreach ($users as $user) {
        $index = array_rand($arr);
        $user->someColumn = $arr[$index];
        $user->save();
    }
});
```
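One caveat worth adding, beyond the original answer: `chunk()` pages through results by offset, and the Laravel documentation warns that updating a column that affects the query's ordering or filtering while chunking can cause rows to be skipped. `chunkById()` pages on the primary key instead and avoids that. A sketch under those assumptions (the chunk size of 500 is an arbitrary choice to tune against your memory limit, and `App\User` is the Laravel 6 default model location):

```php
// Page by primary key so in-flight updates cannot shift the result pages.
// 500 rows per chunk is an assumption; tune it to your memory_limit.
App\User::chunkById(500, function ($users) use ($arr) {
    foreach ($users as $user) {
        // array_rand() returns a random key of $arr.
        $user->someColumn = $arr[array_rand($arr)];
        $user->save();
    }
});
```

Either variant keeps only one chunk of models in memory at a time, which is what resolves the "allowed memory size exhausted" error.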