Hey guys,
I am developing a system using Laravel Excel (Maatwebsite). What I’m trying to achieve is: when a user uploads an Excel file into the system, the system checks a few things and then inserts the data into the database.
Here are the relevant database schemas:
```
Hadees:  h_id | h_english | r_id | h_last_raavi_id | b_id | s_id | k_id | h_status
Raavi:   r_id | r_english | r_status
Baab:    b_id | b_english | s_id | b_status
Section: s_id | s_english | k_id | s_status
Kitaab:  k_id | k_english | k_status
```
My controller:
```php
use Illuminate\Http\Request;
use App\Http\Controllers\Controller;
use App\Models\Section;
use App\Models\Raavi;
use App\Imports\HadeesImport;
use Excel;

class ImportHadeesController extends Controller
{
    /**
     * Show the application import-hadees page.
     *
     * @return \Illuminate\Http\Response
     */
    public function index()
    {
        $section = Section::where(['s_status' => 1])->get();
        return view('admin/import-hadees', compact('section'));
    }

    /**
     * This method uses the Excel facade to prep the excel file
     * and extract data from it, and uses the App\Imports\HadeesImport
     * class to insert each row into the database schema accordingly.
     *
     * @param \Illuminate\Http\Request $request
     * @return \Illuminate\Http\Response
     */
    public function importHadees(Request $request)
    {
        $raavi = Raavi::where(['r_status' => 1])->get();

        if (!$request->file('hadees_sheet')) {
            return response()->json(['status' => 'error', 'msg' => 'No file present!']);
        }

        $temp = $request->file('hadees_sheet')->store('temp');
        $path = storage_path('app') . '/' . $temp;

        $hadees = Excel::import(new HadeesImport($request->s_id, compact('raavi')), $path);

        if ($hadees) {
            return response()->json(['status' => 'success', 'msg' => 'Successfully imported all the data from the file to the database!']);
        } else {
            return response()->json(['status' => 'error', 'msg' => 'Unable to import data from the file to the database!']);
        }
    }
}
```
HadeesImport Class:
```php
use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use App\Models\Section;
use App\Models\Baab;
use App\Models\Hadees;
use App\Models\Raavi;

class HadeesImport implements ToCollection, WithHeadingRow
{
    /**
     * Global variable for section_id.
     */
    public $s_id;

    /**
     * Global variable for raavi's data.
     */
    public $raavi;

    /**
     * Constructor.
     *
     * @param int $id
     * @param array $arr
     */
    function __construct($id, $arr)
    {
        $this->s_id = $id;
        $this->raavi = $arr;
    }

    /**
     * This method is responsible for inserting the excel
     * sheet's row data into the database schema.
     *
     * @param Collection $rows
     */
    public function collection(Collection $rows)
    {
        $baab = Baab::where(['s_id' => $this->s_id])->get();
        $hissa = Section::where(['s_id' => $this->s_id])->first();
        $kitaab = $hissa->k_id;
        $first_raavi = 0;
        $last_raavi = 0;
        $baab_id = 0;
        $data = array();

        foreach ($rows as $row) {
            if ($row['hadees_arabic'] != "" && $row['hadees_urdu'] != "") {
                $baab_id = $this->baabCheck($baab, $row);

                foreach ($this->raavi['raavi'] as $rav) {
                    if (trim($rav->r_english) == trim($row['first_raavi_english'])) {
                        $first_raavi = $rav->r_id;
                    } else {
                        $first_raavi = 0;
                    }
                    $last_raavi = (trim($rav->r_english) == trim($row['last_raavi_english'])) ? $rav->r_id : 0;
                }

                if ($first_raavi == 0) {
                    $raavi = Raavi::create([
                        'r_arabic'   => trim($row['first_raavi_urdu']),
                        'r_urdu'     => trim($row['first_raavi_urdu']),
                        'r_english'  => trim($row['first_raavi_english']),
                        'r_nickname' => trim($row['raavi_other_names']),
                        'r_status'   => 1,
                    ]);
                    $first_raavi = $raavi->r_id;
                }

                if ($last_raavi == 0) {
                    $raavi = Raavi::create([
                        'r_arabic'   => trim($row['last_raavi_urdu']),
                        'r_urdu'     => trim($row['last_raavi_urdu']),
                        'r_english'  => trim($row['last_raavi_english']),
                        'r_nickname' => 'n/a',
                        'r_status'   => 1,
                    ]);
                    $last_raavi = $raavi->r_id;
                }

                $data = array([
                    'h_arabic'         => trim($row['hadees_arabic']),
                    'h_urdu'           => trim($row['hadees_urdu']),
                    'h_english'        => trim($row['hadees_english']),
                    'h_roman_keywords' => trim($row['roman_keywords']),
                    'h_number'         => trim($row['hadees_number']),
                    'r_id'             => $first_raavi,
                    'h_last_raavi_id'  => $last_raavi,
                    'b_id'             => $baab_id,
                    's_id'             => $this->s_id,
                    'k_id'             => $kitaab,
                    'h_status'         => 1
                ]);
            }
        }
        $hadees = Hadees::create($data);
    }

    /**
     * Checks if the baab exists in the database or not.
     *
     * @param Collection $baab
     * @param object $row
     * @return int - baab_id or 0
     */
    public function baabCheck($baab, $row)
    {
        foreach ($baab as $b) {
            if (trim($b->b_arabic) == trim($row['baab_arabic']) || trim($b->b_urdu) == trim($row['baab_urdu']) || trim($b->b_english) == trim($row['baab_english'])) {
                return $b->b_id;
            } else {
                return 0;
            }
        }
    }
}
```
It all worked fine when there was less data. Now I have 1,400+ rows in the Raavi table and 10,000+ rows in the Baab table, and whenever I try to import a sheet the system gives me this error:
Allowed memory size of 268435456 bytes exhausted (tried to allocate 37748736 bytes).
I think it’s because of the long foreach() loops. Any kind of help would be highly appreciated. If you have any suggestions about bad coding or flawed logic, please do let me know. I have been stuck on this issue for almost two days. Thanks in advance.
P.S.: The error is the same on localhost and on the hosting, just with a difference in bytes, which I believe is due to different memory_limit settings.
Answer
All the posted solutions suggest raising PHP’s memory limit.
It does not work like that. You can’t just throw more RAM at the problem. What if your server has 2 GB of RAM and someone uploads a file that, with all the arrays created here, needs more than 2 GB? What then? Not to mention the risk of the server running out of memory and killing other processes. For example, if the server runs PHP and MySQL side by side and PHP exhausts the memory, the OOM killer will kick in and may well kill the MySQL process.
The real solution to your problem is to process the file in chunks. Laravel 6, for example, introduced a new collection type, Lazy Collections, which can help here. Your code will have to change in order to use chunked processing, but imho that is the only way to fix this problem.
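As a rough illustration of the lazy idea (the file path and per-chunk handling here are hypothetical, not your actual import logic): a `LazyCollection` can stream a file line by line, so only one chunk of rows is ever held in memory at a time instead of the whole sheet.

```php
use Illuminate\Support\LazyCollection;

// Stream a CSV export line by line instead of loading it all at once.
LazyCollection::make(function () {
    $handle = fopen(storage_path('app/temp/hadees.csv'), 'r');
    while (($line = fgetcsv($handle)) !== false) {
        yield $line; // one row at a time; nothing is accumulated
    }
    fclose($handle);
})->chunk(500)->each(function ($rows) {
    // Process/insert 500 rows at a time; memory use stays flat
    // no matter how large the file is.
});
```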
I would also run this in a queue, not from a user request.
I also know that the package you are using supports both chunked reading and batched inserting.