I have a snippet that resembles the following: This snippet should run as a daemon service, but I’m having a lot of trouble making it work. The issue: each iteration increases the process’s memory usage, as if on each new iteration a new $myObject were instantiated while the previous one remained allocated in memory. I have tried:
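A common shape for the fix (a minimal sketch, since the original snippet is not shown; MyWorker and its work() method are hypothetical stand-ins for whatever the loop actually does) is to drop the reference explicitly each iteration and let the cycle collector run:

    <?php
    // Daemon-loop sketch; MyWorker stands in for whatever the elided
    // snippet instantiates as $myObject on each iteration.
    class MyWorker {
        public function work(): void { /* per-iteration job goes here */ }
    }

    gc_enable(); // cycle collector on (the default, but explicit here)

    while (true) {
        $myObject = new MyWorker();
        $myObject->work();

        unset($myObject);    // drop the reference so the instance can be freed
        gc_collect_cycles(); // reclaim memory held by reference cycles

        echo memory_get_usage(true), " bytes in use\n";
        sleep(1);
    }

If memory still climbs after this, the usual culprits are references accumulating inside a long-lived container (static arrays, loggers, ORM identity maps) rather than $myObject itself.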
Fatal error: Allowed memory size of 67108864 bytes exhausted in OpenCart
I’m using OpenCart. In my admin panel, when I access CATALOG > PRODUCTS (I have 73 products, four pages in total) and open the second page, it shows the following error, but I can access the first, third, and fourth pages fine. I have tried this solution (Allowed memory size of 67108864 bytes exhausted (tried to allocate 4459414 bytes) in writing
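For reference, 67108864 bytes is the 64 MB default; the quick workaround is raising PHP’s memory_limit, though the underlying cause is usually that one page loading too much product data at once. A generic sketch, not OpenCart-specific code:

    <?php
    // Raise the per-request memory ceiling from 64M (67108864 bytes,
    // the figure in the error) to 128M for this script only.
    ini_set('memory_limit', '128M');

    // Equivalent php.ini setting:      memory_limit = 128M
    // Equivalent .htaccess (mod_php):  php_value memory_limit 128M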
Fastest PHP memory cache/hashtable [closed]
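For what it’s worth, the stock answer here is APCu for an in-process shared-memory cache, or Memcached/Redis when the cache must span servers. A minimal APCu sketch, assuming the apcu extension is installed; the key and the fallback value are placeholders:

    <?php
    // APCu: a shared-memory key/value store that persists across
    // requests within one PHP process pool (requires ext-apcu).
    $key   = 'user:42';
    $value = apcu_fetch($key, $hit);

    if (!$hit) {
        $value = ['id' => 42, 'name' => 'example']; // expensive lookup goes here
        apcu_store($key, $value, 300);              // cache for 300 seconds
    }

    var_dump($value);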
In PHP, what happens in memory when we use mysql_query
I used to fetch large amounts of data using mysql_query and then iterate through the result row by row to process it. Ex: Recently I looked at a few frameworks and realized that they fetch all the data into an array in memory and return the array. I would like to know the pros/cons of each method. It appears to me
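The underlying distinction is buffered vs. unbuffered result sets: mysql_query (like mysqli’s default mode) copies the entire result set into client memory up front, which is essentially what the frameworks’ fetch-all-to-array approach does, while an unbuffered query streams rows one at a time at the cost of tying up the connection until you finish. A sketch using mysqli (the mysql_* functions were removed in PHP 7); credentials and table name are placeholders:

    <?php
    // MYSQLI_STORE_RESULT (the default) buffers every row client-side;
    // MYSQLI_USE_RESULT streams rows from the server one at a time.
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');

    $result = $db->query('SELECT id, payload FROM big_table', MYSQLI_USE_RESULT);

    while ($row = $result->fetch_assoc()) {
        // Only the current row is held in PHP memory here.
        echo $row['id'], "\n";
    }

    $result->free(); // release the open result before reusing $db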
How to base64-decode large files in PHP
My PHP web application has an API that can receive reasonably large files (up to 32 MB) which are base64-encoded. The goal is to write these files somewhere on my filesystem, decoded of course. What would be the least resource-intensive way of doing this? Edit: Receiving the files through an API means that I have a 32 MB string
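The low-memory route is PHP’s convert.base64-decode stream filter, which decodes in chunks as bytes are copied to disk instead of materializing the whole decoded blob as a second string. A sketch assuming the payload is readable as a stream; both paths are hypothetical:

    <?php
    // Decode base64 to disk in chunks via a stream filter, so neither
    // the encoded nor the decoded data sits fully in memory at once.
    $in  = fopen('/tmp/upload.b64', 'rb');
    $out = fopen('/tmp/upload.bin', 'wb');

    // Attach the decoder to the read side: bytes decode as they flow.
    stream_filter_append($in, 'convert.base64-decode', STREAM_FILTER_READ);

    stream_copy_to_stream($in, $out); // copies in small internal chunks

    fclose($in);
    fclose($out);

If the 32 MB string is already in memory, spooling it into a php://temp stream first and decoding from there at least avoids holding a second, decoded copy alongside it.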