
In PHP, what happens in memory when we use mysql_query

I used to fetch large amounts of data with mysql_query and then iterate through the result one row at a time to process the data. Ex:

$mysql_result = mysql_query("select * from user");
while($row = mysql_fetch_array($mysql_result)){
    echo $row['email'] . "\n";
}

Recently I looked at a few frameworks and realized that they fetch all the data into an array in memory and return the array.

$large_array = $db->fetchAll("select * from user");
foreach($large_array as $user){
    echo $user['email'] . "\n";
}

I would like to know the pros/cons of each method. It appears to me that loading everything into memory is a recipe for disaster if you have a very long list of items. Then again, a coworker told me that the MySQL driver has to put the result set in memory anyway. I’d like to get the opinion of someone who understands that this question is about performance. Please don’t comment on the code itself; I just made it up as an example for the post.

Thanks


Answer

It is true that the MySQL client library “normally” fetches all data into client memory. It usually does so by calling mysql_store_result(). You can split up overly large queries with the LIMIT keyword, but there is a risk that the data becomes inconsistent because it might change between queries. You can guard against this with locks.
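The chunked approach could be sketched as follows (a sketch only, reusing the hypothetical `user` table from the question; note the long-deprecated `mysql_*` API was removed in PHP 7, so today you would do the same with `mysqli` or PDO):

```php
<?php
// Fetch the user table in fixed-size chunks so only one chunk is held
// in client memory at a time (assumes an open connection and the
// hypothetical `user` table from the question).
$chunkSize = 1000;
$offset    = 0;

do {
    $result = mysql_query(
        "SELECT * FROM user ORDER BY id LIMIT $offset, $chunkSize"
    );
    $rowCount = mysql_num_rows($result);

    while ($row = mysql_fetch_array($result)) {
        echo $row['email'] . "\n";
    }

    mysql_free_result($result); // release this chunk before fetching the next
    $offset += $chunkSize;
} while ($rowCount === $chunkSize); // the final chunk is smaller (or empty)
```

The ORDER BY on a stable key is needed to make LIMIT/OFFSET paging deterministic, and rows changing between chunks is exactly the inconsistency mentioned above that locking would prevent.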

Another approach could be to use mysql_use_result(), which uses more resources on the server side and requires you to complete the fetching job ASAP.
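At the PHP level, mysql_use_result() corresponds to mysql_unbuffered_query(): rows are streamed from the server one at a time instead of being buffered on the client first. A minimal sketch (again assuming an open connection and the question’s hypothetical `user` table):

```php
<?php
// Unbuffered query: the result set stays on the server and rows are
// streamed to the client one at a time, so client memory use stays
// flat no matter how large the result is.
$result = mysql_unbuffered_query("SELECT * FROM user");

while ($row = mysql_fetch_array($result)) {
    echo $row['email'] . "\n";
}

// Caveat: until every row has been fetched (or the result is freed),
// the connection cannot issue further queries, and server-side
// resources for the result set remain tied up.
mysql_free_result($result);
```

This is the trade-off the answer describes: client memory is saved at the cost of holding server resources until the fetch loop finishes.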

User contributions licensed under: CC BY-SA