I’m working on a ‘point of sale’ application (PHP + MySQL), with products, sales, and product quantities that have to be updated (+ or -) depending on the operation (sale/buy), etc.
On every sale/buy, I save the operation data (date, customer, totals…) in an ‘invoices’ table, and the details (all products in the invoice) in an ‘invoicecontents’ table.
Ok, you got it.
Now (and it works fine), I do the following to save the invoice and update product quantities:
- Save the invoice data
- Iterate the products in the invoice (from a JSON) and create a single INSERT query to save the invoice contents in the second table.
- AFTER THAT, once the invoice and details are saved, I update the product quantities WITH A SINGLE QUERY.
Something like this (for a sale operation):
UPDATE products a
SET QUANTITY = QUANTITY - (SELECT SUM(QUANTITY) FROM details b
                           WHERE b.IDPRODUCT = a.ID AND IDCUSTOMER = 4
                             AND IDMOV = 615 AND IDPRODUCT <> -1)
WHERE ID IN (SELECT IDPRODUCT FROM details b
             WHERE b.IDPRODUCT = a.ID AND IDCUSTOMER = 4
               AND IDMOV = 615 AND IDPRODUCT <> -1)
  AND IDCUSTOMER = 4
  AND a.ACTSTOCK = 1;
This works fine. Of course, it all runs inside a begin_transaction … commit.
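In PHP terms the flow is roughly this (simplified; assume PDO, with $pdo as the connection, $items as the array decoded from the JSON, and the invoices column list abbreviated; $idCustomer, $date, $total and $singleUpdateSql are placeholders):

$pdo->beginTransaction();
try {
    // 1. Save the invoice header
    $pdo->prepare('INSERT INTO invoices (IDCUSTOMER, INVOICEDATE, TOTAL) VALUES (?, ?, ?)')
        ->execute([$idCustomer, $date, $total]);
    $idMov = $pdo->lastInsertId();

    // 2. One multi-row INSERT for all the invoice contents
    $placeholders = [];
    $params = [];
    foreach ($items as $item) {
        $placeholders[] = '(?, ?, ?, ?)';
        array_push($params, $idCustomer, $idMov, $item['id'], $item['qty']);
    }
    $pdo->prepare('INSERT INTO details (IDCUSTOMER, IDMOV, IDPRODUCT, QUANTITY) VALUES '
            . implode(', ', $placeholders))
        ->execute($params);

    // 3. The single UPDATE shown above adjusts every product quantity at once
    $pdo->exec($singleUpdateSql);

    $pdo->commit();
} catch (Throwable $e) {
    $pdo->rollBack();
    throw $e;
}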
HOWEVER, I would like to know if this second method is better, faster or even more secure:
- Save the invoice data
- Iterate the products in the invoice (from a JSON), and inside the iteration, for each product (each JSON item):
  - save that line into the invoice contents table;
  - update that product’s quantity in the products table.
Something like this, for each product/item:
INSERT INTO detailtable (IDCUSTOMER, IDPRODUCT, QUANTITY, ...) VALUES (4, 615, 5);
UPDATE products SET QUANTITY = QUANTITY - 5 WHERE IDPRODUCT = 615 AND IDCUSTOMER = 4 AND ACTSTOCK = 1;
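In PHP, that per-item loop might look like this (a sketch using the same hypothetical PDO variables as above; the two statements are prepared once and reused on every iteration):

$ins = $pdo->prepare('INSERT INTO detailtable (IDCUSTOMER, IDPRODUCT, QUANTITY) VALUES (?, ?, ?)');
$upd = $pdo->prepare('UPDATE products SET QUANTITY = QUANTITY - ?
                      WHERE IDPRODUCT = ? AND IDCUSTOMER = ? AND ACTSTOCK = 1');

$pdo->beginTransaction();
foreach ($items as $item) {                 // one INSERT + one UPDATE per JSON item
    $ins->execute([$idCustomer, $item['id'], $item['qty']]);
    $upd->execute([$item['qty'], $item['id'], $idCustomer]);
}
$pdo->commit();

Preparing both statements outside the loop keeps the per-item cost down to parameter binding and execution.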
Maybe this second approach is simpler and more understandable, but I really don’t know whether it will cause more (or less) CPU and memory consumption, taking into account that it is a multi-tenant database/application.
The problem I have with the first method is that, in the future, I will need to update more fields and tables with each product sold, so the big UPDATE query will grow even bigger and may eventually become unworkable.
Thanks!
Answer
Of course, single queries are generally more efficient than multiple queries, all other things being equal. But either approach should work. Clarity and readability of code are vital to long-lived SQL-using applications.
If you do choose multiple queries, always process your products in the same order: always sort the list of items in your JSON objects in ID order before working through them one by one. This will help avoid deadlocks in your transactions.
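In PHP that just means one sort before the loop, for example (the id field name is an assumption about your JSON, and $json is a placeholder):

$items = json_decode($json, true);
usort($items, fn ($a, $b) => $a['id'] <=> $b['id']);   // touch rows in a consistent order
foreach ($items as $item) {
    // ... INSERT the detail row, then UPDATE that product ...
}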
In either case, but especially the multi-query case, take care to create efficient indexes for the WHERE clauses in all the queries within your transactions. The less time a transaction takes, the better.
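For example, a composite index matching the detail-table lookups above might help (the index name is hypothetical; check your actual query plans with EXPLAIN first):

// products.ID is presumably the primary key already, so the products side is covered;
// the details lookups filter on IDCUSTOMER, IDMOV and IDPRODUCT
$pdo->exec('CREATE INDEX idx_details_cust_mov_prod ON details (IDCUSTOMER, IDMOV, IDPRODUCT)');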
Edit: One more thing to do for best performance and fewest deadlocks.
Right after you begin your transaction, lock the rows in products you want to update in the transaction. Run this SELECT query; it is very similar to the UPDATE query you showed us. You don’t need its result set, just its FOR UPDATE clause.
SELECT COUNT(ID) FROM (
    SELECT a.ID FROM products a
    WHERE a.ID IN (SELECT IDPRODUCT FROM details b
                   WHERE b.IDPRODUCT = a.ID AND IDCUSTOMER = 4
                     AND IDMOV = 615 AND IDPRODUCT <> -1)
      AND a.IDCUSTOMER = 4
      AND a.ACTSTOCK = 1
    ORDER BY a.ID
    FOR UPDATE
) subq;
The FOR UPDATE locks are released when you COMMIT (or ROLLBACK) the transaction.
The ORDER BY ID clause avoids deadlocks by taking the locks in a consistent order.
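Put together, the transaction then looks like this (a sketch; $lockSql stands for the SELECT … FOR UPDATE above):

$pdo->beginTransaction();
try {
    $pdo->query($lockSql);    // 1. lock the product rows first, in ID order
    // 2. ... INSERT the invoice, INSERT the contents, UPDATE the quantities ...
    $pdo->commit();           // 3. locks are released here
} catch (Throwable $e) {
    $pdo->rollBack();         // ... or here, if anything failed
    throw $e;
}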