How do I use get_compiled_select() or count_all_results() before running the query without getting the table name added twice? When I use $this->db->get('tblName') after either of those, I get the error:
Not unique table/alias: 'tblProgram' SELECT * FROM (`tblProgram`, `tblProgram`) JOIN `tblPlots` ON `tblPlots`.`programID`=`tblProgram`.`pkProgramID` JOIN `tblTrees` ON `tblTrees`.`treePlotID`=`tblPlots`.`id` ORDER BY `tblTrees`.`id` ASC LIMIT 2000
If I don't pass a table name to count_all_results() or $this->db->get(), then I get an error that no table is used. How can I get it to set the table name just once?
public function get_download_tree_data($options = array(), $rand = "")
{
    // join tables and order by tree id
    $this->db->reset_query();
    $this->db->join('tblPlots', 'tblPlots.programID=tblProgram.pkProgramID');
    $this->db->join('tblTrees', 'tblTrees.treePlotID=tblPlots.id');
    $this->db->order_by('tblTrees.id', 'ASC');

    // get number of results to return
    $allResults = $this->db->count_all_results('tblProgram', false);

    // chunk data and write to CSV to avoid reaching memory limit
    $offset = 0;
    $chunk = 2000;
    $treePath = $this->config->item('temp_path') . "$rand/trees.csv";
    $tree_handle = fopen($treePath, 'a');
    while ($offset < $allResults) {
        $this->db->limit($chunk, $offset);
        $result = $this->db->get('tblProgram')->result_array();
        foreach ($result as $row) {
            fputcsv($tree_handle, $row);
        }
        $offset = $offset + $chunk;
    }
    fclose($tree_handle);
    return array('resultCount' => $allResults);
}
Answer
To count how many rows a query would return, essentially all of the query's work must be performed. That is, it is impractical to get the count first and then perform the query; you may as well just run the query.
If your goal is to "paginate" by getting some of the rows plus the total count, that is essentially two separate actions (though they may be combined to look like one).
If the goal is to estimate the number of rows, then SHOW TABLE STATUS, or SELECT TABLE_ROWS FROM information_schema.TABLES WHERE ..., gives you an estimate.
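For example, assuming MySQL and the question's tblProgram table, either form reads the estimate from the data dictionary (for InnoDB this is only an approximation and can be off considerably):

```sql
-- Approximate row count; for InnoDB this is an estimate, not an exact count.
SHOW TABLE STATUS LIKE 'tblProgram';          -- look at the Rows column

SELECT TABLE_ROWS
FROM   information_schema.TABLES
WHERE  TABLE_SCHEMA = DATABASE()
  AND  TABLE_NAME   = 'tblProgram';
```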
If you want to see whether there are, say, "at least 100 rows", then this may be practical:

    SELECT 1 FROM ... WHERE ... ORDER BY ... LIMIT 99, 1

and see if you get a row back. However, this may or may not be efficient, depending on the indexes, the WHERE, and the ORDER BY. (Show us the query and I can elaborate.)
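In CodeIgniter 3's query builder, a hypothetical version of that probe might look like the following (the table and ORDER BY column are illustrative, borrowed from the question's schema; note that CodeIgniter's limit() takes the offset as its second argument):

```php
// Hypothetical probe: $hasAtLeast100 is TRUE only if >= 100 rows match.
// Assumes CodeIgniter 3 query builder; table/column names are illustrative.
$row = $this->db->select('1', false)     // FALSE: don't escape the literal
                ->from('tblTrees')
                ->order_by('id', 'ASC')
                ->limit(1, 99)           // generates LIMIT 99, 1
                ->get()
                ->row();
$hasAtLeast100 = ($row !== null);
```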
Using OFFSET for chunking is grossly inefficient. If there is no usable index, essentially the entire query is performed for each chunk. If there is a usable index, each chunk is still slower than the last, because the skipped rows must be scanned. Here is a discussion of why OFFSET is not good for "pagination", plus an efficient workaround: Pagination. It describes "remember where you left off" as an efficient technique for chunking. Fetch between 100 and 1000 rows per chunk.
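As a sketch of that "remember where you left off" approach against the question's schema (assuming CodeIgniter 3's query builder; names come from the question). Calling from() once and then get() with no argument also avoids the doubled table name the question asks about:

```php
// Keyset ("seek") chunking: no OFFSET, so each chunk reads only its own rows.
// Assumes tblTrees.id is an ascending unique key; $tree_handle is the open
// CSV handle from the question's code.
$lastId = 0;
$chunk  = 1000;   // 100-1000 rows per chunk, per the advice above
do {
    $this->db->from('tblProgram')                 // table named exactly once
             ->join('tblPlots', 'tblPlots.programID=tblProgram.pkProgramID')
             ->join('tblTrees', 'tblTrees.treePlotID=tblPlots.id')
             ->where('tblTrees.id >', $lastId)    // resume after last row written
             ->order_by('tblTrees.id', 'ASC')
             ->limit($chunk);
    $rows = $this->db->get()->result_array();     // no table arg: from() set it
    foreach ($rows as $row) {
        fputcsv($tree_handle, $row);
        // With SELECT *, 'id' resolves to the last joined table's column
        // (tblTrees.id here); select it explicitly if that assumption worries you.
        $lastId = $row['id'];
    }
} while (count($rows) === $chunk);               // short chunk means we're done
```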