So I have this huge file containing 45K+ arrays, and I can't load the whole file on a live server with high traffic on every request, so I split it with array_chunk($array, 1000)
and saved the chunks in 46 files.
Now I want to read those files when a specific page is accessed.
Problem? $offset seems to work fine for some pages, but for most pages it becomes a negative number. I checked pages 25, 50, 75 and more. My math is kinda (very) weak, so any help will be appreciated. Thanks!
<?php
$page = ! empty( $_GET['page'] ) ? (int) $_GET['page'] : 1;
$limt = 40;    // Page array/item limit
$fcnt = 46;    // Total number of files
$tarr = 45187; // Total number of arrays

$fmu0 = (int)($limt * $fcnt * $page / $tarr); // Getting the file number according to page number

if (file_exists("$fmu0.json")) {
    $id = json_decode(file_get_contents("$fmu0.json"), true);
    // Each file has 1000 arrays except the last file, which has 187 arrays
    $tpgs = ceil($tarr / $limt); // Total number of pages
    $mult = $fmu0 * count($id);
    $offset = ($page - 1) * $limt - $mult;
    if ($offset < 0) { $offset = 0; }
    $id = array_slice($id, $offset, $limt);
    var_dump($id);
}
?>
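For reference, tracing the formulas above for page 25 (one of the pages the question mentions) shows exactly where the minus number comes from, using the same constants as the code:

```php
<?php
// Trace the original formulas for page 25 (constants from the question).
$page = 25;
$limt = 40;    // items per page
$fcnt = 46;    // number of chunk files
$tarr = 45187; // total number of items

$fmu0 = (int)($limt * $fcnt * $page / $tarr); // (int)(46000 / 45187) = 1
$mult = $fmu0 * 1000;                         // a full file holds 1000 items
$offset = ($page - 1) * $limt - $mult;        // 960 - 1000 = -40

var_dump($fmu0, $offset);
```

Page 25's items (indexes 960-999) are still inside file 0, but the formula already points at file 1, so the offset goes negative; clamping it to 0 then silently returns the wrong rows.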
Answer
1000 objects per file and 40 objects per page makes 25 pages per file.
Here’s how to find the file containing the objects for page number $page:
$fmu0 = floor(($page - 1) / 25);
And here’s how to find the starting index of the group of 40 ($limt) objects within that file corresponding to $page, when the first page is 1:
$offset = (($page - 1) * $limt) - ($fmu0 * 1000);
<?php
$page = (!empty($_GET['page']) && 0 < $_GET['page']) ? (int)$_GET['page'] : 1;
$limt = 40; // Page array/item limit

// FIND FILE, 25 pages per file (pages are 1-based, so use $page - 1)
$fmu0 = (int)floor(($page - 1) / 25);

if (file_exists("$fmu0.json")) {
    $id = json_decode(file_get_contents("$fmu0.json"), true);
    // FIND GROUP of 40 ($limt) page objects, 1000 objects per file
    $offset = (($page - 1) * $limt) - ($fmu0 * 1000);
    $id = array_slice($id, $offset, $limt);
    var_dump($id);
}
?>
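As a sanity check, the math can be verified at the page boundaries without touching any files. Note that because pages are 1-based, the file index must be computed from $page - 1; otherwise pages 25, 50, 75 (the exact pages the question flags) land in the wrong file:

```php
<?php
// Check the file/offset math at page boundaries (1000 items per file, 40 per page).
$limt = 40;
$map  = [];

foreach ([1, 25, 26, 50, 51] as $page) {
    $fmu0   = (int)floor(($page - 1) / 25);           // 25 pages per file
    $offset = (($page - 1) * $limt) - ($fmu0 * 1000); // index inside that file
    $map[$page] = [$fmu0, $offset];
}

print_r($map);
// page 1  -> file 0, offset 0
// page 25 -> file 0, offset 960  (last page of file 0)
// page 26 -> file 1, offset 0    (first page of file 1)
```

Every boundary page maps to offset 960 of its file and the page after it to offset 0 of the next file, so the offset can never go negative for a valid page number.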