Why does using DOMDocument make my site load slower?

I’m using DOMDocument with XPath to load some data into my site from an external (fast) website.

Right now I use 4 URLs (please see below). I need to increase that to 8 URLs.

What I have noticed is that the more of those you add, the slower the site loads.

Is there any way to use XPath for a faster load?

Or maybe there’s at least some way to load the data on website1 (the child website) and, once it has loaded, include that data in my main website.

Any tips would be appreciated.

<?php
$parent_title = get_the_title( $post->post_parent );

// fetch the remote page and parse it, suppressing HTML parse warnings
$html_string = file_get_contents('weburladresshere');
$dom = new DOMDocument();
libxml_use_internal_errors(true);
$dom->loadHTML($html_string);
libxml_clear_errors();

// run the XPath query and print the matching node values
$xpath = new DOMXpath($dom);
$rows = $xpath->query('myquery');
foreach ($rows as $value) {
    print($value->nodeValue);
}
?>


Answer

It’s slow because you load external sites on every request. Instead of loading them just in time, try loading them “in the background” via another PHP job and saving them to a temporary file. Then you can load the HTML from your local temp file, which is much faster than fetching the remote $html_string via file_get_contents each time.
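
That “background job” can be a small script run periodically (e.g. from cron) that downloads each page into a temp file, so the site itself only ever reads local copies. A minimal sketch, where the function name, URL list, and cache location are illustrative assumptions rather than part of the original answer:

```php
<?php
// refresh-cache.php — run periodically so the remote pages are already
// on disk when the site renders. URLs below are placeholders.
function refreshCache(array $urls) {
    foreach ($urls as $url) {
        $html = file_get_contents($url);
        if ($html !== false) {
            // cache each page under an md5-derived temp filename
            file_put_contents(sys_get_temp_dir() . '/' . md5($url) . '.tmp', $html);
        }
    }
}
```

A crontab entry such as `* * * * * php /path/to/refresh-cache.php` would then refresh the local copies every minute, independently of your site’s page loads.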

Extended answer

Here is a very lightweight example of how you could handle it.

function getPageContent($url) {
    $filename = md5($url).'.tmp';

    // implement your extended cache logic here
    // for example: store it just for 60 seconds...
    if (!file_exists($filename)) {
        // fetch the remote page once and cache it locally
        file_put_contents($filename, file_get_contents($url));
    }
    return file_get_contents($filename);
}

function businessLogic($url) {
    $htmlContent = getPageContent($url);
    // your business logic here
}

businessLogic($url);
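
If you want the 60-second expiry hinted at in the comment above, the cache check can compare the file’s age against a TTL. A minimal sketch, assuming a TTL parameter and temp-directory file naming that are my own illustrative choices:

```php
<?php
// Fetch a URL through a file cache that expires after $ttl seconds.
function getCachedPageContent($url, $ttl = 60) {
    $filename = sys_get_temp_dir() . '/' . md5($url) . '.tmp';

    // Re-download when the cache file is missing or older than $ttl seconds.
    if (!file_exists($filename) || (time() - filemtime($filename)) > $ttl) {
        $html = file_get_contents($url);
        if ($html !== false) {
            file_put_contents($filename, $html);
        }
    }
    return file_get_contents($filename);
}
```

Note that `filemtime()` results are cached by PHP’s stat cache, so long-running scripts may need `clearstatcache()` before re-checking the file’s age.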
User contributions licensed under: CC BY-SA