Laravel – Sitemap creation is too slow (Apache and Node.js on the same server)

I am running two Angular 2+ projects and a Laravel API together on the same server. The first Angular project is the public website (SSR enabled); the second is the admin panel (SSR not enabled). Both projects consume the Laravel API. I have a strange problem with sitemap creation.

I am using the spatie/laravel-sitemap library for sitemap creation, and the Laravel API runs on Apache.

The problem: the sitemap creation process takes far too long (approx. 70 seconds). This started after I enabled server-side rendering on the public Angular project; if I disable SSR, the problem does not occur. While the SSR project is running on Node.js, the sitemap creation process drives Node.js resource usage up (CPU up to 85%), and the whole process takes about a minute to complete. This is strange, because Laravel runs under Apache and the sitemap is created from the admin panel, which has SSR disabled, so the request shouldn't involve the Node.js server at all.

Here is my SSR-enabled Angular project when idle: [screenshot: ssr enabled angular]

Here is my process list: [screenshot]

This screenshot was taken while executing the following code: $sitemap = SitemapGenerator::create($website->url)->getSitemap();

As you can see, the Node process is consuming all of the CPU resources.

How did I determine that sitemap generation is what slows performance down?

Here is the sitemap generation code:

    if ($posts == null || count($posts) == 0) {
        return 'No link data was found for this website!';
    } else {
        $sitemap = SitemapGenerator::create($website->url)->getSitemap(); // <-- This line

        return;
        // Code never reaches here; early return left in for testing purposes
        foreach ($posts as $post) {
            // do something
        }
    }

If I comment out the $sitemap = SitemapGenerator::create($website->url)->getSitemap(); line, the request finishes in 2 seconds; otherwise it takes 70 seconds. While this line is executing, Node.js CPU usage spikes.
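For reference, a minimal sketch of how I confirmed the timing (the microtime() calls and log line are mine, not part of the original code):

    $start = microtime(true);

    // The suspect call: crawls the site to build the sitemap
    $sitemap = SitemapGenerator::create($website->url)->getSitemap();

    // Logs roughly 70s with SSR running, ~2s with SSR stopped
    Log::info('Sitemap generated in ' . round(microtime(true) - $start, 2) . 's');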

Also, if I stop the SSR process from the terminal with sudo pm2 stop ssr.website, I can generate my sitemap in 2 seconds.


So what kind of relationship exists between this request and the Node.js process?

Is this an Apache bug?

Is this a Node.js bug? Or is it something related to this library?

How can I fix it?

My Apache configuration file for SSR enabled website:

    <VirtualHost *:80>
        ServerAdmin info@example.com
        DocumentRoot /var/www/html/example.com/dist/browser

        ErrorLog ${APACHE_LOG_DIR}/error.log
        CustomLog ${APACHE_LOG_DIR}/access.log combined

        ServerName example.com
        ServerAlias api.example.com

        <Directory /var/www/html/example.com>
            Options -Indexes +FollowSymLinks

            RewriteEngine on

            # Don't rewrite files or directories
            RewriteCond %{REQUEST_FILENAME} -f [OR]
            RewriteCond %{REQUEST_FILENAME} -d
            RewriteRule ^ - [L]

            # Rewrite everything else to index.html to allow HTML5 state links
            RewriteRule ^ index.html [L]
        </Directory>
        ProxyPreserveHost On
        ProxyPass / http://localhost:4000/
        ProxyPassReverse / http://localhost:4000/
        ErrorDocument 403 https://example.com/403/
    </VirtualHost>

My Apache configuration file for Laravel API:

    <VirtualHost *:80>
        ServerAdmin info@example.com
        DocumentRoot /var/www/html/api.example.com/public
        ErrorLog ${APACHE_LOG_DIR}/error.log
        CustomLog ${APACHE_LOG_DIR}/access.log combined

        ServerName api.example.com
        ServerAlias www.api.example.com

        <Directory /var/www/html/api.example.com/public>
            AllowOverride All
        </Directory>
    </VirtualHost>

My environment:

  • Node v10.22.0
  • Apache v2.4.29
  • PHP v7.4.14
  • Laravel v6.0
  • Laravel Sitemap v5.5

Server:

  • Ubuntu Server – 18.04.4
  • 8 GB RAM

Thank you


Answer

From what I understand, this library works by dynamically crawling the website. I don't know how many pages you have, but it can take a while, since the crawler forces Angular Universal to render every page in order to extract the links.
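If you stay with the crawler approach, the package exposes a few options to limit the load on the Node.js process; a sketch (the concrete values are illustrative, tune them for your site):

    use Spatie\Sitemap\SitemapGenerator;

    SitemapGenerator::create($website->url)
        ->setConcurrency(2)          // fewer parallel requests hitting the SSR process
        ->setMaximumCrawlCount(500)  // hard cap on the number of pages crawled
        ->writeToFile(public_path('sitemap.xml'));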

If you disable SSR, I doubt the sitemap will come out as expected, since the crawler cannot retrieve real page content: without SSR, the rendered HTML contains only the JS/CSS bootstrap links.

A better solution could be to generate the sitemap yourself: add the static links first, then use your API to generate the list of dynamic pages (product/xxx, product/yyy); see the sketch below.
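A minimal sketch of that approach with the same package, assuming a Post model with a slug column (both the model and the URL pattern are my assumptions; adapt them to your schema):

    use Spatie\Sitemap\Sitemap;
    use Spatie\Sitemap\Tags\Url;

    $sitemap = Sitemap::create()
        // Static pages first
        ->add(Url::create('/'))
        ->add(Url::create('/about'))
        ->add(Url::create('/contact'));

    // Dynamic pages straight from the database; no crawling,
    // so the Node.js SSR process is never touched
    foreach (Post::all() as $post) {
        $sitemap->add(
            Url::create("/product/{$post->slug}")
                ->setLastModificationDate($post->updated_at)
        );
    }

    $sitemap->writeToFile(public_path('sitemap.xml'));

Since this issues no HTTP requests, it never triggers Angular Universal rendering, regardless of whether SSR is running.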
