
PHP Timeout and TOO_MANY_REDIRECTS

Here is the situation:

I have an import running in PHP (basically, you can consider it a big while loop). But as there is a lot of data (hours of data to import), I can’t do it in one request, otherwise I hit the PHP timeout error after 10 minutes.

To avoid that timeout, I’ve decided to cut my import into many parts: basically, I call the same URL again, increasing the offset parameter by a thousand every 5 minutes.
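Something like this (a sketch; importBatch() and hasMoreRows() are placeholders for the real logic):

<?php
// Redirect-based chunking: import one batch, then redirect to the
// same script with an increased offset.
$offset = (int) ($_GET['offset'] ?? 0);
$batchSize = 1000;

importBatch($offset, $batchSize); // placeholder for the actual import

if (hasMoreRows($offset + $batchSize)) { // placeholder check
    // Send the browser back to the same script with a bigger offset.
    header('Location: import.php?offset=' . ($offset + $batchSize));
    exit;
}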


This also works… but after a number of redirects, I hit the “too many redirects” error.

This issue is tagged chrome, but if you have a solution for another browser I’ll take it.

My question is: is there a way in Chrome to increase the number of redirects that are allowed?

Or maybe the fix could be to temporarily remove the timeout from PHP? I’m struggling to decide what the best solution is. How would I do that?


Answer

First of all, I would not recommend relying on those redirects. It would be much better to just set:

max_execution_time = 0

You don’t have to change this setting for all of PHP; you can set it inside your import script.
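For example, at the top of the import script (set_time_limit() and ini_set() are built-in PHP functions; a value of 0 means no limit):

<?php
// Disable the execution time limit for this script only.
set_time_limit(0);

// Equivalent via the ini setting:
ini_set('max_execution_time', '0');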

Is it possible for you to change the source file of your import?

It would be better to break this file into smaller ones; then you could use a message broker (e.g. RabbitMQ) to queue the files one by one for the import script.
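A minimal producer sketch using the php-amqplib client (assuming a local RabbitMQ instance with default credentials and a hypothetical chunks/ directory) could look like:

<?php
require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

// Assumes RabbitMQ on localhost with the default guest account.
$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();
$channel->queue_declare('import_files', false, true, false, false);

// Publish each chunk file path; the import script consumes them one by one.
foreach (glob(__DIR__ . '/chunks/*.csv') as $chunkPath) {
    $channel->basic_publish(new AMQPMessage($chunkPath), '', 'import_files');
}

$channel->close();
$connection->close();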

If you can’t change the source file because it comes from an external source, you can chunk it yourself in your script. Then try to queue those chunks and import them one after another using a CRON job or something similar.
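A sketch of that cron-driven variant, with progress tracked in a small state file (readChunk() and importRows() are hypothetical helpers standing in for the real logic):

<?php
// Runs every few minutes via cron, e.g.: */5 * * * * php import_chunk.php
$stateFile = __DIR__ . '/import.offset'; // tracks how far we got
$chunkSize = 1000;

$offset = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

// readChunk() and importRows() are placeholders for the real logic.
$rows = readChunk('/path/to/source.csv', $offset, $chunkSize);
if ($rows === []) {
    exit; // nothing left to import
}

importRows($rows);
file_put_contents($stateFile, (string) ($offset + $chunkSize));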

What is happening during this import? Maybe you are simply trying to do too much in one go?

EDIT 2022-06

I am just curious whether people are using yield instead of returning the whole data set read from a file during such imports. To save the server’s memory, doing so is highly recommended. It could be used like this (the body below is a sketch assuming a CSV source):

public function readFile(string $filePath): iterable
{
    $file = new SplFileObject($filePath);
    $file->setFlags(SplFileObject::READ_CSV); // assumes a CSV source
    while (!$file->eof()) {
        $row = $file->fgetcsv();
        // Skip blank trailing lines.
        if ($row === false || $row === [null]) {
            continue;
        }
        yield $row;
    }
}

Using the yield statement here gives huge memory savings (especially while loading big files) and makes it possible to work smoothly on huge amounts of data.
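The caller can then stream rows one at a time instead of loading the whole file (importRow() stands in for the real per-row handling):

// Only one row is held in memory at a time.
foreach ($importer->readFile('/path/to/source.csv') as $row) {
    importRow($row); // hypothetical per-row import
}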
