
Symfony2 Process component – unable to create pipe and launch a new process

I’m using the Symfony2 Process component to manually manage a pool of processes.

In the example below I restart 2 simple processes every 2 seconds and monitor what happens. The application breaks after restarting these processes a few hundred times.

Execution is stopped and I get the following PHP warning:

proc_open(): unable to create pipe Too many open files

and then the following exception is thrown by the Symfony Process component:

[Symfony\Component\Process\Exception\RuntimeException]
Unable to launch a new process.

I’ve manually monitored the total number of open processes and it never rises above the expected limit.
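For reference, here is a minimal sketch of how one might watch the descriptor count from inside the script (assuming a POSIX system with lsof installed; the countOpenDescriptors() helper is hypothetical):

    // Hypothetical helper: count the file descriptors currently held by this
    // PHP process, by shelling out to lsof (assumes a POSIX system).
    // The count includes lsof's header line, but the trend is what matters.
    function countOpenDescriptors()
    {
        $output = shell_exec(sprintf('lsof -p %d 2>/dev/null | wc -l', getmypid()));

        return (int) trim($output);
    }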

The simplified snippet below is part of a Symfony2 command and is run from the CLI (e.g. app/console hamster:run):

    $processes[] = new Process("ls > /dev/null", null, null, null, 2);
    $processes[] = new Process("date > /dev/null", null, null, null, 2);

    $sleep = 1; // polling interval in seconds (assumed; the original snippet does not define it)

    while (count($processes) > 0) {
        foreach ($processes as $i => $process) {
            if (!$process->isStarted()) {
                $process->start();

                continue;
            }

            try {
                $process->checkTimeout();
            } catch (\Exception $e) {
                // Don't stop main thread execution
            }

            if (!$process->isRunning()) {
                // All processes are timed out after 2 seconds and restarted afterwards.
                // restart() clones the process and returns the new instance, so keep it.
                $processes[$i] = $process->restart();
            }
        }

        usleep($sleep * 1000000);
    }

This application is being run on a Mac server running OS X 10.8.4.

I would appreciate any hints on how to track down the root cause of this issue.

Update #1: I’ve simplified my function to work with basic commands like ls and date for faster testing. It still looks like the Process component fails after starting and stopping about 1000-1500 processes.

I suspected that proc_close() was not being called correctly for each process, but further investigation revealed that’s not the case here.
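As a sanity check, one can compare descriptor counts before and after stopping a process (a rough sketch reusing the hypothetical countOpenDescriptors() helper above):

    // Rough check: if the pipes are released on close, the descriptor count
    // should drop back toward its pre-start value once the process is stopped.
    $probe = new Process("ls > /dev/null");
    $before = countOpenDescriptors();
    $probe->start();
    $running = countOpenDescriptors();
    $probe->stop();
    printf("fds before: %d, while running: %d, after stop: %d\n",
        $before, $running, countOpenDescriptors());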


Answer

The file handles are not being garbage collected, so they eventually hit a limit (an OS limit or a PHP limit, I’m not sure which), but you can fix it by adding an explicit garbage-collection call:

gc_collect_cycles(); // releases the pipe handles held by dead Process objects
usleep($sleep * 1000000);

Also, be forewarned that garbage collection doesn’t work very well inside a foreach loop, because of the way PHP maps the temporary variables of a foreach ($foo as $bar => $var) construct into memory. If you need it in that part of the code, you could switch to something like the following instead, which I think should allow garbage collection inside the loop:

$processes[] = new Process("ls > /dev/null", null, null, null, 2);
$processes[] = new Process("date > /dev/null", null, null, null, 2);

$sleep = 0;

do {
    $count = count($processes);
    for ($i = 0; $i < $count; $i++) {
        if (!$processes[$i]->isStarted()) {
            $processes[$i]->start();

            continue;
        }

        try {
            $processes[$i]->checkTimeout();
        } catch (\Exception $e) {
            // Don't stop main thread execution
        }

        if (!$processes[$i]->isRunning()) {
            // All processes are timed out after 2 seconds and restarted afterwards.
            // restart() clones the process and returns the new instance, so keep it.
            $processes[$i] = $processes[$i]->restart();
        }

        gc_collect_cycles();
    }

    usleep($sleep * 1000000);
} while ($count > 0);