Speeding up PHP - Using process forking to accelerate image resizing

Most of my projects at work involve resizing images in PHP applications. Usually the result is loaded via AJAX, so it needs to be generated as quickly as possible for a good user experience. Recently I discovered that PHP can fork processes, which makes it possible to run several functions at the same time. In the example below I'll demonstrate how parallel processing can be used in PHP to speed up the generation of several different sizes of preview image from a large uploaded image.

This code uses the PCNTL extension, which came installed as standard with PHP 5.3 on my Ubuntu 11.04 machine.
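The extension is Unix-only and can be compiled out, so if you're unsure whether your build includes it, a quick guard like this sketch lets a script fail early (the `pcntl_available()` helper is my own, not part of PHP):

```php
<?php
// Illustrative check: PCNTL is a Unix-only extension and may not be
// compiled into every PHP build, so test for it before trying to fork.
function pcntl_available()
{
    return extension_loaded('pcntl') && function_exists('pcntl_fork');
}

if (!pcntl_available()) {
    fwrite(STDERR, 'The PCNTL extension is required for the forking example.' . PHP_EOL);
}
```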

Conventionally, to generate a few different sizes of image from a large JPEG, one might use something like this:

<?php

/**
 * Create a thumbnail image from the specified file
 * 
 * @param int $height output image height
 * @param int $width  output image width
 */

function create_preview($height, $width)
{
    //Input image
    $img = 'test.jpg';
    
    //Output image
    $new_img = 'test-'.$width.'x'.$height.'.jpg';
    
    //Get the current image size
    list($full_width, $full_height) = getimagesize($img);

    //Create a blank true-colour image to hold the resized copy
    $tn = imagecreatetruecolor($width, $height);

    //Load the original file
    $image = imagecreatefromjpeg($img);

    //Do the resize (note this does not preserve the aspect ratio)
    imagecopyresampled($tn, $image, 0, 0, 0, 0, $width, $height, $full_width, $full_height);

    //Save the file at maximum JPEG quality
    imagejpeg($tn, $new_img, 100);
}        

// The thumbnail sizes to be generated
$image_sizes = array(
    array(100,100),
    array(200, 200),
    array(300, 300),
    array(400, 400),
    array(500, 500),
    array(1000, 1000)
);

// Loop through the images sequentially
foreach($image_sizes as $image_size)
{
    create_preview($image_size[0], $image_size[1]);
}

?>

To get an idea of the speed of this script we can run it from the command line:

time php single_process.php

This takes a couple of seconds, as you might expect; in my case:

real   0m2.325s
user    0m2.210s
sys 0m0.100s

However, rather than performing the resizing sequentially, it can be performed in parallel using the same create_preview() function as before:

<?php

// The thumbnail sizes to be generated
$image_sizes = array(
    array(100,100),
    array(200, 200),
    array(300, 300),
    array(400, 400),
    array(500, 500),
    array(1000, 1000)
);

//Counter for number of processes
$i = 1;

//Loop through the image sizes but this time fork a process for each size.
foreach($image_sizes as $image_size)
{
    //Fork a process
    $pid = pcntl_fork();

    if ($pid == -1)
    {
        die('Could not fork');
    }

    //If we're in a child process then grab an image size and process it.
    if (!$pid)
    {
        echo 'starting child ', $i, PHP_EOL;
        create_preview($image_size[0], $image_size[1]);

        //Exit, otherwise the child will continue the loop and create all the remaining thumbnails itself
        exit();
    }

    $i++;
}

//Wait for all the subprocesses to complete to avoid zombie processes
foreach($image_sizes as $image_size)
{
    pcntl_wait($status);
} 
?>
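pcntl_wait() simply reaps whichever child finishes first. As a sketch of a slightly more defensive variant (not from the original script), the parent can record each child's PID and reap it explicitly with pcntl_waitpid(), which also exposes each child's exit status:

```php
<?php
// Illustrative sketch: track each child PID and reap it explicitly,
// checking the exit status as we go. The simulated work stands in for
// the real create_preview() calls.
$pids = array();

foreach (array(1, 2, 3) as $n) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        die('Could not fork' . PHP_EOL);
    } elseif ($pid == 0) {
        // Child: simulate some work, then exit cleanly.
        usleep(10000 * $n);
        exit(0);
    }
    // Parent: remember the child's PID so we can wait on it later.
    $pids[] = $pid;
}

foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
    if (pcntl_wifexited($status)) {
        echo 'child ', $pid, ' exited with code ', pcntl_wexitstatus($status), PHP_EOL;
    }
}
```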

Running the script from the command line again, the advantage is immediately noticeable:

time php multi_process.php

starting child 1
starting child 2
starting child 3
starting child 4
starting child 5
starting child 6

real    0m0.693s
user    0m3.030s
sys 0m0.140s

As you can see from this very unscientific test, running the processes in parallel speeds things up by more than 3.5x. As long as you're careful to wait for the child processes to terminate, rather than leaving them unattended, this seems like a great way to speed up simple repetitive tasks, and I'd certainly hoped to find a place for it in some of my applications in the future. However, I probably won't use it in a production web application because of the warning that Paul kindly posted below: the manual advises against using the process control functions within a web server environment. I guess I should read the manual before getting excited...
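Given that warning, one simple precaution (my own suggestion, not from the manual) is to refuse to run scripts like this under anything other than the CLI SAPI:

```php
<?php
// pcntl is intended for CLI use only, so bail out under a web SAPI.
if (PHP_SAPI !== 'cli') {
    die('This script must be run from the command line.' . PHP_EOL);
}
echo 'Running under the ', PHP_SAPI, ' SAPI', PHP_EOL;
```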