Blender 2.64 has a new compositor node called the “Bokeh Image”. The node generates reference images that can be used with blurs, giving the user more flexibility in matching the blur to the camera footage. But how does this node work? We want to know whether a pixel is inside the bokeh shape, and we determine this by comparing distances. The following steps give an overview of how the node works.

Step 1. First we calculate the distance between the center of the image and the pixel.
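In code, step 1 is just a Euclidean distance. A minimal sketch in Python (the coordinates are sample values, not part of the node):

```python
import math

cx, cy = 128.0, 128.0   # image center (sample values)
x, y = 160.0, 152.0     # pixel being evaluated (sample values)

# Step 1: Euclidean distance from the center to the pixel.
d_pixel = math.hypot(x - cx, y - cy)  # 40.0 for these sample values
```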

Step 2. We calculate the intersection point of the line from the center through the pixel being evaluated and the line between the two corners of the flap. From this intersection point we calculate the distance to the center of the image.
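For a regular polygon with its corners on a circle, this intersection distance has a closed form. A Python sketch, assuming the flap corners sit at angles k·2π/flaps (the function name and conventions are mine, not the node's):

```python
import math

def edge_distance(theta, flaps, radius):
    """Distance from the center to the flap edge along direction theta.

    Assumes a regular polygon whose corners lie on a circle of the
    given radius, at angles k * 2*pi/flaps (flap rotation omitted).
    """
    sector = 2.0 * math.pi / flaps
    # angle inside the current sector, centered on the edge midpoint
    local = (theta % sector) - sector / 2.0
    # distance to the chord between the two neighbouring corners
    return radius * math.cos(sector / 2.0) / math.cos(local)
```

Looking toward a corner (theta a multiple of 2π/flaps) this returns the full radius; looking at the middle of an edge it returns radius·cos(π/flaps), the shortest distance.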

Step 3. We interpolate between the distance of the intersection point and the radius of the circle. This interpolation simulates the rounding of the camera flaps.
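Step 3 is a plain linear interpolation. A sketch, where `rounding` is my name for the node's rounding setting:

```python
def outer_distance(d_edge, radius, rounding):
    # Blend the flap-edge distance toward the circle radius:
    # rounding = 0 keeps the hard polygon edge, rounding = 1 is a circle.
    return (1.0 - rounding) * d_edge + rounding * radius
```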

Step 4. By interpolating the outer edge distance with the catadioptric setting we find the distance of the inner edge. We then check whether the distance of the pixel (step 1) lies between the inner and the outer edge. If it does, the pixel is inside the bokeh shape.
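Putting the four steps together gives a single inside/outside test per pixel. A self-contained sketch, assuming the inner edge is simply the outer edge scaled by the catadioptric value (parameter names and defaults are assumptions, not the node's actual internals):

```python
import math

def inside_bokeh(x, y, cx, cy, flaps=6, radius=100.0,
                 rounding=0.0, catadioptric=0.0):
    """Return True when pixel (x, y) lies inside the bokeh shape."""
    # Step 1: distance from the image center to the pixel.
    dx, dy = x - cx, y - cy
    d_pixel = math.hypot(dx, dy)

    # Step 2: distance from the center to the flap edge along the
    # pixel's direction (corners assumed at angles k * 2*pi/flaps).
    sector = 2.0 * math.pi / flaps
    local = (math.atan2(dy, dx) % sector) - sector / 2.0
    d_edge = radius * math.cos(sector / 2.0) / math.cos(local)

    # Step 3: round the flaps by blending the edge toward the circle.
    d_outer = (1.0 - rounding) * d_edge + rounding * radius

    # Step 4: the catadioptric setting scales the outer edge inward
    # to give the inner edge; inside means between the two edges.
    d_inner = catadioptric * d_outer
    return d_inner <= d_pixel <= d_outer
```

With catadioptric left at 0 this gives a filled polygon or circle; raising it hollows out the center, producing the donut highlights typical of mirror lenses.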

Doing this for every pixel of the image generates the final bokeh image. Lens shift has been implemented by varying the radius of the circle for the R, G and B channels.
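The lens-shift idea can be sketched by running the same inside/outside test three times with a slightly different radius per channel (the scale factors below are placeholders, not Blender's actual values):

```python
def shifted_radii(radius, shift):
    # One radius per channel: red slightly larger, blue slightly
    # smaller, so the per-channel masks fringe at the edge of the
    # shape, mimicking chromatic aberration.
    return (radius * (1.0 + shift),   # R
            radius,                   # G
            radius * (1.0 - shift))   # B
```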

Some improvements can still be made to the lens-shift and catadioptric algorithms; they are far from how a real camera works. But the idea of the node was to mimic the camera internals.

The source code can be found in the SVN repository.