I’m afraid I don’t know of a model that can “predict” correction values based solely on the lens, but there have been attempts to create open-source lens databases that store this information.

The Hugin guys have a nice article on that here: http://hugin.sourceforge.net/docs/manual/Lens_correction_model.html

This algorithm wasn’t designed for barrel distortion specifically (and in fact, there are other, better algorithms for that). But as a general rule, any algorithm of this type can be modified to operate on a specific location.

In the case of the algorithm above, you’ll notice that x and y are remapped around the center of the image (using the “halfWidth” and “halfHeight” values). Simply point those at some other “center” point, and the algorithm will base its calculations on that location instead.

You can also limit the radius used, to prevent correction beyond a certain distance. (This is handy for dealing with certain types of lens aberrations where only a portion of the lens is affected, for example.)
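The original remap code isn’t reproduced here, but both modifications (an arbitrary center and a radius limit) can be sketched with a simple radial model. Everything in this sketch is my own: the function name, the `strength` parameter, and the `r * (1 + strength * r^2)` formula are illustrative stand-ins, not the article’s actual algorithm.

```python
import math

def remap_barrel(width, height, cx, cy, strength, max_radius=None):
    """Build a (src_x, src_y) lookup table for a simple radial correction.

    cx, cy     -- correction center; need not be the image center
    strength   -- hypothetical distortion coefficient (illustrative only)
    max_radius -- pixels farther than this from (cx, cy) are left untouched
    """
    table = {}
    for y in range(height):
        for x in range(width):
            dx, dy = x - cx, y - cy
            r = math.hypot(dx, dy)
            if max_radius is not None and r > max_radius:
                table[(x, y)] = (x, y)  # beyond the limit: no correction
                continue
            # Simple radial model: push samples outward as r grows
            factor = 1 + strength * r * r
            table[(x, y)] = (cx + dx * factor, cy + dy * factor)
    return table
```

Because the center point is a parameter, pointing it at any feature of interest makes the correction radiate from there, exactly as described above.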

Anyway, I don’t know if any of this is useful, but maybe it’s a start. This article has produced a lot of questions about correcting barrel distortion, specifically, so maybe it’s time for me to write a “Part 2”… :)

However, if you only consider the narrower problem of computing the average of two hue components, a number of solutions can be found. I came up with three different algorithms, which I implemented in Python.

The first idea is to adopt a very simple yet efficient phase-unwrapping approach. We first unwrap the data if needed (that is, if the derivative, or here simply the difference, exceeds a threshold of 180), then perform the computation, and finally make sure the result lands back in the correct domain.
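The first idea might be sketched as follows (the function name is my own, and hues are assumed to be in degrees on [0, 360)):

```python
def hue_average_unwrap(a, b):
    """Average two hues (degrees, [0, 360)) via simple phase unwrapping."""
    if abs(a - b) > 180:    # difference exceeds the threshold:
        if a < b:           # unwrap by shifting the smaller hue
            a += 360        # up by a full turn
        else:
            b += 360
    return ((a + b) / 2) % 360  # wrap back into the original domain
```

For example, averaging 0 and 340 first unwraps 0 to 360, averages to 350, and 350 is already in range.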

The second idea starts to reintroduce the cyclicity. The second hue B is shifted into a domain centered on A (i.e., within A ± 180), the average is computed, and we switch back to the original domain.
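A sketch of the second idea (again with my own naming; the recentering trick is the standard modular-arithmetic shift into (A − 180, A + 180]):

```python
def hue_average_recenter(a, b):
    """Average two hues by shifting b into a domain centered on a."""
    # Map b into the interval (a - 180, a + 180]
    b_centered = a + ((b - a + 180) % 360) - 180
    # Average in that local domain, then wrap back to [0, 360)
    return ((a + b_centered) / 2) % 360
```

With a = 0 and b = 340, b is recentered to −20, the average is −10, and wrapping back gives 350.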

The third idea fully embraces the cyclicity by using the complex representation (which seems only natural, since the hue is represented by a phase). We build the associated unit complex numbers and take their geometric average, which amounts to an arithmetic average of the phases.
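The third idea, sketched with the standard-library `cmath` module (function name my own):

```python
import cmath
import math

def hue_average_complex(a, b):
    """Average two hues (degrees) via their unit-complex representation."""
    za = cmath.exp(1j * math.radians(a))
    zb = cmath.exp(1j * math.radians(b))
    # The phase of the sum is the circular mean of the two phases.
    # (Undefined when the hues are exactly opposite: the sum is ~0.)
    return math.degrees(cmath.phase(za + zb)) % 360
```

Note the degenerate case: for exactly opposite hues the two unit vectors cancel and the phase of the (near-)zero sum is meaningless, which is one of the border effects mentioned below.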

I have not thoroughly tested these algorithms, and some border effects might remain (because of all the different domains involved), but the general ideas seem to work!

You can find my code here: https://github.com/bporteboeuf/master-python/blob/master/sumUnwrap.py

Here’s another thing to consider re: blending in HSL space (which is a non-trivial exercise). If we use the convention where Hue is an angle in the range [0, 359], and we have two colors with hues of 0 and 340, how do we blend them? In RGB space we can simply take an arithmetic average, but because Hue is circular, the correct blend resolves to 350. This circular nature is what makes blending Hue complicated.
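A quick numeric illustration of why the naive average fails for that example (the recentering trick shown is a standard modular-arithmetic approach, not taken from the article):

```python
a, b = 0, 340
naive = (a + b) / 2                  # 170.0 -- lands near cyan, badly wrong
delta = ((b - a + 180) % 360) - 180  # signed shortest angular distance: -20
circular = (a + delta / 2) % 360     # 350.0 -- the correct blend
```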

And yes, you are of course welcome to use this algorithm in your own software.

Let us consider a white background. In RGB it would be (255, 255, 255), and in HSV something like (H, 0, 1) if normalized, where H is whatever hue you desire. If you were to change the color temperature, the background would become either more blueish or more yellowish, which would of course affect the hue component, but also the saturation, by actually increasing it in this example (otherwise the hue would not be visible and the background would remain white). This example is enough to show that saturation should be modified in the general case, and that your algorithm is indeed correct!