This chapter does not appear in the book.
The colored blob detection code of chapter 5 can be used as the basis of other shape analyzers, which I'll illustrate here by extending it to detect a hand and fingers. In the screenshot on the right, I'm wearing a black glove on my left hand. My Handy application attempts to find and label the thumb, index, middle, ring, and little finger. Yellow lines are drawn between the fingertips and the center-of-gravity (COG) of the hand.
I utilized the HSVSelector application of chapter 5 to determine suitable HSV ranges for the black glove. These ranges are loaded by Handy prior to executing the steps shown below to obtain the hand's contour, its COG, and orientation relative to the horizontal.
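The COG and orientation come from the image moments of the thresholded hand region. As a minimal sketch of the underlying math (plain Java, no OpenCV; in Handy these values are obtained with OpenCV's moment functions, and the class and method names below are my own invention), the COG is (m10/m00, m01/m00), and the axis of least inertia follows from the second-order central moments:

```java
// Sketch: center-of-gravity and orientation from image moments.
// The mask is a binary image of the thresholded hand (1 = hand pixel).
public class HandMoments {
    // returns { cogX, cogY, angle in degrees relative to the horizontal }
    public static double[] analyze(int[][] mask) {
        // zeroth- and first-order moments give the area and COG
        double m00 = 0, m10 = 0, m01 = 0;
        for (int y = 0; y < mask.length; y++)
            for (int x = 0; x < mask[y].length; x++)
                if (mask[y][x] != 0) { m00++; m10 += x; m01 += y; }
        double cx = m10 / m00, cy = m01 / m00;

        // second-order central moments give the axis of least inertia
        double mu20 = 0, mu02 = 0, mu11 = 0;
        for (int y = 0; y < mask.length; y++)
            for (int x = 0; x < mask[y].length; x++)
                if (mask[y][x] != 0) {
                    mu20 += (x - cx) * (x - cx);
                    mu02 += (y - cy) * (y - cy);
                    mu11 += (x - cx) * (y - cy);
                }
        double theta = 0.5 * Math.atan2(2 * mu11, mu20 - mu02);
        return new double[] { cx, cy, Math.toDegrees(theta) };
    }
}
```

For example, a mask consisting of a single horizontal strip of pixels reports an angle of 0 degrees, while a vertical strip reports 90 degrees.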
The various stages shown above are almost identical to those carried out by the ColorRectDetector.findRect() method in section 4.1 of chapter 5. However, Handy continues processing, employing a convex hull and convexity defects to locate and label the fingertips within the hand contour. These additional steps are shown below.
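To give a feel for what the hull computation does, here is a self-contained sketch of a convex hull using Andrew's monotone chain algorithm (my own illustrative code, not Handy's, which calls OpenCV's built-in hull and convexity-defect functions instead). A convexity defect is then the deepest contour point lying between two consecutive hull points; the fingertips correspond to the hull points, and the valleys between fingers to the defects.

```java
import java.util.*;

// Sketch: convex hull via Andrew's monotone chain algorithm.
// Points are {x, y} pairs; the hull is returned in counter-clockwise
// order (for the usual mathematical axes).
public class HullSketch {
    // z-component of the cross product (a - o) x (b - o);
    // positive for a counter-clockwise turn
    static long cross(int[] o, int[] a, int[] b) {
        return (long) (a[0] - o[0]) * (b[1] - o[1])
             - (long) (a[1] - o[1]) * (b[0] - o[0]);
    }

    public static List<int[]> convexHull(List<int[]> pts) {
        if (pts.size() < 3)
            return new ArrayList<>(pts);
        List<int[]> p = new ArrayList<>(pts);
        p.sort((a, b) -> a[0] != b[0] ? a[0] - b[0] : a[1] - b[1]);
        List<int[]> hull = new ArrayList<>();
        for (int pass = 0; pass < 2; pass++) {   // lower hull, then upper hull
            int start = hull.size();
            for (int[] pt : p) {
                // pop points that would make a clockwise (or straight) turn
                while (hull.size() >= start + 2 &&
                       cross(hull.get(hull.size() - 2),
                             hull.get(hull.size() - 1), pt) <= 0)
                    hull.remove(hull.size() - 1);
                hull.add(pt);
            }
            hull.remove(hull.size() - 1);        // drop duplicated endpoint
            Collections.reverse(p);              // walk back for the upper hull
        }
        return hull;
    }
}
```

Running this on a square with an interior point returns only the four corners, which is the essential property the fingertip-finding stage relies on: interior contour points vanish, leaving the extremes.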
The hull and defects are obtained from the contour with standard OpenCV operations, which I'll explain below. However, the final step of naming the fingers uses a rather hacky strategy that assumes the contour's defects belong to an outstretched left hand. The thumb and index finger are located by their angular positions relative to the COG, and the remaining fingers are identified by their positions relative to those two. This process is rather fragile and can easily become confused, as shown on the right.
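The angular test can be sketched as follows. Note that this is only an illustration of the idea: the angle ranges below are hypothetical placeholders, not the values used in Handy, and the class and method names are mine. The sketch assumes image coordinates (y growing downward), so the y difference is negated to get a conventional mathematical angle.

```java
// Sketch of angle-based finger naming (hypothetical ranges, assuming
// an outstretched left hand with the fingers roughly upright).
public class FingerNamer {
    // angle of a fingertip relative to the COG, in degrees, measured
    // counter-clockwise from the positive x-axis; the y difference is
    // flipped because image y grows downward
    public static double angleFromCog(double cogX, double cogY,
                                      double tipX, double tipY) {
        return Math.toDegrees(Math.atan2(cogY - tipY, tipX - cogX));
    }

    // hypothetical ranges: a left-hand thumb sticks out low on the
    // right of the COG, with the index finger next around the arc
    public static String name(double angle) {
        if (angle >= 0 && angle < 60)   return "thumb";
        if (angle >= 60 && angle < 90)  return "index";
        return "other";     // middle, ring, little: named by adjacency
    }
}
```

Once the thumb and index finger have been picked out this way, the other fingertips can be labeled simply by their order around the hull, which is why a rotated or right hand so easily confuses the naming.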
Nevertheless, the technique is fairly reliable, usually identifying at least the thumb and index finger irrespective of the hand's orientation, which should be enough for basic gesture processing. However, the application doesn't identify gestures, which will hopefully be the subject of a later chapter.