Glad to hear that the AVRcam is up and working as expected. I haven't had the chance to compare its responsiveness to the CMUcam2 myself, but I did find it more responsive than the original CMUcam, so your tests against the CMUcam2 are a good data point.
Regarding a method to be informed when a color in the color map has been found... it sounds like you already have an application set up to receive packets from the AVRcam, and that it expects to get packets on a regular, timed basis. There are a few options here:
1) The AVRcam firmware could easily be modified to generate a tracking packet with the number of tracked objects set to 0 (the ICD states that this is an acceptable condition). This would only be a few lines of change in the firmware, and I could send you the modified file if you'd like, or even a pre-programmed mega8 with the update. But to be honest, I don't think this is the correct approach... (there's a rough sketch of the idea after this list, if you want to try it yourself).
2) I think that updating the receiving application so that it isn't dependent on the tracking packets always arriving makes more sense. In the past, I have set up simple frameworks on a host platform that receive serial bytes in an ISR, dump them into a buffer, and post a message that data has been received. A secondary task then works through the buffer and tries to assemble a well-formed tracking packet. Once a properly formed tracking packet is complete, a final handler is called to parse the data in the packet and respond accordingly (drive a servo, etc.). I'm sure you have already played through this solution in your head, and there is probably a reason why you may not want to proceed in this direction, but for the benefit of others I thought I would call it out (there's a sketch of this framework after the list as well).
3) The solution you mentioned, setting up one of the tracked colors to cover the full spectrum so that a trackable color is always in view, won't work as far as I can tell. If you have a color range set up to cover all possible colors, you'll end up with a single tracked object that encompasses the entire field of view. When a "real" object comes into view, it won't be separated out, since it obviously falls within the full range of colors previously set up and will simply be considered part of that one big object. The other problem is that you would end up with non-unique mappings of RGB triplets to colors (more than a single bit would be set in the final "color" check after the lookup tables have been processed). That is essentially an error condition and will lead to invalid results (the last sketch after this list shows what I mean).
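
For reference, here is roughly what the option 1) change would look like. This is not the actual AVRcam source: the trackedObject_t type, the UART_TxByte() routine, and the 0x0A/0xFF framing are stand-ins based on my reading of the ICD's tracking packet format, so treat it as a sketch and check it against the real firmware and ICD.

    #include <stdint.h>

    /* Hypothetical stand-ins for the firmware's own type and transmit routine. */
    typedef struct {
        uint8_t color;
        uint8_t upperLeftX, upperLeftY;
        uint8_t lowerRightX, lowerRightY;
    } trackedObject_t;

    extern void UART_TxByte(uint8_t b);

    /* Send a tracking packet for the current frame.  Calling this even when
       numObjs is 0 gives the host a packet every frame, which is the behavior
       option 1) is after (the ICD allows a zero-object packet). */
    static void sendTrackingPacket(const trackedObject_t *objs, uint8_t numObjs)
    {
        uint8_t i;

        UART_TxByte(0x0A);        /* assumed start-of-packet byte        */
        UART_TxByte(numObjs);     /* number of tracked objects (0 is OK) */

        for (i = 0; i < numObjs; i++)
        {
            UART_TxByte(objs[i].color);
            UART_TxByte(objs[i].upperLeftX);
            UART_TxByte(objs[i].upperLeftY);
            UART_TxByte(objs[i].lowerRightX);
            UART_TxByte(objs[i].lowerRightY);
        }

        UART_TxByte(0xFF);        /* assumed end-of-packet byte */
    }

The real change would simply be calling something like this unconditionally at the end of each frame, instead of only when objects were found.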
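To make option 2) concrete, here is a bare-bones version of the host-side framework I described above. The 0x0A/0xFF framing is the same assumption as in the previous sketch, serialRxIsr() stands in for whatever your platform's receive interrupt looks like, and handleTrackingPacket() is where your servo-driving (or other) logic would go.

    #include <stdint.h>

    #define RX_BUF_SIZE   64
    #define MAX_PKT_SIZE  44      /* count byte + 8 objects * 5 bytes, with headroom */

    static volatile uint8_t rxBuf[RX_BUF_SIZE];
    static volatile uint8_t rxHead = 0;      /* written by the ISR       */
    static uint8_t          rxTail = 0;      /* read by the main loop    */

    /* Call this from the platform's serial-receive interrupt with each byte. */
    void serialRxIsr(uint8_t byte)
    {
        rxBuf[rxHead] = byte;
        rxHead = (uint8_t)((rxHead + 1) % RX_BUF_SIZE);
    }

    /* Called once a complete, well-formed tracking packet has been assembled.
       pkt[0] is the object count; each object record is 5 bytes after that. */
    static void handleTrackingPacket(const uint8_t *pkt, uint8_t len)
    {
        uint8_t numObjects = pkt[0];
        if (numObjects == 0)
        {
            /* nothing tracked this frame: park the servo, flag a timeout, etc. */
        }
        /* ...otherwise parse the 5-byte object records and respond... */
        (void)len;
    }

    /* Secondary task: drain the ring buffer and assemble packets.  Call this
       from the main loop (or whenever the "data received" message posts). */
    void processRxBuffer(void)
    {
        static uint8_t pkt[MAX_PKT_SIZE];
        static uint8_t pktLen = 0;
        static uint8_t inPacket = 0;

        while (rxTail != rxHead)
        {
            uint8_t b = rxBuf[rxTail];
            rxTail = (uint8_t)((rxTail + 1) % RX_BUF_SIZE);

            if (!inPacket)
            {
                if (b == 0x0A)            /* assumed start of tracking packet */
                {
                    inPacket = 1;
                    pktLen = 0;
                }
            }
            else if (b == 0xFF)           /* assumed end of tracking packet */
            {
                handleTrackingPacket(pkt, pktLen);
                inPacket = 0;
            }
            else if (pktLen < MAX_PKT_SIZE)
            {
                pkt[pktLen++] = b;        /* payload: count byte + object records */
            }
            else
            {
                inPacket = 0;             /* overrun: resync on the next 0x0A */
            }
        }
    }

The nice side effect of structuring it this way is that "no packet for a while" becomes just another condition your main loop can check with a timer, rather than something the AVRcam has to tell you about.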
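Finally, a small illustration of the non-unique mapping problem from option 3). My understanding (from memory, so the table sizes and component names here are assumptions, not the actual firmware internals) is that each color component indexes a lookup table whose entries are bitmasks of the color-map entries covering that value, and the three masks get ANDed together; if the result has more than one bit set, the pixel can't be assigned a unique color. A catch-all color sets its bit in every table entry, so any pixel that also matches a real color comes out with two bits set:

    #include <stdint.h>
    #include <stdio.h>

    /* Sketch of the lookup-table membership check as I understand it.  Each
       table entry is a bitmask: bit i is set if color-map entry i's range
       includes that component value. */
    static uint8_t redLUT[256], greenLUT[256], blueLUT[256];

    /* Register color-map entry i as covering [lo, hi] on each component. */
    static void addColor(uint8_t i, uint8_t rLo, uint8_t rHi,
                         uint8_t gLo, uint8_t gHi, uint8_t bLo, uint8_t bHi)
    {
        for (int v = 0; v < 256; v++)
        {
            if (v >= rLo && v <= rHi) redLUT[v]   |= (uint8_t)(1 << i);
            if (v >= gLo && v <= gHi) greenLUT[v] |= (uint8_t)(1 << i);
            if (v >= bLo && v <= bHi) blueLUT[v]  |= (uint8_t)(1 << i);
        }
    }

    int main(void)
    {
        addColor(0, 0, 255, 0, 255, 0, 255);   /* "catch-all" color: full spectrum */
        addColor(1, 200, 255, 0, 60, 0, 60);   /* a "real" red-ish target color    */

        /* Membership check for a red-ish pixel: AND the three masks. */
        uint8_t mask = redLUT[230] & greenLUT[30] & blueLUT[30];

        /* mask is 0x03 here: both entry 0 and entry 1 claim the pixel, so the
           RGB-to-color mapping is no longer unique and the tracker can't label
           it cleanly. */
        printf("membership mask = 0x%02X\n", mask);
        return 0;
    }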
All this being said, I'd still say to play around with some of these ideas and see how the system behaves. There may be another solution lurking that will be fairly elegant.
Keep us posted
