
GorillaPaws (original poster):
So I was wondering if it's even remotely possible to track where the user is looking on-screen based on the video from their built-in iSight. I was thinking you could have a calibration mode where the user would look at flashing dots in the corners, at the mid-points of the sides, at the center, and at various other locations. The images snapped at those moments could then serve as the benchmarks.
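
To make the calibration idea a bit more concrete, here's a rough sketch of the kind of target grid I mean (everything here, the names and the layout, is just made up for illustration, not from any real framework):

[CODE]
// Hypothetical calibration targets, expressed as fractions of the screen
// size: the four corners, the midpoints of the sides, and the center.
// Each one would be flashed in turn while a frame is grabbed from the iSight.
let calibrationTargets: [(x: Double, y: Double)] = [
    (0.0, 0.0), (0.5, 0.0), (1.0, 0.0),   // top-left, top-center, top-right
    (0.0, 0.5), (0.5, 0.5), (1.0, 0.5),   // mid-left, center, mid-right
    (0.0, 1.0), (0.5, 1.0), (1.0, 1.0)    // bottom-left, bottom-center, bottom-right
]

// Convert a fractional target into actual screen coordinates for display.
func screenPoint(for target: (x: Double, y: Double),
                 screenWidth: Double, screenHeight: Double) -> (Double, Double) {
    return (target.x * screenWidth, target.y * screenHeight)
}
[/CODE]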

During the tracking phase, the image from the camera would be continually compared against the benchmarks (using color vectors maybe? any other suggestions?) to determine which benchmark image it most closely matches (see the rough sketch at the end of this post). Obviously this strategy isn't going to be nearly as accurate as the more sophisticated methods currently in use, but would it be so frustratingly inaccurate that it's not even worth the time to think about?

I realize "accuracy" can mean a lot of different things, but I was thinking of a 200-300pt radius circle of tracking area with around 60-75% accuracy.
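
Here's roughly what I had in mind for the matching step, just as a sketch: compare the live frame against each calibration benchmark with a simple pixel-difference measure and take the closest one. The Benchmark struct and the sum-of-squared-differences comparison are placeholders for whatever "color vector" comparison would actually work best:

[CODE]
// A benchmark image captured during calibration, paired with the
// on-screen point the user was looking at when it was snapped.
// (Pixel data is assumed to be a flattened grayscale array here;
// in a real app you'd pull this out of the iSight frame buffer.)
struct Benchmark {
    let screenPoint: (x: Double, y: Double)
    let pixels: [Double]
}

// Sum-of-squared-differences between two equal-length pixel arrays.
// Any other distance metric could be swapped in here.
func distance(_ a: [Double], _ b: [Double]) -> Double {
    precondition(a.count == b.count, "frames must be the same size")
    return zip(a, b).reduce(0.0) { $0 + ($1.0 - $1.1) * ($1.0 - $1.1) }
}

// During tracking, find the benchmark whose image most closely matches
// the current camera frame and return its screen point as the gaze guess.
// (A real implementation would cache the distances instead of recomputing
// them inside the comparison closure.)
func estimateGaze(frame: [Double], benchmarks: [Benchmark]) -> (x: Double, y: Double)? {
    return benchmarks
        .min(by: { distance(frame, $0.pixels) < distance(frame, $1.pixels) })?
        .screenPoint
}
[/CODE]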
 
There's actually a really good one called iTracker.

Here's a link to iTracker for those who are curious. iTracker is doing something different from what I was talking about, however: it translates movement of the head into actual cursor movements, whereas I was trying to see if it would be possible to translate the image of the user's face into an approximation of where on-screen they are looking. Thank you for the reference, though; it looks like a pretty impressive piece of software.
 