
Chirone

macrumors 6502
Original poster
Mar 2, 2009
Using OpenGL ES, detecting touches in 2D space is pretty simple because the coordinates map easily onto Quartz coordinates.

But when you want to know whether someone has touched an object in 3D space, how would you do it?
The coordinates aren't the same as they would be in 2D space.
 
Well, when the screen is touched, your current viewpoint is technically showing a 2D picture, so from your viewpoint and where the user pressed on the screen you can work out what they may have touched (or at least where that touch points in the 3D world).
 
Due to a nice and welcome plot twist, someone changed my code to make this easier by moving the camera to a position where one unit in 3D equals one pixel on the phone.
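With the camera placed so one world unit equals one screen pixel, as just described, picking collapses to an offset and a y-flip. A minimal sketch (the screen size and the idea of a camera-centre offset are assumptions, not the thread's actual code):

```c
/* Hypothetical screen dimensions -- adjust for the actual device. */
#define SCREEN_W 320.0f
#define SCREEN_H 480.0f

/* Map a Quartz touch point (origin top-left, y down) to world
   coordinates, given the world position the camera is centred on.
   Works only because 1 world unit == 1 pixel in this setup. */
void touch_to_world(float px, float py, float cam_x, float cam_y,
                    float *wx, float *wy)
{
    *wx = cam_x + (px - SCREEN_W / 2.0f);
    *wy = cam_y + (SCREEN_H / 2.0f - py); /* flip: GL y grows upward */
}
```

From there you can compare the result directly against object positions on that plane.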
 
Yeah, I thought of a concept similar to ray tracing, I just didn't know how to do that.

Well... ray casting sounds slightly different, so I'll give that a search too.
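For what it's worth, the ray-casting approach to picking usually boils down to intersecting the touch ray with simple bounding volumes. A common and cheap one is a bounding sphere; here's a sketch (the function name and sphere representation are mine, not from the thread):

```c
#include <math.h>

/* Returns 1 if a ray (origin o, unit direction d) passes within
   radius r of the sphere centre c, 0 otherwise. */
int ray_hits_sphere(const float o[3], const float d[3],
                    const float c[3], float r)
{
    float oc[3] = { c[0] - o[0], c[1] - o[1], c[2] - o[2] };

    /* distance along the ray to the closest approach to the centre */
    float t = oc[0] * d[0] + oc[1] * d[1] + oc[2] * d[2];
    if (t < 0.0f)
        return 0; /* sphere is behind the ray origin */

    /* squared distance from centre to that closest point */
    float dist2 = oc[0] * oc[0] + oc[1] * oc[1] + oc[2] * oc[2] - t * t;
    return dist2 <= r * r;
}
```

Cast the touch ray against each object's bounding sphere and pick the closest hit; refine with a finer test (e.g. per-triangle) only if you need pixel accuracy.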
 