a) Is the touch event only applicable to the image view, or is it possible for its properties?

b) Is it possible to resize the touch area of an image view?
 
I still don't get it.

a) A touch event can occur on any UIView. What do you mean by "its properties"? If you have a custom image view that has a property which is a UIView, then of course touch events can occur on that too.

b) Not that I know of. The "touch area" of a UIView is always as big as the view itself.
 
Have a look at UIResponder's methods, which all UIViews inherit.

Then have a look at the UIEvent and UITouch reference documentation. For each touch (using UIResponder's touchesBegan, touchesMoved and touchesEnded methods) you can track the x and y coordinates of the touch and confine handling to part of a view if you like. Look at UITouch's locationInView: method to get the x and y.
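
For example, something along these lines in a view or view controller that contains the image view (a rough sketch; `imageView` is just an assumed outlet, and the "left half" test is only to illustrate confining the touch to part of the view):

Code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Position of the touch in the image view's own coordinate system.
    CGPoint point = [touch locationInView:imageView];

    // Only react to touches that land in the left half of the image view.
    // (Note: UIImageView has userInteractionEnabled set to NO by default.)
    if ([touch view] == imageView && point.x < imageView.bounds.size.width / 2) {
        // handle the touch here
    }
}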

Or... if you're looking for something really simple and want button-like functionality, check out UIButton. You can make the button as big as you like and have any image you like.
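
For instance (a sketch; the frame, image name, and action selector are all placeholders):

Code:
// A custom button at whatever size you want, showing your image.
UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
button.frame = CGRectMake(20.0, 20.0, 200.0, 200.0);   // as big as you like
[button setImage:[UIImage imageNamed:@"myImage.png"]
        forState:UIControlStateNormal];
[button addTarget:self
           action:@selector(buttonTapped:)
 forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:button];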
 
We check for the touch event like below:

Code:
if ([touch view] == Imageview1) {
    // the touch landed on Imageview1; handle it here
}

I am asking: can I differentiate between the image and the white space of the image view?

The image does not fill the image view, so there is some white space surrounding the image.

We can easily track the (x, y) coordinates from the touch event, but we can't check each pixel (x, y). It's tedious.

So, other than this, is there any possible way to detect the touch?

A good solution will be appreciated.
Thanks
 
I suspect you'll need to have a look at Core Graphics. Once you have the position of the touch, there must be a simple function that gets the colour or alpha of a particular pixel - then it's a case of having an if condition that only handles the touch event when the pixel is not white or transparent.

Sorry, I don't know that much about Core Graphics, but Apple provides good documentation listing all the functions on the developer website. Good luck!
 
We can easily track the (x, y) coordinates from the touch event, but we can't check each pixel (x, y)

You asked that same question and I already gave you the answer. It's basic math; if you can't do it, then you probably shouldn't be programming.
I said it once, I'll say it again: yes, you can. It's very simple. You don't check each pixel but rather a range of pixels. If your border is 5 pixels, then if x < 5 or y < 5 you are inside the border at the top or the left. Now add the same checks for the right and bottom borders and there you go.
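
In code, the same idea, using CGRectInset instead of comparing x and y by hand (a sketch; the 5-pixel border is just an example value):

Code:
CGPoint point = [touch locationInView:Imageview1];
// Inset the bounds by the border width on all four sides.
CGRect imageArea = CGRectInset(Imageview1.bounds, 5.0, 5.0);
if (CGRectContainsPoint(imageArea, point)) {
    // inside the image area
} else {
    // on the white border
}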

Simple stuff.
 
We can't do it like that, because the image is not a straight line. It has many curves.
 
The image itself is rectangular, but I'm assuming you have a non-rectangular shape inside that rectangular frame, with transparency filling the gaps around it.

As I mentioned, try getting the x and y first and then testing, using the Core Graphics APIs, the colour of the pixel on the image at those coordinates. Then make a decision as to what to do with the touch event based on that outcome. If you can differentiate between what is 'colour' and what is transparent, then you should be able to make the touch event react only to touches inside the 'shape' of the image.
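
Something along these lines might work (a rough, untested sketch: it draws the image into a one-pixel bitmap context and reads back the alpha of the pixel under the touch; the method name is just for illustration, and it assumes the image is drawn at its natural size, i.e. no content-mode scaling):

Code:
// Returns the alpha (0-255) of the image pixel under `point`.
// `point` is in the image's own coordinate space (top-left origin).
- (unsigned char)alphaAtPoint:(CGPoint)point inImage:(UIImage *)image
{
    unsigned char pixel[4] = {0, 0, 0, 0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // A 1x1 RGBA bitmap context backed by our 4-byte buffer.
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);

    // Shift the drawing so the pixel of interest lands on our single pixel
    // (Core Graphics uses a bottom-left origin, hence the flipped y term).
    CGContextTranslateCTM(context, -point.x, point.y - image.size.height);
    CGContextDrawImage(context,
                       CGRectMake(0.0, 0.0, image.size.width, image.size.height),
                       image.CGImage);
    CGContextRelease(context);

    return pixel[3];   // the alpha component
}

Then only handle the touch when the returned alpha is above some threshold, e.g. greater than zero.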
 