touchesBegan with multiple UIImageViews detects the wrong UIImageView
I'm having a problem with touch detection on iPad.
I subclassed UIImageView like this:
@interface MyUIImageView : UIImageView
{
BOOL _dragActive;
CGPoint originalLocation;
}
@end
@implementation MyUIImageView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Remember where the view started so it can snap back later.
    _dragActive = YES;
    originalLocation = self.center;
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (_dragActive)
    {
        // Follow the finger in the superview's coordinate space.
        UITouch *touch = [touches anyObject];
        self.center = [touch locationInView:self.superview];
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (_dragActive)
    {
        // Snap back to where the drag started.
        self.center = originalLocation;
    }
    _dragActive = NO;
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (_dragActive)
    {
        self.center = originalLocation;
    }
    _dragActive = NO;
}
@end
I have multiple image views (MyUIImageView) placed side by side on the controller's view, but touchesBegan is only being called on the front-most one, even though they are laid out separately on the controller's view.
It looks as if there is an invisible "detection layer" that extends beyond each image view.
When I tap and drag an image view, the one that actually gets dragged is the one to its left or right, depending on which is in front. If I change the z-order, the same behavior repeats, but with the images that are now in front.
I'm not getting a clear idea of your exact requirement. If you just want touches to be detected for one particular image view (i.e. the leftmost or rightmost one), then instead of grabbing an arbitrary touch with
UITouch *touch = [touches anyObject];
try checking which view the touch actually belongs to, for example:
UITouch *touch = [touches anyObject];
if (touch.view == imgViewObject) { /* handle only that image view's drag */ }
For the situation you've described, you should consider having the controller (whose view contains all your image views) handle the touch events. I think what you are describing as a "detection layer" is in fact the region of the screen which your image view occupies at the beginning of the stream of touch events. Repositioning the image view on the fly (dragging) really is better handled by the parent view, I believe.
A bonus to this approach is that you can more easily deal with rearranging the sibling views, defining behavior at the parent view's boundaries, and doing whatever you plan to do with sibling collisions/overlaps, all from the perspective of the superview.
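For illustration, here is a minimal sketch of that controller-side approach. The imageViews, draggedView, and originalLocation properties are hypothetical names for this example, not from the original post, and the image views would need userInteractionEnabled set to NO (and the subclass overrides removed) so the touches fall through to the controller's view:
// Sketch: the controller's view receives the touches and decides which
// image view (if any) is under the finger.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self.view];
    // Walk front-to-back so the topmost view under the finger wins.
    for (MyUIImageView *candidate in [self.imageViews reverseObjectEnumerator])
    {
        CGPoint local = [self.view convertPoint:point toView:candidate];
        if ([candidate pointInside:local withEvent:event])
        {
            self.draggedView = candidate;
            self.originalLocation = candidate.center;
            break;
        }
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (self.draggedView)
    {
        self.draggedView.center = [[touches anyObject] locationInView:self.view];
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Snap the dragged view back, mirroring the original subclass.
    self.draggedView.center = self.originalLocation;
    self.draggedView = nil;
}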
Finally I've nailed the problem. The UIImageViews I was trying to move had frames bigger than their images, and that invisible extra area was interfering with touch detection on the sibling image views.
In Interface Builder, when I set a background color everything looked as expected, but at run time the frame got bigger. I noticed this by programmatically setting the background color to gray.
Changing the autosizing settings for the UIImageViews in Interface Builder solved the problem.
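If you run into the same thing, a quick way to check (just a debugging aid, not part of the fix) is to color the views at run time so their actual frames become visible:
// Temporarily color each image view so its real run-time frame shows;
// if the gray area is larger than the image, the frame (not the image)
// is what receives the touches.
for (UIView *sub in self.view.subviews)
{
    if ([sub isKindOfClass:[MyUIImageView class]])
    {
        sub.backgroundColor = [UIColor grayColor];
    }
}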