[iPhone] detecting a hit in a transparent area

Problem: let’s say you have a zone that’s partially transparent, and you want to know whether a hit lands on the non-transparent part or not.

Under Mac OS X, you can use several methods to do so, but on the iPhone, you’re on your own.

Believe it or not, the solution came from the past: the QuickDraw-to-Carbon migration guide actually contained a way to detect transparent pixels in a bitmap image. After some tweaking, the code works on the iPhone too.

Here is the setup (a minimal wiring sketch follows the list):
– a view containing a score of NZTouchableImageView subviews, each able to tell whether a touch landed on a transparent zone or not
– on top of it all (not necessary for every purpose, but needed in my case), a transparent NZSensitiveView that intercepts touches and figures out which subview of the “floorView” (the view holding all the partially transparent subviews) was hit
– a delegate conforming to the NZSensitiveDelegate protocol, which reacts to taps and swipes.
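
A minimal wiring sketch, assuming a hypothetical view controller that owns everything and conforms to NZSensitiveDelegate (the controller, frames, and image names are made up; the two NZ classes are listed further down):

// In a hypothetical view controller's viewDidLoad (manual retain/release, as in the rest of the post).
- (void)viewDidLoad {
  [super viewDidLoad];

  // The "floor": a plain container holding the partially transparent image views.
  UIView *floorView = [[UIView alloc] initWithFrame:self.view.bounds];
  [self.view addSubview:floorView];

  // One partially transparent room; add as many as needed.
  NZTouchableImageView *kitchen = [[NZTouchableImageView alloc] initWithFrame:floorView.bounds];
  kitchen.image = [UIImage imageNamed:@"kitchen.png"]; // hypothetical partially transparent PNG
  [floorView addSubview:kitchen];
  [kitchen release];

  // The transparent layer on top that intercepts touches and dispatches them.
  NZSensitiveView *sensitiveView = [[NZSensitiveView alloc] initWithFrame:self.view.bounds];
  sensitiveView.backgroundColor = [UIColor clearColor];
  sensitiveView._floorView = floorView;
  sensitiveView._sdelegate = self; // self conforms to NZSensitiveDelegate
  [self.view addSubview:sensitiveView];
  [sensitiveView release];
  [floorView release];
}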

The code follows. If you have any use for it, feel free to use it. The only things I ask in return are a thanks, and that you forward any bugs or improvements my way.

Merry Christmas!

[UPDATE] It took me some time to figure out what was wrong and even longer to decide to update this post, but thanks to Peng’s questions, I modified the code to work in a more modern way, even with gesture recognizers and scaling active. Enjoy again!

[UPDATE] The last trouble was linked to the contentsGravity of the images: when the content is scaled to fit or fill, the layer’s transform is not updated accordingly, and there is no reliable way to guess what it might be. Changing the approach, you can instead trust the CALayer’s own drawing and render the layer into the test context. Enjoy again again!
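
In practice, this means each touchable image view can use whatever scaling mode you like: the hit test renders the layer exactly as UIKit displays it. A tiny hedged example (the image name is made up):

NZTouchableImageView *garden = [[NZTouchableImageView alloc] initWithImage:[UIImage imageNamed:@"garden.png"]];
garden.contentMode = UIViewContentModeScaleAspectFit; // ends up as the layer's contentsGravity; the alpha test still matches what is on screen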

NZSensitiveDelegate:

@protocol NZSensitiveDelegate
 
- (void) userSlidedLeft:(CGFloat) s;
- (void) userSlidedRight:(CGFloat) s;
- (void) userSlidedTop:(CGFloat) s;
- (void) userSlidedBottom:(CGFloat) s;
 
- (void) userTappedView:(UIView*) v;
 
@end
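
For reference, the delegate can be as simple as a controller that logs what happened; a minimal sketch (the controller class and its reactions are hypothetical):

@interface NZDemoController : UIViewController <NZSensitiveDelegate> // hypothetical controller adopting the protocol
@end

@implementation NZDemoController

- (void) userSlidedLeft:(CGFloat) s   { NSLog(@"slid left by %.0f points", s); }
- (void) userSlidedRight:(CGFloat) s  { NSLog(@"slid right by %.0f points", s); }
- (void) userSlidedTop:(CGFloat) s    { NSLog(@"slid up by %.0f points", s); }
- (void) userSlidedBottom:(CGFloat) s { NSLog(@"slid down by %.0f points", s); }

- (void) userTappedView:(UIView*) v {
  // React to the tapped room, e.g. by looking at its tag or swapping its image.
  NSLog(@"tapped view with tag %d", (int)v.tag);
}

@end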

NZSensitiveView:

#import <UIKit/UIKit.h>

@interface NZSensitiveView : UIView {
  id _sdelegate;
  UIView *_floorView;
}

// Note: retaining the delegate can create a retain cycle if the delegate also
// owns this view; assign is the usual choice for delegates.
@property(retain,nonatomic) IBOutlet id _sdelegate;
@property(retain,nonatomic) UIView *_floorView;

@end

// Swipe detection thresholds, in points: the finger must travel more than
// kSwipeMaximum along the swipe axis while drifting less than kSwipeMinimum
// on the perpendicular axis.
#define kSwipeMinimum 12
#define kSwipeMaximum 4

// Touch-tracking state shared by the touch handlers below.
static UIView *currentlyTouchedView;
static CGPoint lastPosition;
static BOOL moving;
 
@implementation NZSensitiveView
@synthesize _sdelegate;
@synthesize _floorView;
 
- (id)initWithFrame:(CGRect)frame {
  if (self = [super initWithFrame:frame]) {
  // Initialization code
  }
  return self;
}
 
- (void)drawRect:(CGRect)rect {
  // Drawing code
}
 
- (void)dealloc {
  // Release the retained properties before destroying the view.
  [_floorView release];
  [_sdelegate release];
  [super dealloc];
}
 
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
  UITouch *cTouch = [touches anyObject];
  CGPoint position = [cTouch locationInView:self];
  // Assumes the sensitive view exactly overlays the floor view, so the touch
  // location can be used directly for hit testing in the floor view.
  UIView *roomView = [self._floorView hitTest:position withEvent:nil];
 
  if([roomView isKindOfClass:[NZTouchableImageView class]]) {
    currentlyTouchedView = roomView;
  }
 
  moving = YES;
  lastPosition = position;
}
 
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
  UITouch *cTouch = [touches anyObject];
  CGPoint position = [cTouch locationInView:self];
 
  if(moving) { // as should be
    if( (position.x - lastPosition.x > kSwipeMaximum) && fabs(position.y - lastPosition.y) < kSwipeMinimum ) {
      // swipe towards the left (moving right)
      [self._sdelegate userSlidedLeft:position.x - lastPosition.x];
      [self touchesEnded:touches withEvent:event];
    } else if( (lastPosition.x - position.x > kSwipeMaximum) && fabs(position.y - lastPosition.y) < kSwipeMinimum ) {
      // swipe towards the right
      [self._sdelegate userSlidedRight:lastPosition.x - position.x];
      [self touchesEnded:touches withEvent:event];
    } else if( (position.y - lastPosition.y > kSwipeMaximum) && fabs(position.x - lastPosition.x) < kSwipeMinimum ) {
      // swipe towards the top
      [self._sdelegate userSlidedTop:position.y - lastPosition.y];
      [self touchesEnded:touches withEvent:event];
    } else if( (lastPosition.y - position.y > kSwipeMaximum) && fabs(position.x - lastPosition.x) < kSwipeMinimum ) {
      // swipe towards the bottom
      [self._sdelegate userSlidedBottom:lastPosition.y - position.y];
      [self touchesEnded:touches withEvent:event];
    }
  }
}
 
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
  UITouch *cTouch = [touches anyObject];
  CGPoint position = [cTouch locationInView:self];
  UIView *roomView = [self._floorView hitTest:position withEvent:nil];
  if(roomView == currentlyTouchedView) {
    [self._sdelegate userTappedView:currentlyTouchedView];
  }
 
  currentlyTouchedView = nil;
  moving = NO;
}
 
@end
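
One caveat on coordinates: the handlers above pass the touch location straight to the floor view’s hitTest:, which assumes the sensitive view exactly overlays the floor view. If yours does not, convert the point first; a minimal variant of the hit-test line:

// Convert from the sensitive view's coordinates to the floor view's before hit
// testing (only needed when the two frames differ).
CGPoint floorPoint = [self convertPoint:position toView:self._floorView];
UIView *roomView = [self._floorView hitTest:floorPoint withEvent:nil];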

NZTouchableImageView:

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h> // needed for -[CALayer renderInContext:]

@interface NZTouchableImageView : UIImageView {
}
@end

@implementation NZTouchableImageView
 
- (BOOL) doHitTestForPoint:(CGPoint)point {
    // 1x1 bitmap context: render only the pixel under the touch and inspect it.
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo info = kCGImageAlphaPremultipliedLast;

    UInt32 bitmapData[1];
    bitmapData[0] = 0;

    CGContextRef context = CGBitmapContextCreate(bitmapData,
                                                 1,   // width
                                                 1,   // height
                                                 8,   // bits per component
                                                 4,   // bytes per row
                                                 colorspace,
                                                 info);

    // Old approach: draw the CGImage directly into the context. It ignores the
    // scaling applied through contentsGravity, which is why the layer is
    // rendered instead (see the update above).
    // CGRect rect = CGRectMake(-point.x,
    //                          point.y - CGImageGetHeight(self.image.CGImage),
    //                          CGImageGetWidth(self.image.CGImage),
    //                          CGImageGetHeight(self.image.CGImage));
    // CGContextDrawImage(context, rect, self.image.CGImage);

    // Shift the context so the touched point lands on the single pixel, then let
    // the layer draw itself exactly as it appears on screen.
    CGContextTranslateCTM(context, -point.x, -point.y);
    [self.layer renderInContext:context];

    CGContextFlush(context);

    // With premultiplied alpha, a fully transparent pixel leaves all four bytes at zero.
    BOOL res = (bitmapData[0] != 0);

    CGContextRelease(context);
    CGColorSpaceRelease(colorspace);

    return res;
}
 
#pragma mark -

// UIImageView has user interaction disabled by default; force it on so that
// hit testing ever reaches this view.
- (BOOL) isUserInteractionEnabled {
  return YES;
}

// A point only counts as “inside” when it lands on a non-transparent pixel,
// so touches fall through the transparent parts to whatever sits below.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
  return [self doHitTestForPoint:point];
}
 
@end
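
Once the views are on screen, pointInside:withEvent: is the public entry point, so you can probe a single point directly; a quick sanity check using the hypothetical kitchen view from the wiring sketch above (coordinates made up):

// YES only when the pixel under (40, 60) is not fully transparent.
BOOL opaque = [kitchen pointInside:CGPointMake(40.0, 60.0) withEvent:nil];
NSLog(@"the point is %@", opaque ? @"opaque" : @"transparent");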
  
