[iPhone] detecting a hit in a transparent area

Problem: let’s say you want a zone that’s partially transparent, and you want to know whether a hit landed on the non-transparent part or not.

Under Mac OS X, you can use several methods to do so, but on the iPhone, you’re on your own.

Believe it or not, the solution came from the past: the QuickDraw-to-Carbon migration guide actually contained a way to detect transparent pixels in a bitmap image. After some tweaking, the code works.
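The heart of that old trick is simply reading back a single pixel of an RGBA bitmap and checking whether it is transparent. Stripped of all the CoreGraphics plumbing, it can be sketched in plain C (the names and buffer layout here are illustrative, not from the migration guide):

```c
#include <stdint.h>
#include <stddef.h>

/* Sketch only, assuming a premultiplied RGBA8888 buffer laid out
 * row by row. With premultiplied alpha, a fully transparent pixel
 * is all zeroes, so "any byte non-zero" means "hit". */
static int pixel_is_hit(const uint8_t *rgba, size_t width, size_t x, size_t y) {
    const uint8_t *px = rgba + (y * width + x) * 4;
    return (px[0] | px[1] | px[2] | px[3]) != 0;
}
```

This is exactly why the full code below only needs a 1×1 bitmap context and a single 32-bit comparison: with premultiplied alpha there is no need to isolate the alpha byte.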

Here is the setup:
– a view containing a score of NZTouchableImageView subviews (each able to detect whether a hit falls in a transparent zone or not);
– on top of it all (not necessary for every purpose, but needed in my case), a transparent NZSensitiveView that intercepts hits and finds out which subview of the “floorView” (the view with all the partially transparent subviews) was hit;
– a delegate conforming to the NZSensitiveDelegate protocol, which reacts to taps and swipes.

The code follows. If you have any use for it, feel free to take it. The only thing I ask in return is a thanks; and if you find any bugs, or any way to improve on it, please forward them my way.

Merry Christmas!

[UPDATE] It took me some time to figure out what was wrong, and even more to decide to update this post, but thanks to Peng’s questions, I modified the code to work in a more modern way, even with a Gesture Recognizer attached and scaling active. Enjoy again!

[UPDATE] The last bit of trouble was linked to the contentsGravity of the images: when they are scaled to fit/fill, the transformation matrix is not updated, and there’s no reliable way to guess what it might be. So, changing approach: instead of undoing the scaling yourself, you can trust the CALayer’s inner workings. Enjoy again, again!


@protocol NZSensitiveDelegate
- (void) userSlidedLeft:(CGFloat) s;
- (void) userSlidedRight:(CGFloat) s;
- (void) userSlidedTop:(CGFloat) s;
- (void) userSlidedBottom:(CGFloat) s;
- (void) userTappedView:(UIView*) v;
@end


@interface NZSensitiveView : UIView {
  id _sdelegate;
  UIView *_floorView;
}
@property(retain,nonatomic) IBOutlet id _sdelegate;
@property(retain,nonatomic) UIView *_floorView;
@end

#define kSwipeMinimum 12
#define kSwipeMaximum 4

static UIView *currentlyTouchedView;
static CGPoint lastPosition;
static BOOL moving;

@implementation NZSensitiveView
@synthesize _sdelegate;
@synthesize _floorView;
- (id)initWithFrame:(CGRect)frame {
  if (self = [super initWithFrame:frame]) {
    // Initialization code
  }
  return self;
}

- (void)drawRect:(CGRect)rect {
  // Drawing code
}

- (void)dealloc {
  [_sdelegate release];
  [_floorView release];
  [super dealloc];
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
  UITouch *cTouch = [touches anyObject];
  CGPoint position = [cTouch locationInView:self];
  UIView *roomView = [self._floorView hitTest:position withEvent:event];
  if([roomView isKindOfClass:[NZTouchableImageView class]]) {
    currentlyTouchedView = roomView;
  }
  moving = YES;
  lastPosition = position;
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
  UITouch *cTouch = [touches anyObject];
  CGPoint position = [cTouch locationInView:self];
  if(moving) { // as it should be
    if( (position.x - lastPosition.x > kSwipeMaximum) && fabs(position.y - lastPosition.y) < kSwipeMinimum ) {
      // swipe towards the left (finger moving right)
      [self._sdelegate userSlidedLeft:position.x - lastPosition.x];
      [self touchesEnded:touches withEvent:event];
    } else if( (lastPosition.x - position.x > kSwipeMaximum) && fabs(position.y - lastPosition.y) < kSwipeMinimum ) {
      // swipe towards the right
      [self._sdelegate userSlidedRight:lastPosition.x - position.x];
      [self touchesEnded:touches withEvent:event];
    } else if( (position.y - lastPosition.y > kSwipeMaximum) && fabs(position.x - lastPosition.x) < kSwipeMinimum ) {
      // swipe towards the top
      [self._sdelegate userSlidedTop:position.y - lastPosition.y];
      [self touchesEnded:touches withEvent:event];
    } else if( (lastPosition.y - position.y > kSwipeMaximum) && fabs(position.x - lastPosition.x) < kSwipeMinimum ) {
      // swipe towards the bottom
      [self._sdelegate userSlidedBottom:lastPosition.y - position.y];
      [self touchesEnded:touches withEvent:event];
    }
  }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
  UITouch *cTouch = [touches anyObject];
  CGPoint position = [cTouch locationInView:self];
  UIView *roomView = [self._floorView hitTest:position withEvent:event];
  if(roomView == currentlyTouchedView) {
    [self._sdelegate userTappedView:currentlyTouchedView];
  }
  currentlyTouchedView = nil;
  moving = NO;
}
@end


@interface NZTouchableImageView : UIImageView {
}
@end

@implementation NZTouchableImageView

- (BOOL) doHitTestForPoint:(CGPoint)point {
    // a 1x1 bitmap context: we only care about the single touched pixel
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo info = kCGImageAlphaPremultipliedLast;
    UInt32 bitmapData[1];
    bitmapData[0] = 0;
    CGContextRef context = CGBitmapContextCreate(bitmapData, 1, 1, 8, 4, colorspace, info);
    // previous approach: draw the image ourselves and undo the scaling by hand
    // CGRect rect = CGRectMake(-point.x,
    //                          point.y - CGImageGetHeight(self.image.CGImage),
    //                          CGImageGetWidth(self.image.CGImage),
    //                          CGImageGetHeight(self.image.CGImage));
    // CGContextDrawImage(context, rect, self.image.CGImage);
    // current approach: shift the context so the touched point lands on our
    // single pixel, then let the layer render itself (contentsGravity included)
    CGContextTranslateCTM(context, -point.x, -point.y);
    [self.layer renderInContext:context];
    CGContextRelease(context);
    CGColorSpaceRelease(colorspace);
    // with premultiplied alpha, a fully transparent pixel is all zeroes
    BOOL res = (bitmapData[0] != 0);
    return res;
}
#pragma mark -

- (BOOL) isUserInteractionEnabled {
  return YES;
}

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
  return [self doHitTestForPoint:point];
}

@end

  1. Lines 8-9 read:
    CGContextRef context = CGBitmapContextCreate

    looks like CGBitmapContextCreate was typed twice…

    thanks for this code! I thought I was SOL.


  2. Hi,

    Great post! Just one question for below code snippet.

    if(scaleX < scaleY) { // shift vertically
    CGFloat delta = self.frame.size.height - (CGImageGetHeight(self.image.CGImage) * scaleX);
    delta *= .5;
    rect.origin.x = -point.x - delta;
    rect.origin.y = point.y - self.frame.size.height;

    I think the idea is that the rect is shifted so that the point is the last pixel to be drawn in this rect. Then why not set the origin as below? I tried drawing the images on paper to verify your calculation, and found the point is out of the rect.

    rect.origin.x = point.x - self.frame.size.width;
    rect.origin.y = point.y - self.frame.size.height;

    Have fun!

  3. Hey, very interesting approach,

    I am struggling to understand how the new origins are calculated: what is the algorithm to determine the new origin based on scaleX and scaleY (in the function - (BOOL) doHitTestForPoint:(CGPoint)point)?

    In addition, it seems that delta is not used in the else clause (when scaleY is less than scaleX); is that correct?

    Thanks for your help.

  4. Hi Nicolas,
    Thanks for the code. I have several NZTouchableImageView objects in my main view. Each of these has a UIPinchGestureRecognizer attached to it. Before the image views are scaled in response to the pinch gesture, the transparency detection works fine. However, after an image view is scaled larger than before, the transparency detection returns true even for a touch on what is clearly a transparent part of the image. Any ideas on how we can fix this? Thanks.


  5. Hey,

    I imported your code, changed a UIView’s class to NZSensitiveView and a UIImageView’s class to NZTouchableImageView in Interface Builder, and added an irregular PNG file with a transparent background. But the detection doesn’t work for me :( Could you add a full project for download?

  6. YES! Finally a solution that works with auto-fitting graphics and multiple layers!! Thank you a million times.
    (For those with as little experience as myself: don’t forget to add the QuartzCore framework and add #import <QuartzCore/QuartzCore.h> to your .h file.)
