
setNeedsDisplayInRect: paints a white rectangle only

I'm still a little fresh to Core Graphics programming, so please bear with me. I'm trying to write an application which allows the user to rub stuff off an image with their finger. I have the basic functionality nailed down, but the result is sluggish since the screen is redrawn completely every time a touch is rendered. I did some research and found out that I can refresh only a portion of the screen using UIView's setNeedsDisplayInRect: method.

This does call drawRect: as expected, however everything I draw in the drawRect: following the setNeedsDisplayInRect: is ignored. Instead, the area in the rect parameter is simply filled with white. No matter what I draw inside, all I end up with is a white rectangle.

In essence, this is what I do:

1) When the user touches the screen, the touch is rendered into a mask.
2) When drawRect: is called, the image is masked with that mask.
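
Roughly, the masking step looks like this (a sketch rather than my exact code; self.image and self.maskImage stand in for however the view stores the photo and the mask):

// Sketch only: self.image and self.maskImage are assumed CGImageRef
// properties; the mask is a DeviceGray image where white keeps the photo
// and black marks the areas that have been rubbed off.
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSaveGState( context );
CGContextClipToMask( context, self.bounds, self.maskImage );
CGContextDrawImage( context, self.bounds, self.image );
CGContextRestoreGState( context );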

There must be something simple I'm overlooking, surely?


I found a solution, though it still escapes me how exactly it works. Here's the code:

This method flips the given rectangle in the same way that the coordinate transform applied in drawRect: (below) flips the context's coordinate system:

- (CGRect) flippedRect:(CGRect)rect
{
    CGRect flippedRect = rect;

    flippedRect.origin.y = self.bounds.size.height - rect.origin.y - rect.size.height;

    return CGRectIntersection( self.bounds, flippedRect );
}
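
To make the arithmetic concrete, assume the view is 480 points tall:

// Worked example with made-up numbers (self.bounds.size.height == 480).
CGRect r       = CGRectMake( 10.0, 20.0, 44.0, 44.0 );  // 20 pt from the top edge
CGRect flipped = [self flippedRect:r];
// flipped.origin.y == 480 - 20 - 44 == 416, i.e. 20 pt from the bottom edge.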

This calculates the rectangle to be updated from the touch location. Note that the rectangle gets flipped:

- (CGRect) updateRectFromTouch:(UITouch *)touch
{
    CGPoint location = [touch locationInView:self];

    int d = RubbingSize;

    CGRect touchRect = [self flippedRect:CGRectMake( location.x - d, location.y - d, 2*d, 2*d )];

    return CGRectIntersection( self.frame, touchRect );
}

In renderTouch: the 'flipped' update rectangles are accumulated into a single dirty rectangle:

- (void) renderTouch:(UITouch *)touch
{
    //
    // Code to render into the mask here
    //

    if ( m_updateRect.size.width == 0 )
    {
        m_updateRect = [self updateRectFromTouch:touch];
    }
    else 
    {
        m_updateRect = CGRectUnion( m_updateRect, [self updateRectFromTouch:touch] );
    }
}   
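
For completeness, renderTouch: is driven from the usual touch handlers, roughly like this (a sketch; the real handlers also do the mask rendering):

- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self renderTouch:[touches anyObject]];
}

- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self renderTouch:[touches anyObject]];
}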

The screen is refreshed at about 20 Hz during the finger-painting process. The following method is called every 1/20th of a second and submits the accumulated rectangle for redrawing:

- (void) refreshScreen
{
    if ( m_updateRect.size.width > 0 )
    {
        [self setNeedsDisplayInRect:[self flippedRect:m_updateRect]];
    }
}
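
The 1/20th-second callback is just a repeating timer that fires refreshScreen (a sketch; m_refreshTimer is a hypothetical ivar, and the timer would need to be invalidated when the view goes away):

// Sketch: drive refreshScreen at roughly 20 Hz.
m_refreshTimer = [NSTimer scheduledTimerWithTimeInterval:0.05
                                                  target:self
                                                selector:@selector(refreshScreen)
                                                userInfo:nil
                                                 repeats:YES];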

Here's a helper function to compare two rectangles:

BOOL rectIsEqualTo(CGRect a, CGRect b)
{
    return a.origin.x == b.origin.x
        && a.origin.y == b.origin.y
        && a.size.width == b.size.width
        && a.size.height == b.size.height;
}
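
(Core Graphics already provides CGRectEqualToRect, which does the same job, so the helper could simply wrap it:)

BOOL rectIsEqualTo(CGRect a, CGRect b)
{
    return CGRectEqualToRect( a, b );
}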

In the drawRect: method, the update rectangle is used to draw only the portion that needs updating.

- (void)drawRect:(CGRect)rect 
{
    BOOL drawFullScreen = rectIsEqualTo( rect, self.frame );

    // Drawing code

    CGContextRef context = UIGraphicsGetCurrentContext();

    // Turn coordinate system around

    CGContextTranslateCTM( context, 0.0, self.frame.size.height );

    CGContextScaleCTM( context, 1.0, -1.0 );

    if ( drawFullScreen )
    {
        // draw the full thing
        CGContextDrawImage( context, self.frame, self.image );
    }
    else 
    {       
        CGImageRef partialImage = CGImageCreateWithImageInRect( self.image, [self flippedRect:m_updateRect] );

        CGContextDrawImage( context, m_updateRect, partialImage );

        CGImageRelease( partialImage );
    }

        ...

    // Reset update box

    m_updateRect = CGRectZero;
}

If someone can explain to me why the flipping works, I'd appreciate it.


The flipping issue is most likely because UIKit uses a 'flipped' coordinate system compared to Core Graphics: UIKit puts the origin at the top-left corner with y increasing downwards, while Core Graphics puts it at the bottom-left corner with y increasing upwards.
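
As far as I can tell, setNeedsDisplayInRect: expects a rectangle in UIKit's top-left-origin coordinates, while everything drawn after the CGContextTranslateCTM/CGContextScaleCTM calls lives in bottom-left-origin coordinates; flippedRect: converts between the two. Because the conversion is its own inverse, flipping once on the way in (updateRectFromTouch:) and once on the way out (refreshScreen) leaves the rectangle in the space the next API expects. A quick sanity check of that inverse property:

// Hypothetical sanity check inside the view: flipping twice returns the
// original rectangle (as long as it lies within the bounds).
CGRect r = CGRectMake( 10.0, 20.0, 44.0, 44.0 );
NSAssert( CGRectEqualToRect( r, [self flippedRect:[self flippedRect:r]] ),
          @"flippedRect: should be its own inverse" );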

See the Apple documentation on coordinate systems for the details.
