
What's a good approach to implement a smudge tool for a drawing program on the iPad?

At a high level (or low level if you'd like), what's a good way to implement a smudge effect for a drawing program on the iPad using Quartz2D (Core Graphics)? Has anyone tried this?

(example image of a smudge effect; source: pixlr.com)

Thanks so much in advance for your wisdom!

UPDATE: For those interested, I found this great article. Check it out!

Link now at: http://losingfight.com/blog/2007/09/05/how-to-implement-smudge-and-stamp-tools/


I would suggest implementing a similar algorithm to what is detailed in that article using OpenGL ES 2.0 to get the best performance.

  1. Get the starting image as a texture
  2. Set up a render-to-texture framebuffer
  3. Render initial image in a quad
  4. Render another quad the size of your brush with a slightly shifted view of the image, multiplied by an alpha mask stored in a texture or defined by, for example, a Gaussian function. Use alpha blending with the background quad.
  5. Render this texture into a framebuffer associated with your CAEAGLLayer-backed view
  6. Go to 1 on the next -touchesMoved event, with the result from your previous rendering as the input. Keep in mind you'll want to have 2 texture objects to "ping-pong" between as you can't read from and write to the same texture at once.

I think it's unlikely you're going to get great performance on the CPU, but it's definitely easier to set up that way. In the GPU setup, though, you can have an essentially unlimited brush size, and you're not looping over image-drawing code.

Curious about what sort of performance you do get on the CPU, though. Take care :)
