Downsampling an image without "dancing" pixels
Say I wanted to downsample an image (1280x720) in real time to a very small size (16x16) without suffering from "dancing" pixels when the image moves - which technique would I use?
This would be using XBox360/PS3 GPUs.
Note that the 16x16 image only needs to contain a very "generalised" representation of the original image - almost like a rough estimate of its colour.
Doing a direct (point-sampled) downsample means pixels pop in and out of the sample positions, so you get a dancing/disco effect when the image moves.
Thanks for any help.
You'll want to use averaging instead of point sampling. For instance, setting each output pixel to the average value of an 80x45 pixel block will give you a 16x16 image from a 1280x720 image. Note that this would alter the aspect ratio, though...
I should clarify that you don't necessarily have to average the entire block -- you could average every other pixel in it, or whatever sampling pattern you want. But the more pixels you account for, the more accurate the value will be. You could also use other statistical measures -- the mode or median value of each channel could also work.
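As a rough illustration of the box-average idea (not console GPU code - just a NumPy sketch; the function name and block sizes are my own, and it assumes the image dimensions divide evenly into the output size):

```python
import numpy as np

def box_average_downsample(image, out_h=16, out_w=16):
    """Average each block of pixels down to one output pixel (box filter).

    `image` is an (H, W, C) float array whose height and width are exact
    multiples of out_h and out_w, e.g. 720x1280 -> 16x16 via 45x80 blocks.
    """
    h, w, c = image.shape
    bh, bw = h // out_h, w // out_w
    # Split each axis into (output index, within-block index) and
    # average over the within-block axes.
    blocks = image.reshape(out_h, bh, out_w, bw, c)
    return blocks.mean(axis=(1, 3))

# Example: a 720x1280 RGB frame reduced to 16x16.
frame = np.random.rand(720, 1280, 3)
small = box_average_downsample(frame)
print(small.shape)  # (16, 16, 3)
```

Because every source pixel contributes to exactly one output pixel, shifting the image by a pixel or two only nudges each block's average slightly, which is what kills the dancing.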
The downsampling filter that gives the best quality is to take a 2D Fourier transform, keep only the low frequencies, and do an inverse Fourier transform into the low-resolution image.
Obviously this won't be the fastest method and there may be minor issues with scaling and handling the color channels.
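A rough NumPy sketch of that idea, assuming even output dimensions and a single channel (run it once per colour channel); the rescaling at the end is one way to handle the value-range issue mentioned above:

```python
import numpy as np

def fourier_downsample(channel, out_h=16, out_w=16):
    """Downsample one channel by keeping only the lowest spatial
    frequencies of its 2D Fourier transform (an ideal low-pass filter)."""
    h, w = channel.shape
    # Centre the spectrum so the low frequencies sit in the middle.
    spectrum = np.fft.fftshift(np.fft.fft2(channel))
    # Crop out the central out_h x out_w block of frequencies.
    cy, cx = h // 2, w // 2
    low = spectrum[cy - out_h // 2:cy + out_h // 2,
                   cx - out_w // 2:cx + out_w // 2]
    # Inverse transform of the cropped spectrum; rescale so pixel
    # values stay in the original range.
    small = np.fft.ifft2(np.fft.ifftshift(low))
    return np.real(small) * (out_h * out_w) / (h * w)

# Example: one 720x1280 channel reduced to 16x16.
channel = np.random.rand(720, 1280)
print(fourier_downsample(channel).shape)  # (16, 16)
```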