
High Quality Image Magnification on GPU

I'm looking for interesting algorithms for image magnification that can be implemented on a GPU for real-time scaling of video. Linear and bicubic interpolation algorithms are not good enough.

Suggestions?

Here are some papers I've found; I'm unsure about their suitability for GPU implementation.

Adaptive Interpolation

Level Set

I've seen some demos of the Cell processor used in TVs for scaling that had some impressive results; no link, unfortunately.


Lanczos3 is a very nice interpolation algorithm (you can try it in GIMP or VirtualDub). It generally performs better than cubic interpolation and parallelizes well.
A GPU-based version is implemented in Chromium:
http://code.google.com/p/chromium/issues/detail?id=47447
Check out the Chromium source code.

It may still be too slow for real-time video processing, but it's worth trying if you don't use too high a resolution.
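
For reference, here is a minimal sketch of the horizontal pass of a separable Lanczos3 upscaler as a CUDA kernel (single-channel float image, one thread per destination pixel). All names are illustrative; this is not taken from the Chromium code.

```cuda
// Minimal sketch: horizontal pass of separable Lanczos3 upscaling.
// A second, analogous pass resamples the columns.
#include <cuda_runtime.h>
#include <math.h>

// Lanczos3 kernel: sinc(x) * sinc(x/3) for |x| < 3, zero elsewhere.
__device__ float lanczos3(float x)
{
    x = fabsf(x);
    if (x < 1e-6f) return 1.0f;
    if (x >= 3.0f) return 0.0f;
    float px = 3.14159265f * x;
    return 3.0f * sinf(px) * sinf(px / 3.0f) / (px * px);
}

__global__ void lanczos3_rows(const float *src, int srcW, int srcH,
                              float *dst, int dstW)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= dstW || y >= srcH) return;

    // Map the destination pixel center back into source coordinates.
    float cx = (x + 0.5f) * (float)srcW / dstW - 0.5f;
    int left = (int)floorf(cx) - 2;            // 6-tap support for a = 3

    float sum = 0.0f, wsum = 0.0f;
    for (int i = 0; i < 6; ++i) {
        float w  = lanczos3(cx - (left + i));
        int   sx = min(max(left + i, 0), srcW - 1);  // clamp at borders
        sum  += w * src[y * srcW + sx];
        wsum += w;
    }
    // Normalize so the truncated weights sum to one.
    dst[y * dstW + x] = sum / wsum;
}
```

In a fragment shader the same six-tap loop maps directly onto texture fetches, and precomputing the per-phase weights into a small lookup texture avoids evaluating two sines per tap.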


You may also want to try out CUVI Lib, which offers a good set of GPU-accelerated image processing algorithms. Find out more at: http://www.cuvilib.com

Disclosure: I am part of the team that developed CUVI.


Still slightly a work in progress, but GpuCV is a drop-in replacement for the OpenCV image processing functions, implemented in OpenCL on the GPU.


Prefiltered cubic b-spline interpolation delivers good results (you can have a look here for some theoretical background). CUDA source code can be downloaded here. WebGL examples can be found here.

edit: The cubic interpolation code is now available on GitHub: CUDA version and WebGL version.
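
To illustrate why this approach is cheap on a GPU: the key trick is to fold the 16 point taps of a bicubic filter into 4 bilinear texture fetches, letting the hardware filter do most of the work. Below is a hedged CUDA sketch of that decomposition for a single-channel image; the function names are mine rather than from the linked code, and `tex` is assumed to hold the B-spline-prefiltered coefficients, sampled with cudaFilterModeLinear and unnormalized coordinates.

```cuda
// Hedged sketch: cubic B-spline texture sampling via 4 bilinear fetches
// instead of 16 point taps, relying on the GPU's linear texture filter.
#include <cuda_runtime.h>
#include <math.h>

// For a fractional offset t in [0,1), fold the four cubic B-spline weights
// (w0..w3, taps at i-1..i+2) into two fused weights (g0, g1) and two fetch
// offsets (h0, h1) relative to the integer index i.
__device__ void bspline_weights(float t, float &g0, float &g1,
                                float &h0, float &h1)
{
    float t2 = t * t, t3 = t2 * t;
    float w0 = (1.0f - 3.0f * t + 3.0f * t2 - t3) / 6.0f;
    float w1 = (4.0f - 6.0f * t2 + 3.0f * t3) / 6.0f;
    float w2 = (1.0f + 3.0f * t + 3.0f * t2 - 3.0f * t3) / 6.0f;
    float w3 = t3 / 6.0f;
    g0 = w0 + w1;
    g1 = w2 + w3;
    h0 = w1 / g0 - 1.0f;  // lands between taps i-1 and i
    h1 = w3 / g1 + 1.0f;  // lands between taps i+1 and i+2
}

__device__ float cubicBsplineTex2D(cudaTextureObject_t tex, float x, float y)
{
    // Texel centers sit at integer + 0.5 in unnormalized coordinates.
    float px = x - 0.5f, py = y - 0.5f;
    float ix = floorf(px), iy = floorf(py);

    float gx0, gx1, hx0, hx1, gy0, gy1, hy0, hy1;
    bspline_weights(px - ix, gx0, gx1, hx0, hx1);
    bspline_weights(py - iy, gy0, gy1, hy0, hy1);

    float x0 = ix + hx0 + 0.5f, x1 = ix + hx1 + 0.5f;
    float y0 = iy + hy0 + 0.5f, y1 = iy + hy1 + 0.5f;

    // Four hardware-filtered fetches replace the full 4x4 footprint.
    return gy0 * (gx0 * tex2D<float>(tex, x0, y0) +
                  gx1 * tex2D<float>(tex, x1, y0)) +
           gy1 * (gx0 * tex2D<float>(tex, x0, y1) +
                  gx1 * tex2D<float>(tex, x1, y1));
}
```

Note that without the prefiltering step this yields a blurrier B-spline approximation rather than true interpolation; the recursive prefilter that converts the image into B-spline coefficients is what makes the result pass through the original samples.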


You may want to have a look at super-resolution algorithms; a starting point can be found on CiteSeerX.
