
Anti-Alias versus Hi-Density Screen? [closed]

Closed. This question is off-topic. It is not currently accepting answers.


Closed 12 years ago.


Conceptual question, just out of curiosity:

Which is less taxing on the graphics processor: anti-aliasing (2x? 4x? higher?) on a typical desktop display (around 120–150 dpi), or driving a high-density (>300 dpi) screen without anti-aliasing? The question applies to both desktop systems and embedded devices (smartphones). I'm interested to see the responses!
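To put rough numbers on the comparison, here is a back-of-the-envelope sketch (the screen size and dpi values are illustrative assumptions, not from the question): doubling the dpi at the same physical size quadruples the pixel count, which is the same number of samples as 4x supersampling at the lower dpi.

```python
# Illustrative pixel-count arithmetic; the 10"x6" panel size and the
# 150/300 dpi figures are assumed for the example.

def pixel_count(width_in, height_in, dpi):
    """Total pixels for a screen of the given physical size and density."""
    return round(width_in * dpi) * round(height_in * dpi)

low = pixel_count(10, 6, 150)   # ~150 dpi desktop panel: 1,350,000 px
high = pixel_count(10, 6, 300)  # ~300 dpi hi-density panel: 5,400,000 px

ratio = high / low  # 4.0 -- same sample count as 4x supersampling at 150 dpi
```

So purely in terms of samples touched, a >300 dpi panel with no AA and a ~150 dpi panel with 4x supersampling are in the same ballpark; the real cost difference comes from where the work runs, as the answer below notes.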


Usually neither, since font rendering and anti-aliasing are done on the CPU (though GPU features can be used for blurring). Beyond that, it depends on the font rasterizer and how well it is implemented, and on how the anti-aliasing is done: whether a convolution (matrix) blur is applied, an FFT is used, or the glyph is simply rendered larger and downsampled with a bicubic filter. Only runtime tests can tell.
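The "render bigger and downsample" approach mentioned above can be sketched in a few lines. This is a minimal, pure-Python illustration (grayscale only, box filter instead of bicubic, and the circle shape is just a stand-in for a glyph); real rasterizers use far more optimized paths.

```python
# Supersampled anti-aliasing sketch: rasterize at 2x resolution with
# hard (1-bit) coverage, then average each 2x2 block down to one pixel.
# Edge pixels end up with intermediate gray values -- that is the AA.

FACTOR = 2  # 2x supersampling
SIZE = 8    # final image is SIZE x SIZE pixels

def render_hires(size):
    """Rasterize a filled circle at the supersampled resolution,
    one bit per sample (no anti-aliasing at this stage)."""
    c = size / 2.0
    r = size * 0.4
    return [[255 if (x - c + 0.5) ** 2 + (y - c + 0.5) ** 2 <= r * r else 0
             for x in range(size)]
            for y in range(size)]

def downsample(img, factor):
    """Box-filter downsample: average each factor x factor block."""
    n = len(img) // factor
    return [[sum(img[y * factor + dy][x * factor + dx]
                 for dy in range(factor)
                 for dx in range(factor)) // (factor * factor)
             for x in range(n)]
            for y in range(n)]

hires = render_hires(SIZE * FACTOR)
final = downsample(hires, FACTOR)
# Interior pixels stay 255, exterior pixels 0, and pixels the circle's
# edge cuts through get partial-coverage grays in between.
```

A bicubic filter would weight a wider neighborhood instead of a flat 2x2 average, giving smoother edges at higher cost, which is exactly why the relative expense "depends on how the AA was done".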

