Anti-Alias versus Hi-Density Screen? [closed]
Conceptual question, just out of curiosity:
What is less taxing on the graphics processor: anti-aliasing (2x? 4x? higher?) on a typical desktop display (around 120-150 dpi), or driving a high-density (>300 dpi) screen without anti-aliasing? This question applies to both desktop systems and embedded devices (smartphones). I'm interested to see the responses!
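For a rough sense of scale, here is a back-of-envelope comparison of the raw sample counts involved (the 13-inch 16:10 panel dimensions are purely illustrative; only the ratio matters):

```python
def pixel_count(dpi, width_in=11.0, height_in=6.9):
    # Total pixels on a panel of the given physical size at the given dpi.
    return round(dpi * width_in) * round(dpi * height_in)

base = pixel_count(150)     # typical desktop panel
hidpi = pixel_count(300)    # >300 dpi panel, no AA
ssaa_4x = base * 4          # 4x supersampling shades 4x the samples

print(f"150 dpi native:       {base:>12,} px")
print(f"150 dpi with 4x SSAA: {ssaa_4x:>12,} samples")
print(f"300 dpi native:       {hidpi:>12,} px")
# Doubling the dpi quadruples the pixel count, so 4x supersampling on a
# 150 dpi panel touches roughly as many samples as 300 dpi with no AA.
```

So in terms of raw fill, the two scenarios in the question are closer than they might appear.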
Neither, usually, since font rendering and AA are done by the CPU (though you can use GPU features to blur). Beyond that, it depends on the font rasterizer and how well or badly it is implemented. It also depends on how the AA is done: whether a convolution (matrix) blur is applied, an FFT, or a simple render-bigger-and-downsample with bicubic filtering. Only runtime tests can tell.
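For reference, a minimal sketch of the render-bigger-and-downsample approach mentioned above, using Pillow (assuming it is installed; the scale factor, sizes, and filename are illustrative, and PIL's default bitmap font renders without its own AA, so all smoothing here comes from the downsample):

```python
from PIL import Image, ImageDraw, ImageFont

SCALE = 4  # 4x supersampling: render at 4x the target resolution

def render_text_ssaa(text, target_size=(400, 100)):
    big = (target_size[0] * SCALE, target_size[1] * SCALE)
    img = Image.new("L", big, color=255)
    draw = ImageDraw.Draw(img)
    # Render aliased at the larger size (default bitmap font, no AA).
    draw.text((10 * SCALE, 10 * SCALE), text, fill=0,
              font=ImageFont.load_default())
    # The bicubic downsample is what performs the anti-aliasing.
    return img.resize(target_size, Image.Resampling.BICUBIC)

render_text_ssaa("Hello").save("aa.png")
```

Timing this against a direct render at the target resolution (or against a post-render convolution blur) would give the kind of runtime comparison the answer refers to.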