
What is faster: decoding H264 1920x1080 and displaying it on a 1920x1080 screen, or decoding H264 1280x720 and displaying it at 1920x1080?

So we are creating a video server. We broadcast FLVs to our Flash clients. Our providers can supply us with H264 videos at 1920x1080 and 1280x720. But those videos will be generated from the same 1280x720 source at the same bitrate, so we will have

1280x720 h264 with bitRateA
and
1920x1080 H264 with bitRateA

We assume we can deliver both videos at the same speed.

What will be faster for clients: decoding H264 1920x1080 and displaying it on a 1920x1080 screen, or decoding H264 1280x720 and displaying it at 1920x1080?
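
For concreteness, here is a minimal sketch of how the two test encodes could be produced from the same 1280x720 source at the same bitrate. It assumes ffmpeg with libx264 is installed; the file names and the 2500k bitrate are placeholders, not values from the question:

```python
import subprocess

SOURCE = "source_720p.mp4"  # hypothetical 1280x720 master from the provider
BITRATE = "2500k"           # hypothetical shared bitrate ("bitRateA")

for width, height in ((1280, 720), (1920, 1080)):
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", SOURCE,
            "-vf", f"scale={width}:{height}",  # no-op for 720p, upscale for 1080p
            "-c:v", "libx264",
            "-b:v", BITRATE,                   # same target bitrate for both outputs
            "-an",                             # drop audio; only video decode cost matters here
            f"test_{height}p.mp4",
        ],
        check=True,
    )
```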


I suspect that the lower resolution stream, upscaled, will be faster than decoding the higher resolution one. Upscaling is a simple and fast operation, while H264 decoding is hella-complex (yep, that's an actual technical expression :-)).
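
This is easy to check empirically. Below is a rough sketch, assuming ffmpeg is on the PATH and using the hypothetical test files from the sketch above, that decodes each stream to a null sink and reads ffmpeg's -benchmark timing from stderr:

```python
import subprocess

for name in ("test_720p.mp4", "test_1080p.mp4"):  # files from the sketch above
    result = subprocess.run(
        ["ffmpeg", "-benchmark", "-i", name, "-f", "null", "-"],
        capture_output=True,
        text=True,
    )
    # ffmpeg prints a summary like "bench: utime=1.234s stime=..." to stderr
    bench = [line for line in result.stderr.splitlines() if line.startswith("bench:")]
    print(name, bench[-1] if bench else "(no benchmark line found)")
```

Note this measures decode cost only; the upscale the 720p client still has to do is not included, but a single scale per frame should be cheap next to running H264 motion compensation and deblocking over 2.25x as many pixels.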

Besides, if you're using the same bitrate on both streams, you'll get no better picture quality by increasing the resolution, doubly so if you're using the same lower-res stream as the source (since there is no extra information for the higher resolution stream to draw on). More likely, re-encoding will produce worse picture quality than the original lower-res stream by introducing additional dithering and quantization errors.
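
The quality side can be checked the same way. Here is a hedged sketch, again assuming ffmpeg and the hypothetical file names from above, that scores each encode against the 720p master using ffmpeg's ssim filter:

```python
import subprocess

def ssim(distorted: str, reference: str, scale_back: bool = False) -> str:
    # The ssim filter needs matching frame sizes, so the 1080p encode is
    # scaled back down to the 1280x720 reference before comparison.
    graph = ("[0:v]scale=1280:720[d];[d][1:v]ssim" if scale_back
             else "[0:v][1:v]ssim")
    result = subprocess.run(
        ["ffmpeg", "-i", distorted, "-i", reference,
         "-lavfi", graph, "-f", "null", "-"],
        capture_output=True, text=True,
    )
    # The filter logs a summary line like "SSIM Y:0.98 ... All:0.97" to stderr
    lines = [line for line in result.stderr.splitlines() if "SSIM" in line]
    return lines[-1] if lines else "(no SSIM line found)"

print("720p :", ssim("test_720p.mp4", "source_720p.mp4"))
print("1080p:", ssim("test_1080p.mp4", "source_720p.mp4", scale_back=True))
```

Scoring both encodes against the common master is the fair comparison here, since the question states that both streams are derived from it.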
