FFmpeg: Encode RGB frames (AVFrames) to H264
When I encode an RGB24 frame with H264 I get "input width is greater than stride"...
By the way, if I give a raw image which is YUV420P, FFmpeg successfully encodes it...
What I wanted to know is:
i) Do we have to give a YUV format for encoding? Can't we give an RGB frame for encoding H264?
ii) If we can give an RGB frame, what is the trick?

I know this is a bit late (no answers since 2010), but it sounds like you need (or needed) to adjust the wrapping of your image data.
From the following MSDN article (I know it's MSDN, but its explanation of the concepts involved is REALLY good):
When a video image is stored in memory, the memory buffer might contain extra padding bytes after each row of pixels. The padding bytes affect how the image is stored in memory, but do not affect how the image is displayed.
The stride is the number of bytes from one row of pixels in memory to the next row of pixels in memory. Stride is also called pitch. If padding bytes are present, the stride is wider than the width of the image, as shown in the following illustration.
Read more here
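To make that definition concrete, here is a minimal sketch of copying a tightly packed RGB24 image (width * 3 bytes per row) into a buffer whose rows sit a given stride apart. The function and variable names (copy_rgb24_with_stride, src, dst, stride, etc.) are placeholders of my own, not anything from your code:

    #include <stdint.h>
    #include <string.h>

    /* Copy a tightly packed RGB24 image into a destination buffer whose rows
     * are 'stride' bytes apart. Requires stride >= width * 3; the extra
     * padding bytes at the end of each destination row are left untouched. */
    void copy_rgb24_with_stride(uint8_t *dst, int stride,
                                const uint8_t *src, int width, int height)
    {
        int row_bytes = width * 3;       /* RGB24: 3 bytes per pixel */
        for (int y = 0; y < height; y++) {
            memcpy(dst + y * stride,     /* destination rows are 'stride' apart */
                   src + y * row_bytes,  /* source rows are tightly packed */
                   row_bytes);
        }
    }

If stride were smaller than width * 3, each row would overrun into the next one, which is exactly the situation the "input width is greater than stride" error is complaining about.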
Look at what you've specified for both your image width and your image stride. The data you are supplying for each row takes up more bytes than you've specified for the stride (and, I'm guessing, more than the width as well, if the two are in agreement).
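As for questions i) and ii): the H.264 encoder typically expects planar YUV input (your own test with YUV420P confirms it works), so the usual trick is to convert the RGB24 frame to YUV420P with libswscale before encoding. Below is a rough sketch of that conversion, assuming the modern AV_PIX_FMT_* / av_frame_* names (older builds from around 2010 used PIX_FMT_* and different allocation helpers); rgb24_to_yuv420p and the variable names are placeholders, not your code:

    #include <stdint.h>
    #include <libavutil/frame.h>
    #include <libswscale/swscale.h>

    /* Convert a tightly packed RGB24 buffer into a newly allocated
     * YUV420P AVFrame suitable for handing to the H.264 encoder.
     * Returns NULL on failure. */
    AVFrame *rgb24_to_yuv420p(const uint8_t *rgb_data, int width, int height)
    {
        AVFrame *yuv = av_frame_alloc();
        if (!yuv)
            return NULL;
        yuv->format = AV_PIX_FMT_YUV420P;
        yuv->width  = width;
        yuv->height = height;
        if (av_frame_get_buffer(yuv, 0) < 0) {   /* fills data[] and linesize[] */
            av_frame_free(&yuv);
            return NULL;
        }

        struct SwsContext *sws = sws_getContext(width, height, AV_PIX_FMT_RGB24,
                                                width, height, AV_PIX_FMT_YUV420P,
                                                SWS_BILINEAR, NULL, NULL, NULL);
        if (!sws) {
            av_frame_free(&yuv);
            return NULL;
        }

        /* The source stride is width * 3 because these RGB rows are tightly
         * packed; if your rows carry padding, pass the padded value instead. */
        const uint8_t *src_data[4] = { rgb_data, NULL, NULL, NULL };
        int src_linesize[4] = { width * 3, 0, 0, 0 };

        sws_scale(sws, src_data, src_linesize, 0, height,
                  yuv->data, yuv->linesize);

        sws_freeContext(sws);
        return yuv;
    }

The point is that sws_scale reads the source rows using the stride you pass in src_linesize and writes into yuv->data using the yuv->linesize values the allocator chose, so any padding on either side is accounted for rather than tripping the width-versus-stride check.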