Play and render stream using audio queues

I'm currently playing a stream in my iOS app, and one feature we'd like to add is visualization of the output waveform. I use an output audio queue to play the stream, but have found no way to read the output buffer. Can this be achieved with audio queues, or should it be done with a lower-level API?


To visualize, you presumably need PCM (uncompressed) data, so if you're pushing a compressed format like MP3 or AAC into the queue, you never see the data you need. If you were working with PCM (say, because you're decompressing it yourself with the Audio Converter APIs), then you could visualize the samples before putting them into the queue. But then the problem would be latency: you want to visualize samples when they play, not when they go into the queue.
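
As a minimal sketch of that pre-enqueue approach, assuming the queue is fed 16-bit signed integer PCM: the `peakLevel` helper below is hypothetical, and you'd call it from your output callback just before `AudioQueueEnqueueBuffer`.

```swift
import AudioToolbox

// Hypothetical helper: inspect a filled output buffer just before it is
// enqueued, assuming 16-bit signed integer PCM in the buffer.
// Returns the normalized peak amplitude in 0.0 ... 1.0.
func peakLevel(of buffer: AudioQueueBufferRef) -> Float {
    let sampleCount = Int(buffer.pointee.mAudioDataByteSize) / MemoryLayout<Int16>.size
    let samples = buffer.pointee.mAudioData.bindMemory(to: Int16.self, capacity: sampleCount)
    var peak: Int32 = 0
    for i in 0..<sampleCount {
        // Widen to Int32 so abs(Int16.min) cannot overflow.
        peak = max(peak, abs(Int32(samples[i])))
    }
    return Float(peak) / Float(Int16.max)
}
```

Note the caveat above: this measures the data when it is enqueued, not when it is heard, so the visualization will lead the audio by however much buffered data is in flight.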

For latency reasons alone, you probably want to be using audio units.
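
As one sketch of that direction, the modern AVAudioEngine API (which is built on top of audio units) lets you install a tap that hands you PCM buffers as they flow toward the output, so you can derive visualization data much closer to actual playback time. The RMS computation here is just one example of what you might extract:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: nil)

// Tap the mixer output: the block receives each PCM buffer as it is
// rendered, much closer to what the listener actually hears.
engine.mainMixerNode.installTap(onBus: 0, bufferSize: 1024, format: nil) { buffer, _ in
    guard let channel = buffer.floatChannelData?[0] else { return }
    let n = Int(buffer.frameLength)
    var sumOfSquares: Float = 0
    for i in 0..<n { sumOfSquares += channel[i] * channel[i] }
    let rms = n > 0 ? (sumOfSquares / Float(n)).squareRoot() : 0
    DispatchQueue.main.async {
        // Feed `rms` (or the raw samples) to your waveform view here.
    }
}

try engine.start()
player.play()  // schedule your decoded stream buffers on `player`
```

The tap sits a layer above raw audio units, but it shows the shape of the render-side approach: read the samples where they are played, not where they are enqueued.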


It can't actually be done with audio queues alone; to do this, I need to implement the streamer with audio units.
