How to use Core Audio's Clock API?
I'm trying to synchronize a visualizer with an audio track that's being played with the generator audio unit (subtype audioFilePlayer) in an AUGraph.
I would like to use Core Audio's Clock API, but there's not much info out there. I found this, and this.
Does anyone know of a good example in English, or any docs on this API?
The answer is that there is almost no documentation; the only reference I found was an Apple mailing-list post stating that it's not a fully developed API.
Instead, if you need audio clock data, register a render notification callback on your generator audio unit, like this:
AudioUnitAddRenderNotify(m_generatorAudioUnit, auRenderCallback, this);

OSStatus auRenderCallback (
    void                        *inRefCon,
    AudioUnitRenderActionFlags  *ioActionFlags,
    const AudioTimeStamp        *inTimeStamp,
    UInt32                       inBusNumber,
    UInt32                       inNumberFrames,
    AudioBufferList             *ioData
)
{
    // inRefCon is the context pointer passed to AudioUnitAddRenderNotify.
    AudioEngineModel* pAudioEngineModel = (AudioEngineModel*)inRefCon;

    // mSampleTime is the audio unit's current position in sample frames.
    pAudioEngineModel->m_f64SampleTime = inTimeStamp->mSampleTime;

    return noErr;
}
You can get seconds by dividing the mSampleTime by the sampleRate.
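For example, here is a minimal sketch of that conversion. The 44100.0 Hz sample rate is an assumption for illustration; in a real app you would read the rate from the audio unit's stream format (e.g. via AudioUnitGetProperty with kAudioUnitProperty_StreamFormat).

    #include <stdio.h>

    /* Convert a sample-frame position to seconds by dividing by the
       sample rate, as described above. */
    static double SampleTimeToSeconds(double sampleTime, double sampleRate)
    {
        return sampleTime / sampleRate;
    }

    int main(void)
    {
        /* 132300 frames at 44.1 kHz is exactly 3 seconds. */
        double seconds = SampleTimeToSeconds(132300.0, 44100.0);
        printf("%.1f\n", seconds);
        return 0;
    }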
Have you seen this site: http://www.cocoadev.com/index.pl?CoreAudioAndAudioUnitsTutorial