I am writing an audio player for OS X. One view is a custom view that displays a waveform. The waveform is stored as an instance variable of type NSImage with an NSBitmapImageRep.
I'm trying to write raw audio bytes to a file using AudioFileWriteBytes(). Here's what I'm doing: `void writeSingleChannelRingBufferDataToFileAsSInt16(AudioFileID audioFileID, AudioConverterRef aud…)`
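For illustration, here is a minimal sketch of the float-to-SInt16 conversion a writer like this typically needs before handing bytes to AudioFileWriteBytes. The helper name and the assumption of normalized float input in [-1.0, 1.0] are mine, not from the question; the Core Audio call itself is shown only in a comment since it requires the AudioToolbox framework:

```c
#include <stdint.h>
#include <stddef.h>

// Hypothetical helper: convert normalized float samples [-1.0, 1.0]
// to 16-bit signed integers with clipping -- the layout a file opened
// with a 16-bit signed-integer PCM ASBD expects.
static void floatToSInt16(const float *in, int16_t *out, size_t count) {
    for (size_t i = 0; i < count; i++) {
        float s = in[i];
        if (s >  1.0f) s =  1.0f;   // clip instead of wrapping
        if (s < -1.0f) s = -1.0f;
        out[i] = (int16_t)(s * 32767.0f);
    }
}

// With the converted buffer in hand, the write itself would look like:
//   UInt32 numBytes = (UInt32)(count * sizeof(SInt16));
//   AudioFileWriteBytes(audioFileID, false, byteOffset, &numBytes, out);
// (byteOffset advances by numBytes on each call.)
```

Note that AudioFileWriteBytes bypasses any format conversion, so the buffer must already match the file's data format exactly; AudioFileWritePackets with an AudioConverterRef is the alternative when it does not.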
I am streaming an MP3 over the network using custom feeding code rather than AVAudioPlayer (which only works with URLs), using APIs such as AudioFileStreamOpen.
I'm making an iPhone game application using Core Audio and Extended Audio File Services. It works fine, but the first call to AudioOutputUnitStart takes about 1–2 seconds.
I'm recording a continuous stream of data using Audio Queue Services. It is my understanding that the callback will only be called when the buffer fills with data.
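A sketch of the buffer-size arithmetic that governs how often an input callback fires may help here: with linear PCM, each queue buffer holds a fixed number of bytes, so buffer size divided by the data rate is the nominal callback interval. The function name and parameters below are my own illustration, not an Audio Queue API:

```c
#include <stdint.h>

// Hypothetical helper: bytes needed so that each AudioQueue buffer
// holds `seconds` of LPCM audio. Passing the result as the size to
// AudioQueueAllocateBuffer sets the nominal callback cadence.
static uint32_t bufferSizeForDuration(double sampleRate,
                                      uint32_t bytesPerFrame,
                                      double seconds) {
    return (uint32_t)(sampleRate * seconds) * bytesPerFrame;
}
```

For example, mono SInt16 at 44.1 kHz (2 bytes per frame) with 0.5-second buffers works out to 44,100 bytes per buffer, so the callback should arrive roughly twice per second once the queue is running.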
I am trying to control the channel (left/right) from which an audio file is played, and its volume. It would be great if someone could explain how this can be done or point me to some documentation.
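Assuming access to the raw interleaved stereo samples (for example, inside a render callback), one way to get both effects is a per-channel gain. The helper below is a hypothetical sketch, not a Core Audio API; AVAudioPlayer's `pan` and `volume` properties are the higher-level route when sample access isn't needed:

```c
#include <stdint.h>
#include <stddef.h>

// Hypothetical helper: apply independent gains (0.0 to 1.0) to the
// left and right channels of interleaved stereo SInt16 frames.
// gainL = 0 silences the left speaker; scaling both adjusts volume.
static void applyChannelGains(int16_t *samples, size_t frames,
                              float gainL, float gainR) {
    for (size_t i = 0; i < frames; i++) {
        samples[2 * i]     = (int16_t)(samples[2 * i]     * gainL);
        samples[2 * i + 1] = (int16_t)(samples[2 * i + 1] * gainR);
    }
}
```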
I am writing an iPhone application that needs to record audio from the built-in microphone and then send that audio data to a server for processing.
I am not sure whether audio units can work as codecs in a streaming-audio scenario on the iPhone. I've read in various places that it can be done, but I haven't seen any examples or proper documentation.
I'd like to be able to play back audio I've recorded using AVAudioRecorder at 1.5× or 2.0× speed. I don't see anything in AVAudioPlayer that supports that. I'd appreciate some suggestions.
Can anyone point me in the right direction on how to minimize ambient noise while recording someone speaking using the iPhone SDK and Core Audio? I'm guessing a band-pass filter that eliminates anything outside the vocal range would help.