by ajt » February 18th, 2010, 11:23 pm
Sorry for the seemingly dumb question.
Since PCM is an audio encoding (not a storage format), and WAV files (for example) lay data out in a specific way, I assumed the data was stored as in the data chunk of a WAV file. My deinterleave code wasn't working, so I started looking for specific definitions in the code and readme.htm so I could fix it. The Help doesn't actually say the data is stored in "PCM format"; it only refers to it as wave data, which is a broad term and doesn't define how the application buffers are filled in any case.
The following definition of the DataReady event still leaves questions, even if I assume the buffer is structured like a WAV file's data chunk.
In the event, the pointer to the buffer is typeless, so is BufferSize in bytes, words, or samples? I assume it is bytes, and my deinterleave code walks through the buffer in signed 16-bit words. Also, TLiveAudioRecorder has a property called BufferCount, defined only as "Determines the number of data buffers". What exactly is this property? (I have set it to 2 for stereo, but that's only a guess.)
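To show what I'm assuming, here is the logic of my deinterleave code, sketched in C for clarity. This is my guess, not the documented behavior: it assumes BufferSize is in bytes and the buffer holds 16-bit signed stereo samples interleaved L,R,L,R... exactly as in a WAV data chunk.

```c
#include <stddef.h>
#include <stdint.h>

/* Sketch of my deinterleave logic. ASSUMPTIONS (unconfirmed by the docs):
   - bufferSize is in BYTES
   - samples are 16-bit signed, interleaved left/right as in a WAV data chunk
   Frame count = bufferSize / (2 channels * 2 bytes per sample). */
static void deinterleave_s16_stereo(const void *buffer, size_t bufferSize,
                                    int16_t *left, int16_t *right)
{
    const int16_t *samples = (const int16_t *)buffer;
    size_t frames = bufferSize / 4; /* 2 channels x 2 bytes each */
    for (size_t i = 0; i < frames; ++i) {
        left[i]  = samples[2 * i];     /* even words -> left channel  */
        right[i] = samples[2 * i + 1]; /* odd words  -> right channel */
    }
}
```

If BufferSize were actually in words or samples, the frame count above would be wrong by a factor of 2 or 4, which would explain the behavior I'm seeing.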
TWaveAudioDataReadyEvent = procedure(Sender: TObject; const Buffer: Pointer; BufferSize: DWORD; var FreeIt: Boolean) of object;
This event type is used for wave audio events that inform the caller about recorded wave audio data. The wave data is stored in the buffer specified by the Buffer parameter, and the size of the actual data in the buffer is given by the BufferSize parameter. If the FreeIt parameter is set to True, the component will release the memory allocated for the buffer; otherwise your application must release it.
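If the buffer really is raw PCM as in a WAV data chunk, then the relationship I'd expect between BufferSize and the recording format (using the standard WAVEFORMATEX fields, nChannels and wBitsPerSample) is sketched below; this is my own arithmetic, not anything stated in the Help:

```c
#include <stdint.h>

/* My assumption: BufferSize is in bytes, and one sample frame occupies
   nChannels * (wBitsPerSample / 8) bytes (i.e. nBlockAlign). */
static uint32_t frames_in_buffer(uint32_t bufferSizeBytes,
                                 uint16_t nChannels, uint16_t wBitsPerSample)
{
    uint32_t blockAlign = nChannels * (wBitsPerSample / 8u);
    return bufferSizeBytes / blockAlign; /* number of complete sample frames */
}
```

For 16-bit stereo, a 4096-byte buffer would then hold 1024 frames. If someone can confirm (or correct) this interpretation, my deinterleave problem should be easy to pin down.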