lefjucabro

Record and process audio


Hello there,

 

I am trying to develop an iOS app with Rio. The app must record audio and process it in real time to display some graphs.

I am a newbie with the FMX platform and iOS, and there is a lot of information I don't understand.

 

There is the iOSapi.AVFoundation unit, which contains a lot of types (TAVAudioSession, TAVAudioRecorder), but I cannot figure out how to use them.

 

From what I understand, I have to define an audio format (2 channels stereo, 16 bits), get the buffer size, create a recording task, and then start recording.

In a thread, I would read the samples and write them into a TMemoryStream for processing.

 

Do you think this is correct? Do you have any idea how I can do this?

 

Kind regards

 

Lefjucabro


Don't overthink it, it's all in FMX.

 

uses
  FMX.Media;

var
  FMicrophone: TAudioCaptureDevice;

  // Get the default microphone device
  FMicrophone := TCaptureDeviceManager.Current.DefaultAudioCaptureDevice;

  try
    if Assigned(FMicrophone) and
       (FMicrophone.State = TCaptureDeviceState.Stopped) then
    begin
      // The captured audio is written to this file
      FMicrophone.FileName := AFileName; // e.g. GetAudioFileName(AUDIO_FILENAME)
      FMicrophone.StartCapture;
    end;
  except
    on E: Exception do
      LTxt := E.Message; // LTxt: a local string variable for the error text
  end;
    

And you have to provide the NSMicrophoneUsageDescription key in the version info (which ends up in the app's Info.plist).
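For reference, the resulting fragment in the generated Info.plist looks like this (the description string is just an example; iOS shows it to the user in the permission prompt):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app records audio from the microphone to display real-time graphs.</string>
```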

 

Edited by Rollo62


Hello Rollo62,

 

Thank you for your answer.

 

I was also thinking of using a TAudioCaptureDevice, but it seems impossible to access the data before the end of the recording.

 

During recording, I have to get the data each time my buffer is filled, so I can process it.

 


We have developed a low-level audio input solution for Windows, macOS and iOS. It allows us to process audio immediately when each buffer is filled and delivered to our code. I cannot share my code here, but here are the API functions we use on iOS:

AudioQueueNewInput

AudioQueueAllocateBuffer

AudioQueueEnqueueBuffer

AudioQueueStart

AudioQueueStop
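To sketch how these calls fit together, here is a rough outline in Delphi, assuming header translations whose signatures mirror Apple's C API. HandleInputBuffer and kBufferByteSize are placeholders: the real callback would copy the PCM data out of the buffer (e.g. into a TMemoryStream) and then re-enqueue the buffer.

```pascal
const
  kNumberBuffers  = 3;
  kBufferByteSize = 4096; // placeholder; size this from your format and latency needs

var
  Format: AudioStreamBasicDescription;
  Queue: AudioQueueRef;
  Buffers: array[0..kNumberBuffers - 1] of AudioQueueBufferRef;
  I: Integer;
begin
  // Describe the capture format: 44.1 kHz, 16-bit, 2-channel linear PCM
  FillChar(Format, SizeOf(Format), 0);
  Format.mSampleRate       := 44100;
  Format.mFormatID         := kAudioFormatLinearPCM;
  Format.mFormatFlags      := kLinearPCMFormatFlagIsSignedInteger or
                              kLinearPCMFormatFlagIsPacked;
  Format.mChannelsPerFrame := 2;
  Format.mBitsPerChannel   := 16;
  Format.mBytesPerFrame    := Format.mChannelsPerFrame * (Format.mBitsPerChannel div 8);
  Format.mFramesPerPacket  := 1;
  Format.mBytesPerPacket   := Format.mBytesPerFrame;

  // HandleInputBuffer (hypothetical) is invoked each time a buffer has been filled
  AudioQueueNewInput(@Format, @HandleInputBuffer, nil, nil, nil, 0, @Queue);

  // Allocate and enqueue a few buffers so capture can rotate through them
  for I := 0 to kNumberBuffers - 1 do
  begin
    AudioQueueAllocateBuffer(Queue, kBufferByteSize, @Buffers[I]);
    AudioQueueEnqueueBuffer(Queue, Buffers[I], 0, nil);
  end;

  AudioQueueStart(Queue, nil);
  // ... and later, when finished: AudioQueueStop(Queue, True);
end;
```

This is a sketch under those assumptions, not working code; check the return values (OSStatus) of every call in a real implementation.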


On 5/22/2020 at 11:06 AM, Hans♫ said:

We have developed a low-level audio input solution for Windows, macOS and iOS. It allows us to process audio immediately when each buffer is filled and delivered to our code. I cannot share my code here, but here are the API functions we use on iOS:

AudioQueueNewInput

AudioQueueAllocateBuffer

AudioQueueEnqueueBuffer

AudioQueueStart

AudioQueueStop

Thank you, it looks like what I want to do. I will check these hints.

Edited by lefjucabro

On 5/22/2020 at 11:06 AM, Hans♫ said:

We have developed a low-level audio input solution for Windows, macOS and iOS. It allows us to process audio immediately when each buffer is filled and delivered to our code. I cannot share my code here, but here are the API functions we use on iOS:

AudioQueueNewInput

AudioQueueAllocateBuffer

AudioQueueEnqueueBuffer

AudioQueueStart

AudioQueueStop

I didn't find these functions in the iOSapi.AVFoundation or iOSapi.CoreAudio units in Rio 10.3.3. Did you use these functions with Rio 10.3.3?

 

 

On 5/20/2020 at 7:40 PM, Rollo62 said:

Maybe you're looking for something like AVAudioEngine. As the blog said, this is a much more tricky API, and I wouldn't count on FMX being able to do that out of the box.

https://developer.apple.com/documentation/avfoundation/avaudioengine

 

If not in the FMX library, then have a look at FmxExpress headers.

AVAudioEngine is also missing from iOSapi.AVFoundation.pas in Rio 10.3.3. Do I have an old version of the file installed?

Edited by lefjucabro


All the AudioQueue... functions are in iOSapi.AudioToolbox. Do you know how to convert the SDK headers yourself? Or else I can send you my version of the file.
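For illustration, a hand-translated import in such a header unit typically looks something like this (a sketch only; the exact type aliases come from the CoreAudio/CocoaTypes translations, where OSStatus is Int32):

```pascal
const
  libAudioToolbox = '/System/Library/Frameworks/AudioToolbox.framework/AudioToolbox';

type
  AudioQueueRef = Pointer; // opaque handle, as in Apple's C headers

// inStartTime is a PAudioTimeStamp in the full translation; nil means "start now"
function AudioQueueStart(inAQ: AudioQueueRef; inStartTime: Pointer): OSStatus; cdecl;
  external libAudioToolbox name _PU + 'AudioQueueStart';
```

Each C function in the framework gets one such external declaration, plus Delphi records mirroring the C structs it uses.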


Hi Hans,

Indeed, I don't know how to convert the SDK headers (I really am a beginner). If you have a link that explains it, or could send me your file if that is OK with you, that would be great.

Regards

2 hours ago, lefjucabro said:

... (I really am a beginner). If you have a link that explains it, or could send me your file if that is OK with you, that would be great.

I have sent a PM with the file, but you are far from done with just this file. The best advice is to find Objective-C code that does what you want, and then convert it to Delphi, which might include a few headaches.
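As a small example of what such a conversion looks like (the exact wrapper names and signatures depend on your header translation, so treat this as a sketch):

```pascal
// Objective-C original:
//   [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];

// Delphi equivalent, in the iOSapi.* Objective-C bridge style:
// class methods go through OCClass, and the instance is accessed via Wrap()
TAVAudioSession.Wrap(TAVAudioSession.OCClass.sharedInstance)
  .setCategory(AVAudioSessionCategoryRecord, nil);
```

The bracketed message sends map one-to-one onto method calls on the wrapped interface, which is why having working Objective-C sample code makes the translation much easier.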


Thank you Hans. I did some tests, but I am not able to compile with the imported files:

I got iOSapi.AudioToolbox.pas.

 

I had to get iOSapi.AudioUnit.pas, iOSapi.CoreAudioTypes.pas and iOSapi.CoreMidi.pas, which are referenced by iOSapi.AudioToolbox.pas.

 

I also copied the frameworks from my iMac into "Embarcadero\Studio\SDKs\iPhoneOS13.4.sdk\System\Library\Frameworks" and get this error when compiling:

"[DCC Error] E2597 NYI lto::archName

ld: file was built for  which is not the architecture being linked (arm64): D:\Documents\Embarcadero\Studio\SDKs\iPhoneOS13.4.sdk/System/Library/Frameworks/AudioToolbox.framework/AudioToolbox.tbd for architecture arm64".

 

I am aware that the files and frameworks are not the same versions, but I don't know how to resolve it.

 

 

5 hours ago, lefjucabro said:

I had to get iOSapi.AudioUnit.pas, iOSapi.CoreAudioTypes.pas and iOSapi.CoreMidi.pas, which are referenced by iOSapi.AudioToolbox.pas.

Sorry I didn't see that. I have sent a PM with those headers too.

However, there are a lot of details to figure out before you can achieve what you want. Having the headers is just a small part of it. With a lot of Googling, reading and persistence - eventually you will get through 😉 Keep up the spirit!

1 hour ago, Hans♫ said:

Sorry I didn't see that. I have sent a PM with those headers too.

Thanks Hans


Excuse me, but I misspoke: I needed the xxx files, and I could find them on the web.

The ones you sent me are more recent.

After adding your files to my project, I also added the corresponding frameworks from the iMac to the "Embarcadero\Studio\SDKs\iPhoneOS13.4.sdk\System\Library\Frameworks" directory. I still have this issue when compiling 😭

"[DCC Error] E2597 NYI lto::archName

ld: file was built for  which is not the architecture being linked (arm64): D:\Documents\Embarcadero\Studio\SDKs\iPhoneOS13.4.sdk/System/Library/Frameworks/AudioToolbox.framework/AudioToolbox.tbd for architecture arm64".

 

So I tried to build up-to-date files following this page, but I get errors when compiling them...

 

1 hour ago, Hans♫ said:

However, there are a lot of details to figure out before you can achieve what you want. Having the headers is just a small part of it. With a lot of Googling, reading and persistence - eventually you will get through 😉 Keep up the spirit!

Unfortunately, I know it is far from won 😀

 

 

6 hours ago, lefjucabro said:

So I tried to build up-to-date files following this page, but I get errors when compiling them...

It's because the tools are not as accurate as they could be, and the resulting source files often require adjustment. For iOS/macOS, at least some knowledge of translating Objective-C to Delphi is essential.

