Recording audio (iOS)?

Hi there,
I’m interested in adding a new feature to my app (designed for iOS):
audio recording.

My question is more about best practices and advice for this.

I’d like users to be able to grab the recorded files easily, so what’s the best way to go about that?

any help would be appreciated :slight_smile:

+1, been trying some of the addons but mostly getting compile errors :frowning:

What exactly are you trying to do? Are you using ofSoundStream? You have to use Audio File Services with ofSoundStream.

I’m trying to record small sections of audio from the mic as wav/mp3 and loop/play them back.

I come from Processing and have recently ventured into OFX, so I’m a bit rusty still! Could you point out any tutorials that teach how to integrate Audio File Services in an OFX for iOS Xcode project?

Thanks!

Here is a simple example that records audio, saves it to a WAV file named “audio.wav”, and plays audio.wav back in a loop. You need to add ofxGui to your project for this example to work. It overwrites the audio every time you press record. I only tested this in the simulator.

Put this in ofApp.h

#pragma once

#include "ofMain.h"
#include "ofxiOS.h"
#include "ofxiOSExtras.h"
#include "ofxGui.h"
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>

class ofApp : public ofxiOSApp {
    
    public:
        void setup();
        void update();
        void draw();
        void exit();
    
        void touchDown(ofTouchEventArgs & touch);
        void touchMoved(ofTouchEventArgs & touch);
        void touchUp(ofTouchEventArgs & touch);
        void touchDoubleTap(ofTouchEventArgs & touch);
        void touchCancelled(ofTouchEventArgs & touch);

        void lostFocus();
        void gotFocus();
        void gotMemoryWarning();
        void deviceOrientationChanged(int newOrientation);
    
    void recordPressed();
    void playPressed();
    void stopPressed();
    
    NSString* getAudioFilePath();
    void setupAudioFile();
    void audioIn( float * input, int bufferSize, int nChannels );
    
    
    AVAudioPlayer *player;
    ofBuffer recordBuffer;
    bool isRecording;
    bool isPaused;
    bool isPlaying;
    
    ofxPanel panel;
    ofxButton record;
    ofxButton play;
    ofxButton stop;
    
    NSURL *fileUrl;
    AudioFileTypeID fileType;
    AudioStreamBasicDescription fileFormat;
    AudioFileID fileID;
   
};

Put this in ofApp.cpp

#include "ofApp.h"

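// helper following the usual Core Audio error-checking idiom:
// prints the OSStatus as a four-character code (or a plain number) and bails out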
static void checkError(OSStatus error,const char *operation){
    if (error == noErr) {
        return;
    }
    char errorString[20];
    *(UInt32*)(errorString + 1) = CFSwapInt32HostToBig(error);
    if (isprint(errorString[1]) && isprint(errorString[2]) && isprint(errorString[3]) && isprint(errorString[4])) {
        errorString[0] = errorString[5] = '\'';
        errorString[6] = '\0';
        
    }else{
        sprintf(errorString, "%d",int(error));
    }
    fprintf(stderr, "Error: %s (%s)\n",operation,errorString);
    exit(1);
    
}

//--------------------------------------------------------------
void ofApp::setup(){
    isRecording = false;
    isPlaying = false;
    isPaused = true;
    player = nil; // so the first [player release] in playPressed is a harmless no-op
    
    ofSoundStreamSetup(0, 1);
    panel.setup();
    panel.add(record.setup("record"));
    panel.add(play.setup("play"));
    panel.add(stop.setup("stop"));
    record.addListener(this, &ofApp::recordPressed);
    play.addListener(this, &ofApp::playPressed);
    stop.addListener(this, &ofApp::stopPressed);
    setupAudioFile();
    
}

//--------------------------------------------------------------
void ofApp::update(){
    if (isRecording) {
        ofBackground(ofColor::red);
    }else if (isPlaying){
        ofBackground(ofColor::green);
    }else{
        ofBackground(ofColor::grey);
    }
}

//--------------------------------------------------------------
void ofApp::draw(){
    panel.draw();
}

//--------------------------------------------------------------
void ofApp::exit(){
    [fileUrl release];
    [player release];
}

//--------------------------------------------------------------
void ofApp::touchDown(ofTouchEventArgs & touch){

}

//--------------------------------------------------------------
void ofApp::touchMoved(ofTouchEventArgs & touch){

}

//--------------------------------------------------------------
void ofApp::touchUp(ofTouchEventArgs & touch){

}

//--------------------------------------------------------------
void ofApp::touchDoubleTap(ofTouchEventArgs & touch){

}

//--------------------------------------------------------------
void ofApp::touchCancelled(ofTouchEventArgs & touch){
    
}

//--------------------------------------------------------------
void ofApp::lostFocus(){

}

//--------------------------------------------------------------
void ofApp::gotFocus(){

}

//--------------------------------------------------------------
void ofApp::gotMemoryWarning(){

}

//--------------------------------------------------------------
void ofApp::deviceOrientationChanged(int newOrientation){

}

//--------------------------------------------------------------
void ofApp::audioIn(float *input, int bufferSize, int nChannels){
    if (isRecording) {
        // bufferSize is in frames, so the byte count needs the channel count as well
        recordBuffer.append((const char*)input, bufferSize * nChannels * sizeof(float));
    }
}

//--------------------------------------------------------------
void ofApp::recordPressed(){
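    // reset any previous state, then start the input stream;
    // audioIn() keeps appending samples to recordBuffer while isRecording is true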
    stopPressed();
    isRecording = true;
    isPaused = false;
    isPlaying = false;
    ofSoundStreamStart();
}

//--------------------------------------------------------------
void ofApp::playPressed(){
    stopPressed();
    [player release];
    player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileUrl error:nil];
    player.numberOfLoops = -1; // loop forever; set before play so the first playback loops too
    [player play];
    isRecording = false;
    isPlaying = true;
    isPaused = false;
}

//--------------------------------------------------------------
void ofApp::stopPressed(){
    if (isRecording) {
        ofSoundStreamStop();
        checkError(AudioFileCreateWithURL((__bridge CFURLRef)fileUrl, fileType, &fileFormat, kAudioFileFlags_EraseFile, &fileID), "Creating Audio File");
        // AudioFileCreateWithURL already returns an open, writable fileID,
        // so the file does not need to be opened a second time before writing
        UInt32 bytesToWrite = recordBuffer.size();
        checkError(AudioFileWriteBytes(fileID, false, 0, &bytesToWrite, (void*)recordBuffer.getBinaryBuffer()), "writing to audio file");
        ofLog() << bytesToWrite;
        checkError(AudioFileClose(fileID), "closing audio file");
        recordBuffer.clear();
    }
    if (isPlaying) {
        [player stop];
    }
    isRecording = false;
    isPlaying = false;
    isPaused = true;
   
}

//--------------------------------------------------------------

NSString* ofApp::getAudioFilePath(){
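    // save into the app's Documents directory; if you add UIFileSharingEnabled to Info.plist,
    // users can grab the recorded files from there via iTunes File Sharing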
    NSArray *searchPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsPath = [searchPaths objectAtIndex:0];
    NSString *fileName = [NSString stringWithFormat:@"%@/audio.wav",documentsPath];
    return fileName;
}
//--------------------------------------------------------------

void ofApp::setupAudioFile(){
    fileUrl = [[NSURL alloc] initFileURLWithPath:getAudioFilePath()];
    fileType = kAudioFileWAVEType;
    memset(&fileFormat, 0, sizeof(AudioStreamBasicDescription));
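    // describe what we record: mono, packed 32-bit float linear PCM at 44.1 kHz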
    fileFormat.mSampleRate = 44100.0;
    fileFormat.mFormatID = kAudioFormatLinearPCM;
    fileFormat.mFormatFlags = kAudioFormatFlagsNativeFloatPacked;
    fileFormat.mBitsPerChannel = sizeof(Float32) * 8;
    fileFormat.mBytesPerFrame = sizeof(Float32);
    fileFormat.mChannelsPerFrame = 1;
    fileFormat.mFramesPerPacket = 1;
    fileFormat.mBytesPerPacket = sizeof(Float32);
}

Let me know if you have any questions.

Thanks Ahbee, that’s a great help! With the help of your code I have managed to implement record and playback of one file, but I’ve been hitting a brick wall when trying to play many sounds simultaneously, which according to all the stuff I’ve read is possible. Any tips on achieving that?

If you want to play multiple sounds, you have to create multiple instances of AVAudioPlayer, one for each sound.
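Roughly something like this (just a sketch, I haven't tested it; the players vector and loadAndPlayAll are placeholder names, and the paths are wherever you saved your files):

// keep one AVAudioPlayer per recorded file (untested sketch)
std::vector<AVAudioPlayer*> players;

void loadAndPlayAll(const std::vector<NSString*> &paths){
    for (NSString *path : paths) {
        NSURL *url = [NSURL fileURLWithPath:path];
        AVAudioPlayer *p = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
        [p prepareToPlay];
        [p play];                 // each player owns its own file, so they overlap freely
        players.push_back(p);
    }
}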

There are only two things you need to know:

  1. How to record audio to a file. You do this by saving the incoming bytes to a buffer while recording is on. When the user presses stop, you copy the buffer to a file, then clear the buffer (see the stopPressed method).
  2. How to play it back. You use an AVAudioPlayer to play each sound.

There is actually an AVAudioRecorder class, and that is probably the easiest way to record. So forget everything I said before: just use AVAudioRecorder to record and AVAudioPlayer to play. That way you don't even need to worry about sound streams and buffers.
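The basic shape is something like this (a rough, untested sketch; it reuses the getAudioFilePath() helper from the example above, and the settings are just one possible linear PCM configuration):

// record straight to a WAV file with AVAudioRecorder (untested sketch)
NSURL *url = [NSURL fileURLWithPath:getAudioFilePath()];

[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
[[AVAudioSession sharedInstance] setActive:YES error:nil];

NSDictionary *settings = @{ AVFormatIDKey          : @(kAudioFormatLinearPCM),
                            AVSampleRateKey        : @44100.0,
                            AVNumberOfChannelsKey  : @1,
                            AVLinearPCMBitDepthKey : @16 };

AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:nil];
[recorder prepareToRecord];
[recorder record];            // ... and later [recorder stop];

// playback stays the same as before
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
[player play];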

Hi Ahbee,

Thanks a lot for your help! With it I have achieved multiple record and play using the traditional sound stream buffer recording approach plus ofSoundPlayers which load the recorded WAV files. My code is a bit scattered at the moment, but I’ll try to share what I have achieved so far in my next post.
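In the meantime, the playback side boils down to something like this (a rough sketch rather than my exact code; the paths are assumed to be absolute paths to the recorded WAVs in the Documents folder):

// one ofSoundPlayer per recorded WAV (rough sketch)
std::vector<ofSoundPlayer> players;

void loadRecordings(const std::vector<std::string> &paths){
    players.resize(paths.size());
    for (std::size_t i = 0; i < paths.size(); i++) {
        players[i].loadSound(paths[i]);   // use players[i].load(paths[i]) on newer OF versions
        players[i].setLoop(true);
    }
}

void playAll(){
    for (auto &p : players) {
        p.play();                         // the players mix, so everything plays at once
    }
}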