Linux webcam audio output


#1

I’m using a Magewell XI100DUSB-HDMI to pull HDMI video into my oF app. I’d like to also pull in and play the audio from that feed. So far, I can’t seem to get audio output using ofSoundStream. When I run the audioOutputExample, I get this:

HOST_OS=Linux
checking pkg-config libraries:   cairo zlib gstreamer-app-1.0 gstreamer-1.0 gstreamer-video-1.0 gstreamer-base-1.0 libudev freetype2 fontconfig sndfile openal openssl gl glu glew gtk+-3.0 libmpg123
Cannot connect to server socket err = No such file or directory
Cannot connect to server request channel
jack server is not running or cannot be started
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for 4294967295, skipping unlock
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for 4294967295, skipping unlock
[notice ] ofBaseSoundStream::printDeviceList:
[0] hw:HDA Intel PCH,0 [in:2 out:4] (default in) (default out)
[1] hw:HDA Intel PCH,1 [in:0 out:2]
[2] hw:HDA Intel PCH,3 [in:0 out:8]
[3] hw:XI100DUSB-HDMI,0 [in:2 out:0]
[4] default [in:32 out:32]

Cannot connect to server socket err = No such file or directory
Cannot connect to server request channel
jack server is not running or cannot be started
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for 4294967295, skipping unlock
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for 4294967295, skipping unlock

RtApiAlsa: dump hardware params just after device open:

ACCESS:  MMAP_INTERLEAVED RW_INTERLEAVED
FORMAT:  S16_LE S32_LE
SUBFORMAT:  STD
SAMPLE_BITS: [16 32]
FRAME_BITS: [32 64]
CHANNELS: 2
RATE: [44100 192000]
PERIOD_TIME: (83 185760)
PERIOD_SIZE: [16 8192]
PERIOD_BYTES: [128 65536]
PERIODS: [2 32]
BUFFER_TIME: (166 371520)
BUFFER_SIZE: [32 16384]
BUFFER_BYTES: [128 65536]
TICK_TIME: ALL

RtApiAlsa: dump hardware params after installation:

ACCESS:  RW_INTERLEAVED
FORMAT:  S32_LE
SUBFORMAT:  STD
SAMPLE_BITS: 32
FRAME_BITS: 64
CHANNELS: 2
RATE: 44100
PERIOD_TIME: (11609 11610)
PERIOD_SIZE: 512
PERIOD_BYTES: 4096
PERIODS: 4
BUFFER_TIME: (46439 46440)
BUFFER_SIZE: 2048
BUFFER_BYTES: 16384
TICK_TIME: 0

RtApiAlsa: dump software params after installation:

tstamp_mode: NONE
tstamp_type: MONOTONIC
period_step: 1
avail_min: 512
start_threshold: 512
stop_threshold: -1
silence_threshold: 0
silence_size: 4611686018427387904
boundary: 4611686018427387904

I get similar jack-related errors when I try to run an instance of ofSoundStream with the Magewell video feed (which tends to act like a webcam). ofVideoPlayer and ofSoundPlayer both output audio fine. Also, the audioInputExample seems to be pulling in audio from the Magewell as well (the waveform visualization responds).
Is this a general Linux audio issue, or something specific to my setup? Any help is much appreciated.

FYI, I’m using a low-budget Intel NUC with a Celeron processor.


#2

After a clean install of Lubuntu I have the audioOutputExample working. My confusion now is how to take the audio from the webcam (in this case the USB capture card) and send it out through the analog output. My current attempt is to use two instances of ofSoundStream:

    bufferSize = 256;

    // input stream: device 3 is the Magewell capture card
    soundIn.setDeviceID(3);
    soundIn.setup(this, 0, 2, 44100, bufferSize, 4); // 0 out, 2 in, 4 buffers; "this" registers the app as listener

    // output stream: device 0 is the onboard analog output
    soundOut.setDeviceID(0);
    soundOut.setup(this, 2, 0, 44100, bufferSize, 4); // 2 out, 0 in

    left.assign(bufferSize, 0.0);
    right.assign(bufferSize, 0.0);

and then, using the listeners, attempting to copy the input stream into the output stream:

    void ofApp::audioIn(float * input, int bufferSize, int nChannels){
        // deinterleave the stereo input into the left/right buffers
        for (int i = 0; i < bufferSize; i++){
            left[i]  = input[i * 2];
            right[i] = input[i * 2 + 1];
        }
    }

    void ofApp::audioOut(float * output, int bufferSize, int nChannels){
        // interleave the left/right buffers back into the output
        for (int i = 0; i < bufferSize; i++){
            output[i * nChannels    ] = left[i];
            output[i * nChannels + 1] = right[i];
        }
    }

I assume the device IDs are correct based on the output of aplay -l and arecord -l:

 card 0: PCH [HDA Intel PCH], device 0: ALC283 Analog [ALC283 Analog]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 0: PCH [HDA Intel PCH], device 1: ALC283 Digital [ALC283 Digital]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 0: PCH [HDA Intel PCH], device 3: HDMI 0 [HDMI 0]
  Subdevices: 1/1
  Subdevice #0: subdevice #0

and

card 0: PCH [HDA Intel PCH], device 0: ALC283 Analog [ALC283 Analog]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 1: XI100DUSBHDMI [XI100DUSB-HDMI], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0

What I’m seeing now is that the HDMI/USB capture device is busy:

RtApiAlsa::getDeviceInfo: snd_pcm_open error for device (hw:1,0), Device or resource busy.

I tried running the app without initializing the video grabber and got the same result. Any help is appreciated.


#3

Hi,

do you get the same error if you try to do that with mplayer?

mplayer -ao alsa:device=hw=1.0 output.mp3

Is there an active pulseaudio daemon, or a sound server of some type, like jack?


#4

Thanks, @kashim.

I got this error:

[AO_ALSA] alsa-lib: pcm_hw.c:1590:(snd_pcm_hw_open) open '/dev/snd/pcmC1D0p' failed (-2): No such file or directory
[AO_ALSA] Playback open error: No such file or directory
Failed to initialize audio driver 'alsa:device=hw=1.0'
Could not open/initialize audio device -> no sound.
Audio: no sound
Video: no video

That said, the device I’d like to use is input-only, and it works with the audioInputExample. I’m wondering now if the problem is that these are two different sound cards.


#5

I ended up rebuilding the system with Lubuntu (which I believe does not include PulseAudio). I re-installed oF and now the audio output examples are working. I am still not able to get the audio from the webcam to play out of the oF app, but I wonder if this is an issue with trying to use two sound cards (the Magewell and the default).


#6

using two sound cards should be fine. if you can hear the sound produced by the audioOutputExample and see the input from the webcam microphone in the audioInputExample, it should work.

sending one card’s input to another’s output is trickier than it seems, though: you usually have two different threads, so it’s not as simple as writing to a buffer from the input callback and then reading that buffer from the output callback.

first of all i would try mixing both examples into one and see if it still works, i.e. whether you can hear sound and can still see the input from the mic; something like the sketch below.
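A minimal version of that combination might look like this (the device IDs and the float* listener API are taken from the earlier posts; everything else is just an example, adjust to your setup):

    // ofApp.h (sketch)
    #include "ofMain.h"

    class ofApp : public ofBaseApp{
    public:
        void setup();
        void audioIn(float * input, int bufferSize, int nChannels);
        void audioOut(float * output, int bufferSize, int nChannels);

        ofSoundStream soundIn, soundOut;
        float inputLevel = 0; // rough input meter, just to check the callback fires
        float phase = 0;
    };

    // ofApp.cpp (sketch)
    void ofApp::setup(){
        soundIn.setDeviceID(3);  // Magewell, per the device list in the first post
        soundIn.setup(this, 0, 2, 44100, 256, 4);
        soundOut.setDeviceID(0); // onboard analog out
        soundOut.setup(this, 2, 0, 44100, 256, 4);
    }

    void ofApp::audioIn(float * input, int bufferSize, int nChannels){
        // RMS of the incoming buffer, as in audioInputExample
        float rms = 0;
        for(int i = 0; i < bufferSize * nChannels; i++){
            rms += input[i] * input[i];
        }
        inputLevel = sqrt(rms / (bufferSize * nChannels));
    }

    void ofApp::audioOut(float * output, int bufferSize, int nChannels){
        // 440Hz test tone, as in audioOutputExample
        for(int i = 0; i < bufferSize; i++){
            float sample = sin(phase) * 0.5f;
            phase += 440.0f * TWO_PI / 44100.0f;
            for(int c = 0; c < nChannels; c++){
                output[i * nChannels + c] = sample;
            }
        }
    }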

once you have that working, a simple solution would be to use an ofThreadChannel to send sound buffers from the input to the output, like:

    void ofApp::audioIn(ofSoundBuffer & buffer){
        channel.send(buffer);
    }

    void ofApp::audioOut(ofSoundBuffer & buffer){
        if(buffer.getTickCount() > 4){ // wait a few buffers to avoid underruns
            channel.receive(buffer);
        }else{
            buffer.set(0); // output silence until the delay has passed
        }
    }
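For reference, a sketch of the declarations the two listeners above assume (the device IDs are again just an example; ofSoundStream calls the ofSoundBuffer versions of audioIn/audioOut when the app is registered as the listener):

    // ofApp.h (sketch)
    #include "ofMain.h"
    #include "ofThreadChannel.h"

    class ofApp : public ofBaseApp{
    public:
        void setup(){
            soundIn.setDeviceID(3);  // capture card input
            soundIn.setup(this, 0, 2, 44100, 256, 4);
            soundOut.setDeviceID(0); // analog output
            soundOut.setup(this, 2, 0, 44100, 256, 4);
        }
        void audioIn(ofSoundBuffer & buffer);
        void audioOut(ofSoundBuffer & buffer);

        ofSoundStream soundIn, soundOut;
        // thread-safe queue carrying buffers between the two audio threads
        ofThreadChannel<ofSoundBuffer> channel;
    };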

the fewer buffers you wait, the lower the delay, but the more likely it is that at some point there are no buffers left to receive in the channel and you’ll hear some noises.

also, this will only work if the settings for both cards are exactly the same, for example both stereo, 44100Hz, 256 samples. if any of those parameters is different you’ll need to manually convert the samples from one format to another: resampling if the rate is different, or joining and cutting buffers to adjust the size. ofSoundBuffer has some utilities to do it, but it can be quite complicated to get perfectly right; see the sketch below for the resampling case.
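For example, if the capture card only opened at 48000Hz while the output runs at 44100Hz, something along these lines could go in audioIn before sending. This is a rough sketch using ofSoundBuffer::resample; note the resampled buffers will no longer be exactly 256 samples, so the buffer sizes would still need adjusting:

    void ofApp::audioIn(ofSoundBuffer & buffer){
        // hypothetical case: input card at 48000Hz, output card at 44100Hz
        if(buffer.getSampleRate() != 44100){
            // speed > 1 reads the source faster, shortening the buffer:
            // 48000/44100 input frames are consumed per output frame
            buffer.resample(buffer.getSampleRate() / 44100.0f);
            buffer.setSampleRate(44100);
        }
        channel.send(buffer);
    }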