Getting sound streamed through a PulseAudio sink | Audio visualization

I was playing around with some pretty things and decided to create a music visualizer. But for the second day now I can’t find any example or tip on how to capture the output the system is currently playing. I specifically do not want to play the music from inside the app, as my goal is to have something you could open and get a beautiful picture (with music playing from your favorite service).
As I understand it, there is no way to intercept the audio stream going to the speakers without additional setup, so I followed this answer to create a sink. There should now be a way to grab the stream coming through it for analysis. How can I do this?
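For anyone following along: assuming the same approach as that answer, a null sink can be loaded with pactl (the name record-and-play matches the sink mentioned later in this thread; any name works):

```bash
# Create a null sink: audio routed into it is discarded,
# but a matching ".monitor" source is created alongside it.
pactl load-module module-null-sink sink_name=record-and-play \
    sink_properties=device.description=record-and-play
```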

I listed the available devices (using code from the audioInputExample folder), and the only thing related to PulseAudio was:

[notice ] ofBaseSoundStream::printDeviceList: Api: Pulse
[notice ] ofBaseSoundStream::printDeviceList: [Pulse: 0] PulseAudio [in:2 out:2] (default in) (default out)

But even after rerouting the audio from Google Chrome to the created sink, I still didn’t receive any audio in the mentioned example - it only captured the microphone input.
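In case it helps anyone: assuming standard PulseAudio behaviour, recording applications capture from a source, and the default source is usually the microphone - which would explain the behaviour above. The null sink exposes a monitor source that carries whatever is routed into the sink; you can check that it exists:

```bash
# Every sink gets a matching ".monitor" source; that monitor,
# not the sink itself, is what a recording app must capture from.
pactl list short sources
# should include something like: record-and-play.monitor  module-null-sink.c  ...
```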

I’m ready to use any additional headers - I already tried “pulse” but couldn’t get it to do anything, and there are just no tutorials about it…

Hurray, I found a way!
The pavucontrol “Recording” tab lets you choose which input a recording application captures from, so pointing it at the output routed into the “record-and-play” sink is exactly what is needed.
Though if there is a way to do this without using pavucontrol to manually change the recording source, I would be grateful to hear it.
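For reference, the same rerouting can probably be done from the command line with standard pactl commands (untested here; record-and-play.monitor assumes the sink name used above):

```bash
# Option 1: make the sink's monitor the default source, so newly
# started recording streams capture it automatically.
pactl set-default-source record-and-play.monitor

# Option 2: move an already-running recording stream to the monitor.
pactl list short source-outputs                    # find the stream's index
pactl move-source-output <index> record-and-play.monitor
```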
