ofxMaxim & Delay lines

Hi there,
I tried to use the native delay line (maxiDelayline) from Maximilian.
It sounds very weird and I probably missed something.
By “weird”, I mean the result doesn’t sound like a proper, clean delay.

  
void testApp::audioRequested(float * output, int bufferSize, int nChannels){  
    for (int i = 0; i < bufferSize; i++){  
        mix=0;//we're adding up the samples each update and it makes sense to clear them each time first.  
          
        //so this first bit is just a basic metronome so we can hear what we're doing.  
          
        currentCount=(int)timer.phasor(8);//this sets up a metronome that ticks 8 times a second  
          
        if (lastCount!=currentCount) {//if we have a new timer int this sample, play the sound  
              
            if (voice==2) {  
                voice=0;  
            }  
              
            ADSR[voice].trigger(0, adsrEnv[0]);//trigger the envelope from the start  
            pitch[voice]=voice+1;  
            voice++;  
              
            lastCount=0;//reset so the next tick (currentCount==1) retriggers  
              
        }  
          
        //and this is where we build the synth  
          
        for (int v=0; v<2; v++) {//per-voice loop; renamed from i so it doesn't shadow the sample index  
              
            ADSRout[v]=ADSR[v].line(8,adsrEnv);//our ADSR env has 8 value/time pairs.  
              
            LFO1out[v]=LFO1[v].sinebuf(0.2);//this LFO is a sine wave at 0.2 Hz  
              
            VCO1out[v]=VCO1[v].pulse(55*pitch[v],0.6);//here's VCO1: a pulse wave at 55 Hz times the voice's pitch multiplier, with a pulse width of 0.6  
            VCO2out[v]=VCO2[v].pulse((110*pitch[v])+LFO1out[v],0.2);//here's VCO2: a pulse wave at 110 Hz times the pitch multiplier, with LFO modulation on the frequency, and width of 0.2  
              
            VCFout[v]=VCF[v].lores((VCO1out[v]+VCO2out[v])*0.5, 250+((pitch[v]+LFO1out[v])*1000), 10);//now we stick the VCOs into the VCF; pitch and the LFO set the cutoff (the ADSR isn't used here)  
              
            mix+=VCFout[v]*ADSRout[v]/6;//finally we apply the ADSR as an amplitude modulator and scale the voice down  
              
        }  
          
        delayout=(delay.dl(mix, delayTime, delayFeedback)*0.5);//maxiDelayline::dl(input, size in samples, feedback), halved  
  
        output[i*nChannels    ] = (mix+delayout)*0.5*0.5;//interleaved stereo out: dry + wet, attenuated to avoid clipping  
        output[i*nChannels + 1] = (mix+delayout)*0.5*0.5;  
    }  
  
}  
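
For completeness, the declarations this snippet relies on look roughly like this in testApp.h — the types are the standard ofxMaxim classes; the exact size of adsrEnv is my own sketch (8 value/time pairs need 16 slots, filled in setup()):

maxiOsc timer, VCO1[2], VCO2[2], LFO1[2];  
maxiEnvelope ADSR[2];  
maxiFilter VCF[2];  
maxiDelayline delay;  
double adsrEnv[16];//8 value/time pairs, filled in setup()  
double ADSRout[2], LFO1out[2], VCO1out[2], VCO2out[2], VCFout[2];  
double pitch[2], mix, delayout, delayTime, delayFeedback;  
int currentCount, lastCount, voice;  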

delayTime & delayFeedback are tweaked through my UIKit-based GUI, which works fine.

delayTime's unit is samples, afaik.
Also afaik, the hard-coded built-in buffer of that tap delay is 88200 samples, i.e. 2 s at 44100 Hz.
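
So if I have the units right, a GUI value in milliseconds needs converting before it reaches dl(). A minimal sketch of what I mean (msToSamples and guiDelayMs are just my names; the clamp keeps the size inside the 88200-sample buffer):

int msToSamples(float ms, float sampleRate = 44100.0f){  
    int samples = (int)(ms * 0.001f * sampleRate);//e.g. 500 ms -> 22050 samples  
    if (samples < 1) samples = 1;//dl() reads back 'size' samples, so keep it at least 1  
    if (samples > 88200) samples = 88200;//2 s at 44100 Hz, the hard-coded buffer length  
    return samples;  
}  
  
//then whenever the slider moves:  
//delayTime = msToSamples(guiDelayMs);  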

Has anyone used it and been happy with it?