Regarding ffserver delays in av_read_frame

We are streaming a render app through ffserver and ffmpeg over WebSockets. There is considerable latency of about 300-400 ms (from the app to the browser), and it appears to come from ffserver's frame-handling functions (av_read_frame and the internal read_frame_internal), which add a delay of one frame.
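For reference, here is a minimal sketch of the kind of low-latency demuxer setup we are considering on the input side. It assumes the extra frame of delay comes from libavformat's internal packet buffering; AVFMT_FLAG_NOBUFFER and the probesize/analyzeduration options may not exist or may behave differently on older builds, so please treat it as an illustration rather than a known fix:

    /* Sketch: open an input with demuxer buffering kept to a minimum.
     * Deliberately skips avformat_find_stream_info(), which itself
     * buffers packets and adds startup delay. */
    #include <stdio.h>
    #include <libavformat/avformat.h>

    int main(int argc, char **argv)
    {
        AVFormatContext *fmt = avformat_alloc_context();
        AVDictionary *opts = NULL;
        AVPacket pkt;

        if (argc < 2) {
            fprintf(stderr, "usage: %s <url>\n", argv[0]);
            return 1;
        }

        av_register_all();
        avformat_network_init();

        fmt->flags |= AVFMT_FLAG_NOBUFFER;            /* do not buffer frames when possible */
        av_dict_set(&opts, "probesize", "32", 0);     /* shrink the initial format probe */
        av_dict_set(&opts, "analyzeduration", "0", 0);

        if (avformat_open_input(&fmt, argv[1], NULL, &opts) < 0) {
            fprintf(stderr, "cannot open %s\n", argv[1]);
            return 1;
        }
        while (av_read_frame(fmt, &pkt) >= 0) {
            /* forward pkt downstream with as little queuing as possible */
            av_free_packet(&pkt);  /* av_packet_unref() on newer builds */
        }
        avformat_close_input(&fmt);
        return 0;
    }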

Additional delay is introduced by the WebM container's variable cluster size, which we need to optimize. I would really appreciate it if anyone has pointers on how to reduce the cluster sizes and tune ffserver's frame handling.
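On the muxer side, a sketch of what we have in mind for capping cluster size and duration, using the matroska/webm muxer's private options (cluster_size_limit in bytes, cluster_time_limit in milliseconds; whether these are honored may depend on the FFmpeg version, and the 64 KiB / 100 ms values are just guesses on our part). `oc` is assumed to be an AVFormatContext already configured for webm output with its streams added:

    #include <libavformat/avformat.h>

    /* Pass cluster limits to the matroska/webm muxer so clusters are
     * flushed early instead of growing until the muxer's defaults. */
    static int write_low_latency_header(AVFormatContext *oc)
    {
        AVDictionary *opts = NULL;
        int ret;

        av_dict_set(&opts, "cluster_size_limit", "65536", 0); /* new cluster after ~64 KiB */
        av_dict_set(&opts, "cluster_time_limit", "100", 0);   /* ...or after 100 ms */

        ret = avformat_write_header(oc, &opts);
        av_dict_free(&opts);
        return ret;
    }

(The same options should be reachable from the ffmpeg command line as -cluster_size_limit and -cluster_time_limit on the output, if anyone has experience tuning them there.)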

Thanks,
Tilak.