MIDI Time Code (MTC) + MIDI over network

hi all

as I’ve recently been posting about synchronizing N computers that split a video image … http://forum.openframeworks.cc/t/frame-sync-between-pcs/1498/0

I’ve found that some “pro” hardware made to synchronize video uses MIDI Time Code to control the “time” of a given video file …

so I was thinking …

  • has anybody tried to generate MIDI Time Code or MIDI Clock from an OF app?
  • would it be possible to transport MIDI messages over a LAN instead of over MIDI cabling?

thanks.

Hey, if you google ‘midi time code c++’ or ‘midi time code openframeworks’ the first result you get is
http://memo.tv/midi-time-code-to-smpte–…-frameworks
:slight_smile:

This is for decoding MIDI Time Code (the SMPTE info encoded into MIDI bits in a non-human-readable way) into SMPTE (human-readable hours, minutes, seconds, frames etc.). Once you have the SMPTE you can seek your video to any frame you want. That’s exactly what I was doing: my openFrameworks-based video player was slave to an external hardware timecode generator. The SMPTE time code was encoded onto a MIDI signal, sent over a MIDI network, and decoded at the other end by my OF app into SMPTE, to control my app and sync it to the other devices slaved to the same signal.
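Memo’s linked code does this already; just to make the bit layout concrete, here is a minimal standalone sketch of the same quarter-frame reassembly (struct and function names are mine, not from Memo’s code):

```cpp
#include <cassert>
#include <cstdint>

// Minimal MTC quarter-frame decoder sketch. Each 0xF1 MIDI message carries
// one data byte: the high nibble says which of the 8 pieces this is, the
// low nibble is the 4-bit payload.
struct Smpte {
    int hours = 0, minutes = 0, seconds = 0, frames = 0;
    int rate = 0; // 0 = 24 fps, 1 = 25 fps, 2 = 29.97 fps (drop), 3 = 30 fps
};

// Feed the data byte of each quarter-frame message; returns true when the
// eighth piece arrives and `out` holds a complete SMPTE position.
bool feedQuarterFrame(uint8_t data, Smpte& out) {
    static uint8_t nibbles[8] = {0};
    int piece = (data >> 4) & 0x07;
    nibbles[piece] = data & 0x0F;
    if (piece != 7) return false;
    out.frames  = nibbles[0] | ((nibbles[1] & 0x01) << 4);
    out.seconds = nibbles[2] | ((nibbles[3] & 0x03) << 4);
    out.minutes = nibbles[4] | ((nibbles[5] & 0x03) << 4);
    out.hours   = nibbles[6] | ((nibbles[7] & 0x01) << 4);
    out.rate    = (nibbles[7] >> 1) & 0x03;
    return true;
}
```

Note that a real stream spreads the 8 pieces over two video frames, so the assembled time is already slightly behind by the time it completes; a real slave compensates for that.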

Yes, you can send MIDI over ethernet, and generally that’s the norm. OS X supports this natively; you just set it up in Audio MIDI Setup. Dunno how Windows handles it. If you are using MIDI-over-ethernet just to send MIDI between Macs, you don’t need any special hardware, just ethernet cables. Or better still, connect a whole bunch of Macs to a router, create a session from Audio MIDI Setup, join the computers together, and that’s it. No messing with IP addresses or anything; they will all be connected via MIDI.

P.S. MIDI is so crap and old, but it’s simple to use and usually just works (for what it can do), so it’s still pretty much the industry standard.

hola memo !

yes indeed I saw your SMPTE code … lovely human-readable time code!
What I would like to avoid is needing an external MIDI clock signal … I suppose I can do that with Live or any audio software … but what about generating it with OF?

* did you generate the MTC yourself inside OF, or did you use an external (hardware or software) source?
* was MTC precise enough to frame-sync several computers? Other options I’ve been looking at are NTP (Network Time Protocol) and PTP (Precision Time Protocol) … but I still haven’t found a way to generate those. My first tries told me that if I can synchronize clocks over a LAN with good precision, then video frame sync becomes really easy!
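On the NTP idea: the core offset estimate only needs four timestamps, so it is easy to prototype over plain UDP. A minimal sketch of the arithmetic (names are mine):

```cpp
#include <cassert>

// Classic NTP-style offset/delay estimate from four timestamps (all in ms,
// each read from the sender's local clock):
//   t0 = client send, t1 = server receive, t2 = server send, t3 = client receive
struct ClockEstimate {
    double offset;    // server clock minus client clock
    double roundTrip; // total network delay, excluding server processing time
};

ClockEstimate ntpEstimate(double t0, double t1, double t2, double t3) {
    return { ((t1 - t0) + (t2 - t3)) / 2.0,
             (t3 - t0) - (t2 - t1) };
}
```

The estimate is only exact when the network delay is symmetric; averaging many exchanges and trusting the ones with the lowest round trip tightens it, which is roughly what real NTP clients do.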

many thanks …

e*

P.S. :: all this “trial & error” is to “try” (again) to make a multiprojection video player with N computers … all slaved to MTC …

Hi eloi !

Have you found any solution for creating an MTC generator? I would love to create an MTC signal in OF.
The only MTC generator I found for Mac is here:
http://www.audiofile-engineering.com/backline/

With this piece of software I was able to synchronize 2 slaved Ableton Live instances on 2 MacBooks.
Perhaps this will help you.

Have a nice day.

christian

@memo. Thank you very much for your code. But unfortunately I don’t know how to receive my generated MTC signals (from the Backline application) in OF.
(I’m a real newbie at coding)
Anyway, thank you very much !!

Hi Chris !

Indeed, I haven’t found any solution to this yet.
But with your idea of the MTC generator, I guess we should be able to read the MTC coming from it.
If I’m not wrong, MIDI Clock comes in through the MidiIn port as an F8 message, so it should be easy to detect … ?
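One detail worth separating here: 0xF8 is MIDI Clock (a tempo-based tick, 24 per quarter note), while MTC quarter-frame messages arrive with status 0xF1 followed by one data byte. A tiny sketch of telling them apart in a MIDI-in callback (names are mine):

```cpp
#include <cassert>
#include <cstdint>

// Classify an incoming MIDI status byte into the two sync mechanisms
// discussed in this thread. 0xF8 = MIDI Clock (tempo tick);
// 0xF1 = MTC quarter frame (carries 1/8 of an SMPTE timestamp).
enum MidiSyncKind { MIDI_CLOCK, MTC_QUARTER_FRAME, OTHER };

MidiSyncKind classifyStatus(uint8_t status) {
    switch (status) {
        case 0xF8: return MIDI_CLOCK;
        case 0xF1: return MTC_QUARTER_FRAME;
        default:   return OTHER;
    }
}
```

For frame-accurate video sync it is the 0xF1 stream you want, since MIDI Clock only encodes tempo, not an absolute position.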

How did you manage to synchronize 2 Ableton Lives with this? Could you explain the setup a bit? Looks interesting …

thnkx

e*

Hi OF’s !

Well, I’m back to this research … what I’m trying to get is multiple video players on multiple computers playing in “perfect” frame sync. The sync signal will be time-based MIDI: SMPTE carried as a MIDI Time Code signal.

I’ve tried different approaches and I would like to share my results and my doubts.

First I downloaded the Numerology 3 music software, which can generate an MTC or SMPTE signal over a MIDI port, so this is the origin of my sync signal. It can do this in its demo release, so it’s enough for now.

I’ve made a small app which reads MIDI messages and, using Memo’s invaluable MIDI Time Code decoder code, I’m able to translate this to the SMPTE frame format … so hours:minutes:seconds:frames.
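For the generating side (the original question in this thread), the quarter-frame packing is simple enough to sketch by hand. A hedged sketch, function name is mine; each byte below would go out on the wire as `0xF1 <byte>`, with the 8 messages spread evenly over two video frames:

```cpp
#include <cassert>
#include <cstdint>

// Pack an SMPTE position into the 8 MTC quarter-frame data bytes.
// rate: 0 = 24 fps, 1 = 25 fps, 2 = 29.97 fps (drop), 3 = 30 fps.
void encodeQuarterFrames(int h, int m, int s, int f, int rate, uint8_t out[8]) {
    out[0] = 0x00 | (f & 0x0F);           // frame, low nibble
    out[1] = 0x10 | ((f >> 4) & 0x01);    // frame, high bit
    out[2] = 0x20 | (s & 0x0F);           // seconds, low nibble
    out[3] = 0x30 | ((s >> 4) & 0x03);    // seconds, high bits
    out[4] = 0x40 | (m & 0x0F);           // minutes, low nibble
    out[5] = 0x50 | ((m >> 4) & 0x03);    // minutes, high bits
    out[6] = 0x60 | (h & 0x0F);           // hours, low nibble
    out[7] = 0x70 | ((h >> 4) & 0x01)     // hours, high bit …
                  | ((rate & 0x03) << 1); // … plus the frame-rate code
}
```

This is the inverse of the decoder, so it is easy to round-trip test; the part a real generator adds is scheduling: quarter frames must be sent at 4× the frame rate, steadily, or slaves will see jitter.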

Here are my doubts and the things I’ve tried:

  • first approach: the video is paused and I just move the playhead (videoPlayer.setFrame()) according to what comes in over the MIDI Time Code. This works, the video plays fine, but playback is not “smooth” at all; compared to normal playback I get small (tiny) jumps …

  • another approach: the same setup but with the video in normal “play”, so the MTC sync message just corrects the frame in case it drifts from the sync master … but again, playback looks a tiny bit jumpy … not as smooth as a simple “play” with no frame control …

So I guess that driving playback with videoPlayer.setFrame() has some problem; it’s not smooth enough …
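One thing that might be worth trying (a sketch under my own assumptions, not a tested recipe): only hard-seek with setFrame() when the drift is large, and for small drift nudge the playback speed with setSpeed() so the player glides back into sync without visible jumps. The threshold and gain here are guesses to tune:

```cpp
#include <cassert>
#include <cmath>

// Decide how to correct drift between the frame the master timecode says
// we should be on and the frame we are actually showing.
struct SyncAction {
    bool  seek;  // true  -> videoPlayer.setFrame(targetFrame)
    float speed; // false -> videoPlayer.setSpeed(speed)
};

SyncAction correctDrift(int targetFrame, int currentFrame,
                        int seekThreshold = 10) {
    int drift = targetFrame - currentFrame; // positive = we are behind
    if (std::abs(drift) > seekThreshold)
        return { true, 1.0f };              // way off: accept one hard seek
    // within the threshold: speed up or slow down by 1% per frame of drift
    return { false, 1.0f + 0.01f * static_cast<float>(drift) };
}
```

The idea is just a proportional controller on playback speed; whether it ends up smooth depends on how finely the video player honors fractional speeds.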

Has anybody tried something similar? Frame sync via an MTC signal? What would be the best way to achieve results as smooth as a normal videoPlayer?

Memo, did you use your MTC stuff to play videos in sync? Can you point me in some direction to keep trying for smooth frame sync?

Thanks a lot for your ideas …

e*