Since the inception of MIDI, personal computers have become vastly more complex. Where once software MIDI sequencers could communicate directly with the hardware for precise control of interrupts, I/O and timing, there now exist multiple levels of abstraction between the application software, the operating system and the hardware.
Whilst this has made modern computers more robust overall, has it reduced their reliability and accuracy for timing-sensitive applications like MIDI?
As a case in point, I use a MOTU MIDI Express XT for routing MIDI around my project studio. In standalone hardware mode, where MIDI data is transmitted and received via 5-pin DIN connectors only, the round-trip latency to and from a hardware MIDI sequencer is a respectable 2ms with very low jitter. By comparison, disabling the hardware routing features of the interface and routing MIDI in and out of Cubase running on a modern Mac over USB increases this figure to just over 9ms - a nearly five-fold increase. Combined with the latency of D/A converters and so forth, this has a significant effect on overall latency.
My experience with using MIDI over USB as an alternative has generally yielded an improvement in overall latency, but with an increase in associated jitter.
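For anyone wanting to quantify the latency/jitter trade-off for themselves, the arithmetic is simple once you have a series of round-trip timestamps. A minimal sketch in Python - the sample figures below are made up for illustration, not my actual measurements:

```python
import statistics

def latency_stats(round_trips_ms):
    """Summarise a series of round-trip latency measurements (in ms).

    Returns the mean round-trip latency plus two views of jitter:
    the standard deviation and the peak-to-peak spread.
    """
    mean = statistics.mean(round_trips_ms)
    jitter_sd = statistics.stdev(round_trips_ms)
    spread = max(round_trips_ms) - min(round_trips_ms)
    return mean, jitter_sd, spread

# Hypothetical figures: a tight hardware DIN loop vs. a USB/DAW loop.
hardware = [2.0, 2.1, 1.9, 2.0, 2.1]
usb_daw = [9.2, 8.4, 10.1, 9.0, 9.8]

for label, data in (("hardware", hardware), ("USB/DAW", usb_daw)):
    mean, sd, spread = latency_stats(data)
    print(f"{label}: mean {mean:.1f} ms, "
          f"jitter (SD) {sd:.2f} ms, spread {spread:.1f} ms")
```

The point of reporting both figures is that a path can have low average latency but high jitter (my experience of MIDI over USB), or higher latency with very consistent timing.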
Manufacturers and software developers have introduced many clever techniques to combat jitter and latency, such as MIDI Time Stamping, but the implementations of such technologies are often opaque, incompatible between vendors and don't solve the problem for realtime MIDI communication.
So my question is: have we in fact gone backwards in terms of support for MIDI on modern computers? I am seeing a resurgence of interest in hardware sequencers (such as the Yamaha QY700) and even old computers (such as the Atari ST) for MIDI sequencing, and I myself now sequence primarily on hardware due to ongoing issues with MIDI on both Mac and Windows.
Would be keen to hear people's thoughts.