
Testing MIDI Latency. Thanks for your help.


John Vere


Update March 1 2023: I just uploaded the video that is the topic I started here. I'm posting the link here at the top for future reference by people who found this thread by searching for information regarding MIDI latency. Thanks to all who supplied valuable information.

 

 

I'm just in the process of researching a video about MIDI latency.

Most of the information on the internet is very outdated, and it is rare to find any mention of actual MIDI latency facts. All the videos and articles just jump right into explaining what we already know: if you hear MIDI lag, it is because of your audio system.

But there is actual latency in the MIDI system too. The best info I could find says it will be around 1 ms per device if you daisy-chain, or, if you try to feed 16 channels using MIDI 1.0, each channel will be delayed by about 1 ms and channel 16 will be delayed by roughly 16 ms, because MIDI 1.0 is serial. They also mentioned USB jitter. Notice I said MIDI 1.0; with MIDI 2.0 all of this will change. I actually downloaded and read most of the new 2.0 specification PDF. It will be a new age for MIDI users: you'll be able to feed 16? channels through USB C and they will all arrive in less than 1 ms, and so on. But most of us are still using MIDI 1.0 gear, and a lot of it is still important to us.
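To put a rough number on that serial bottleneck, here is a back-of-the-envelope sketch (my own, not something from the spec quoted above), using the published MIDI 1.0 line rate of 31,250 baud, 10 bits per byte on the wire, and a 3-byte Note On message:

```
# Back-of-the-envelope MIDI 1.0 wire timing: 31,250 baud, 10 bits per byte
# (start + 8 data + stop), 3 bytes for a Note On message.
BAUD = 31_250
BITS_PER_BYTE = 10
NOTE_ON_BYTES = 3

byte_ms = BITS_PER_BYTE / BAUD * 1000          # ~0.32 ms per byte
note_ms = NOTE_ON_BYTES * byte_ms              # ~0.96 ms per Note On

print(f"one Note On: {note_ms:.2f} ms")
for n in (4, 10, 16):                          # chords / many channels at once
    # on a serial link the n-th note can only finish after the first n-1 notes
    print(f"{n:>2} simultaneous notes: last one lands {n * note_ms:.1f} ms late")
```

With those numbers, 16 notes sent back-to-back put the last one roughly 15 ms behind the first, which matches the "channel 16 delayed by about 16 ms" figure mentioned above.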

So after hours and hours of reading and watching I still only had vague answers about actual MIDI latency itself. I thought I would try testing to see what really happens when you hit a drum pad.

I'd really appreciate anyone taking the time to read this and shed any light on my findings and what I might be missing or doing wrong.  

The test and the results: 

This is a test of the difference between the 2 mikes. It should be 6.1 ms but it clearly is not. Notice they are 180° out of phase. You could easily read the 1.8 ms as the 1.9 ms reported input latency, but this is mystery #1. Note: to record this I turned on Input Echo for track 1 and turned off direct monitoring, so in theory this should represent the signal passing through track 1 and back out again.

[Screenshot 1: the difference between the two mics]

This is the screenshot of the test tracks. I put markers at each event.

[Screenshot: the test tracks, with markers at each event]

 

[Image: MIDI test results summary]

Track 1 - microphone 1" from the drum pad. This signal will be delayed by 1.9 ms and then Cakewalk will adjust for the latency and place it 1.9 ms earlier on the timeline (assuming this is actually what happens). Therefore the placement of the drum hit is presumed to be very accurate; it is where it should be in time.

Track 2 - the MIDI event is sent via USB and is recorded 3.5 ms later than Track 1. This would seem to be about what you would expect from a USB MIDI 1.0 system. This is assumed to be the correct measurement of the MIDI latency of my Yamaha DTX drum system. Note that results were different when testing with a Roland keyboard controller; that measurement was 5.1 ms. I will assume that MIDI events are not placed earlier on the timeline, as they are not involved with the audio system: they are recorded when they arrive.

Track 3 - the audio output of the drum module is recorded 10.4 ms later than Track 1. It is also assumed this audio track will be placed 1.9 ms earlier on the timeline to adjust for the audio input latency. It is not clear why this is delayed as much as it is. This is what you will be hearing in the headphones even without connecting the module to a computer? Unless? Strange, it doesn't "sound" like there is latency as you play. But?

Track 4 - a mike placed 1" from the playback monitor is recorded 15.4 ms later than Track 1. This will also be adjusted and placed 1.9 ms earlier on the timeline, but we will have to factor in the output latency, which is reported as 4.2 ms. This is where I'm sort of puzzled; things don't compute.

It took 11.9 ms from the MIDI event for the sound to trigger the VST, get to the speaker, return, and be recorded again. What we do know is that it should have taken 4.2 ms for the sound of the VST instrument to reach the speakers. But if we subtract that from 11.9 ms we still have 7.7 ms unaccounted for. Possibly there is a lag from the VST? More testing: I froze SI Drums and yes, there are almost 200 samples between the MIDI event and the frozen audio. 200 samples is about 4 ms, and 7.7 minus 4 leaves 3.7 ms still unaccounted for. (Note: Addictive Drums only lagged 50 samples, about 1 ms.)
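For anyone following the arithmetic, here is the same latency budget gathered in one small sketch; the 48 kHz sample rate is my assumption, inferred from the "200 samples ≈ 4 ms" figure above:

```
# The latency budget from the measurements above, gathered in one place.
# Assumes a 48 kHz project rate (consistent with "200 samples ~= 4 ms").
RATE = 48_000

def samples_to_ms(samples: int) -> float:
    return samples / RATE * 1000.0

midi_event_ms = 3.5          # Track 2: MIDI note arrives
speaker_mic_ms = 15.4        # Track 4: mic on the playback monitor
reported_output_ms = 4.2     # driver-reported output latency

midi_to_speaker = speaker_mic_ms - midi_event_ms       # 11.9 ms total
unaccounted = midi_to_speaker - reported_output_ms      # 7.7 ms
si_drums_lag = samples_to_ms(200)                       # ~4.2 ms
addictive_lag = samples_to_ms(50)                        # ~1.0 ms

print(f"MIDI event -> speaker mic : {midi_to_speaker:.1f} ms")
print(f"minus reported output     : {unaccounted:.1f} ms unaccounted")
print(f"SI Drums internal lag     : {si_drums_lag:.1f} ms")
print(f"still missing             : {unaccounted - si_drums_lag:.1f} ms")
print(f"Addictive Drums lag       : {addictive_lag:.1f} ms")
```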

Thank you for taking the time to read all this and I apologize to those who are now sitting there going WTFIHTA.

Edited by John Vere
Insert link for topic

Friendly suggestion--wait until everyone has upgraded their systems (including all gear) to MIDI 2.0. As you know, MIDI is serial communication. I haven't read the 2.0 docs in a couple of years, but as I see it, when all software and gear uses all the extra data in the expanded formats, there will be a period of what Thomas Kuhn called a paradigm shift, which will result in chaos until the "normal science" of MIDI communication settles into something a tutorial can safely talk about with any accuracy, reliability, and practical value (usefulness).

PS: As a hobbyist, several years ago I gave some thought to how I might use a microprocessor to create a MIDI data converter to actually do something with the extra data.  But I decided that manufacturing teams were also probably working on it and they could devote more resources to it than I could.  

From what I've seen of the few videos over the past few years that document MIDI latency and what some call jitter, once MIDI 1.0-to-MIDI 2.0 converters, routers, handlers, or whatever they will be called are widely marketed, all the current video documentation will be out of date.

JMO: Good to do the research to keep track of what people are saying, but postpone your judgments and take a very probative stance in any videos. Maybe a series on "What we know now" that takes snapshots of current ideas, where subsequent videos do a recap ("The last time we discussed _____________, _____________________." + an update to add new information, changes, etc.). Trying to be helpful so you don't have to keep on revising videos. More of a historical approach than a how-to-use-it approach.

Edited by User 905133
To add a PS: fixed typo; added a few extra words as clarifications

Thanks. Actually the video starts with "MIDI 2.0 will change all this, but most of us will be using MIDI 1.0 for a while yet because we own a lot of MIDI 1.0 gear which is still, and might always be, important to us."

I didn’t throw out my tube amps when solid state came out either. There’s a lot of us who keep old gear forever and as time goes by it goes from being outdated to being retro or cool. 


And most of what I’m trying to figure out won’t change because it’s more about understanding how the audio system relates to monitoring a VST as we perform.  Especially digital drums where every ms counts. 

Edited by John Vere

"... wait until everyone has upgraded their systems (including all gear) to MIDI 2.0 ..." (User 905133)

This thread reminded me we had one on the old forum (midi "Jitter" it Does Exist). It was started in 2007 and it frequently referred to the imminent MIDI 2.0. The final post on that thread was in 2015. It was full of wild conjecture and a lot of misinformation, but there were a few examples of good experiments.

I just finished reading the whole thing, and I don't recommend it (it's 18 pages!). But it did point out that there were variations from one VST to another and even from one note to another for at least two VSTs (one of them seemingly purposeful). And also variations from one MIDI interface to another, and external synths made things even wilder. So experiments need to develop anchors that eliminate unpredictable latencies and jitters.

In John Vere's post, there are indeed mysteries in the latency timings. But there are quite a few places where latency can spring up (including the drum brain converting to MIDI, including velocity, and also generating the associated audio). The 180 degrees out of phase is not especially mysterious, since the mic wirings or the speaker's amp could be responsible for that. (And it isn't strictly phase, which would relate to some timing issues -- it's a signal polarity reversal.)

Also, the little bump on the "monitor mike" at 294 samples, where did that come from? Did that mic hear the drum pad from a short distance away?

It can be worth it to try a VST (not TTS 1), maybe a drum VST in parallel with the drum machine.

Edited by bvideo

12 hours ago, bvideo said:

Also, the little bump on the "monitor mike" at 294 samples, where did that come from? Did that mic hear the drum pad from a short distance away?

It can be worth it to try a VST (not TTS 1), maybe a drum VST in parallel with the drum machine.

I didn't want my OP to be 3 pages long, so many details of what I have been doing are not mentioned. They might be part of the video, but this is where I'm at. I only want to clearly include facts that would help people troubleshoot the MIDI lag issue, including not only the obvious audio part, but also the cases where the MIDI system itself can be involved.

And thanks for posting the link to the thread. Those sorts of things seem to be the only information sources: old forum threads, both Sonar and Gearspace. That's when I started snooping around the info from the MIDI Association.

And I am quickly determining that, as a whole, every MIDI setup will see slightly different results, so my demonstration in the video of how to test might be of use to others who wish to test their own systems. Hence the need to double-check that my information is correct.

The hump - One issue is to actually determine ground zero of the stick hitting the pad or keys. I eventually figured out that that hump is the swish of the drum stick passing in front of the mike, which is 1" from the pad. I had the same weirdness when hitting a keyboard note: I had to learn to rest my finger on the key and then press, otherwise the sound of the key moving gets recorded before it hits bottom. The world is a different place down there in sample land. I was very careful about leakage, with the mikes as close as they are to the pad and speaker, plus input echo off, plus direct monitoring off, etc. The only other thing that gets printed is the metronome, turned way down, so it's pretty obvious.
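For what it's worth, here is the kind of simple threshold search one could use to pin down "ground zero" in sample land. It is only a sketch: the WAV file name and the threshold factor are placeholders, and it deliberately looks for the first sample that jumps well above the quiet swish level rather than the first non-zero sample.

```
# A rough way to find "ground zero" of a hit: the first sample whose absolute
# level jumps well above the low-level noise/swish floor. File name, the
# 100 ms "quiet" window, and the threshold factor are placeholders to adapt.
import numpy as np
from scipy.io import wavfile

rate, audio = wavfile.read("pad_mic.wav")        # mono track, mic 1" from pad
audio = audio.astype(np.float64)
audio /= np.max(np.abs(audio))                   # normalize to +/- 1.0

noise_floor = np.max(np.abs(audio[: rate // 10]))  # assume first 100 ms is room + swish
threshold = max(10 * noise_floor, 0.1)             # well above the swish level

onset = int(np.argmax(np.abs(audio) > threshold))  # first sample over threshold
print(f"onset at sample {onset} = {onset / rate * 1000:.2f} ms")
```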

The Phase reversal-  I noticed the phase reversal when I was looking more at the transient peak of the wave to determine the timing differences. I immediately jumped out of my chair and turned my monitors around to check the wires. All is good there. Now this becomes another time suck while I try and figure that out. 

 


1 hour ago, Craig Anderton said:

I also think there would be a major difference between feeding MIDI data in via USB, and via a DIN connector+MIDI interface.

Thanks for that, Craig, this is a super important part of the quest. I tested that as best I could using my Roland A49, which has both USB and MIDI connections. USB won the race by a whopping 1 ms in this case. Pretty rare to have both options.

This was one of the questions I came up short on facts for when searching online. Once again this is impossible to determine, as there could be many factors involved.

Example - I will assume my MOTU audio interface uses a good quality MIDI driver, and I used a 3' long good quality Belden MIDI cable. But if I had used a cheap interface with generic MIDI drivers and a cheapo 10' MIDI cable, the results might have been very different. I wish I could prove this theory, but I don't own a cheapo MIDI interface; I even thought about ordering one off Amazon. I do own an Akai controller that uses generic MIDI drivers, and it was about double the latency on the mike test. But that's not enough data to make blanket statements, and there could be cases of Microsoft's generic driver outperforming a Roland or Yamaha driver. Little if any reliable info is available to me so far.

Edited by John Vere

The drivers for USB or DIN (usually connected by USB, only much older equipment was connected by serial or printer port) can be viewed by going through the device manager and finding the item associated with that device and inspecting the driver stack. Most likely, they all use the Windows "class compliant" drivers, but it's worth checking. Variations are more likely within the h/w dongle or, in your case, the synth, which, by the way still has to pass the DIN output through another interface (which is also USB connected?)

Audio drivers, particularly ASIO, are the ones that suffer from some manufacturers' efforts.


I am a big fan of echo/delay--sometimes synchronized, but often fractionally random. So in terms of musical preference, random MIDI latencies of the minuscule variety (jitter) are often OK.

On the other hand, a few days ago, I tried to synchronize a software sequencer in a particular soft synth with a BPM hardware sequencer.  I have had no problems synching hardware across a span of 15 feet via MIDI.  I would go crazy with all sorts of pitch wheel, pitch bend, and CC data, even testing the setup by wildly changing from 0 BPM to 300 BPM from the hardware sequencer to all the gear.  Those modules never lost any clock bytes!!!!!

However, with the BPM sequencer and the PC both within arms distance, the software got totally messed up with some very basic BPM changes--like from 120 to 78 over the span of several seconds.

Since the hardware could handle extreme changes in timing but the software couldn't, I don't think it was a problem with MIDI but with how MIDI was implemented.

On 2/25/2023 at 11:48 PM, bvideo said:

. . . there were variations from one VST to another and even from one note to another for at least two VSTs (one of them seemingly purposeful). And also variations from one MIDI interface to another . . . .

Yup!!!!

On 2/25/2023 at 11:48 PM, bvideo said:

So experiments need to develop anchors that eliminate unpredictable latencies and jitters.

I have spent over 2 years trying to see if I can make a move to software synths. After the attempt to sync my BPM sequencer to a PC-based soft synth less than a week ago, I am now wondering if even basic MIDI synchronization can be done, much less eliminating minuscule latencies. Maybe time-stamping all MIDI data in MIDI 2.0 will work with PC-based music-making tools.

Great discussion!!  Thanks for raising the subject.

 

Edited by User 905133
removed "it" between "handle" and "extreme"

2 hours ago, User 905133 said:

Maybe time-stamping all MIDI data in MIDI 2.0 will work with PC-based music-making tools.

From what I gathered reading up on MIDI 2.0, it's not just the new specification: tied in with USB C technology, we will not suffer data log jams, jitter, or any of that. Or at least that is one of the goals. It seems these issues alone were the reason the MIDI Association has been working on the long overdue update. Using MIDI 1.0 is like dial-up internet.

The thing is, on a simple setup with a USB MIDI keyboard controller and a laptop, I doubt those people care or would even notice MIDI latency itself, as it will be under 6 ms or something like that (not talking audio latency, different topic). But setups like yours are where every ms and every lost data packet is an issue.

USB 1.0 replaced the serial ports on our old PCs, and we have slowly updated that system since. We are now at USB C, which was developed because of the demand on USB systems for faster data as well as more power to charge stuff. I have a pair of heated socks that have USB micro B connections to charge them. If I had spent more money I could have got the ones with USB C.

My MOTU interface has USB C. Here's another one to ponder: is MIDI 1.0 stuck on USB 1.0 specs? Guess what, I tested that too: whether using a different USB port made any difference. My computer is 12 years old so it only had USB 1.0; I had to add a PCIe card with USB 3.0 when I bought the MOTU M4. I've always had my controller plugged into the same old port, so I swapped it to the PCIe card. It seemed to shave 1 ms off. I'm going to look at USB C PCIe cards out of curiosity. Remember my MOTU is using that very same bus for its MIDI and audio.

 

2 hours ago, bvideo said:

Most likely, they all use the Windows "class compliant" drivers, but it's worth checking.

Yes, I think this is also important info for people who are using USB MIDI devices. I mention it every chance I get in my tutorials. In my 2 videos about MIDI setup this is demonstrated immediately: I show how to open Device Manager and check your driver status. I also show how to turn on Show hidden devices and make sure it's not a big mess because you've been swapping to different ports and have now exceeded the 10-device limit per port.

My Roland keyboard and my Yamaha drum kit both have their own MIDI drivers. I also have an Akai controller that uses a generic driver. All I can say is there is a difference in many ways. The Akai has a noticeably sluggish response, which was verified in testing. And a minor detail, but you cannot turn it on and use it if Cakewalk is already running. No problem with the Roland or Yamaha: Cakewalk just politely asks if you want to connect. But the Akai requires completely shutting down and restarting Cakewalk, or Cakewalk can't find it.

An interesting observation: open the MIDI 2.0 specification PDF and the list of contributors includes folks from Native Instruments, Roland, and Yamaha. Part of what they are doing is working on a new Microsoft MIDI 2.0 generic driver that promises to be a big improvement. Don't hold your breath, Microsoft is involved. But it somehow makes me feel safer knowing Roland and Yamaha are involved in this.


Drum/keyboard module processing time (from physical impact until MIDI is generated) can be a millisecond or so. And there is no guarantee the MIDI output (USB or the device's own DIN) is sent at the same time as, before, or after the module's sound generator gets the event. So comparing a mic on the pad with audio from the module output is just a measure of the module's impact-to-sound latency, which is not bound to MIDI latency.

Audio buffer size contributes to the "jitter" of MIDI-to-VST-output latency. If the same interface is used for MIDI and audio, I have a feeling the jitter is smaller, since the interface knows the time relation between both inputs.

"Real hardware MIDI 1" transfer speed (throughput) is ~1ms per note. For drums that is less significant then for keyboard (we have more then one finger per hand). USB quantization contribute as most into transfer latency (there is almost no difference to transfer 1 or 100 notes, the delay till the "packet" is sent dominates). In that respect USB 1 is way worse then 2 (throughput of USB 1 is sufficient).    So in practice hardware MIDI connection + audio interface with MIDI may have lower latency then device own USB connection. For interfaces there is MIDI loopback test utility (like RTL for audio), I remember with RME I had something around 2-3ms while cheap "MIDI to USB adapter" had more then 10ms. My Kawai connected throw RME with MIDI had lower latency then own USB ( I don't remember the results of the test with my Roland drums).

For me, MIDI latency only starts to be a bother when it goes "crazy". That has happened several times: for some reason some (not all!) MIDI devices start to get latency over 30-40 ms. It is not DAW / audio interface / audio buffer dependent, and it disappears with a Windows restart... I still have no idea where that comes from.

Note that most MIDI devices normally "imitate" instruments with "natural" acoustic latency (unlike, e.g., singing, guitar, flute, etc.; I mean something with a rather short or even fixed distance from the "sound generator" to our "sound sensors"). Just using headphones compensates for 3-5 ms of latency.
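As a quick sanity check on that last point: sound travels roughly 343 m/s at room temperature, so every metre between the speaker and your ears adds about 3 ms. A small sketch of the numbers (my own arithmetic, not from the post above):

```
# Acoustic delay vs listening distance, speed of sound ~343 m/s at room temp.
SPEED_OF_SOUND = 343.0  # m/s

def acoustic_delay_ms(distance_m: float) -> float:
    return distance_m / SPEED_OF_SOUND * 1000.0

for d in (0.3, 1.0, 1.5, 2.0):
    print(f"{d:>4} m  ->  {acoustic_delay_ms(d):.1f} ms")
# 0.3 m -> 0.9 ms, 1.0 m -> 2.9 ms, 1.5 m -> 4.4 ms, 2.0 m -> 5.8 ms
```

So moving from monitors a metre or two away to headphones really does hide a few milliseconds of system latency.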


Update on the reverse phase issue. I would appreciate it if anyone else could test this. I determined it only happens when I loop through Cakewalk with input echo on.
The test is simple:

2 mikes. One to record a transient sound; I hit my desk with a drum stick. The other mike goes on your monitor speaker.
You have to set your monitoring so the mike on the speaker is not being sent there (obviously).

The 2 mikes feed two tracks in Cakewalk and you see if they are out of phase. 
Thanks. 
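For anyone who would rather check the two recorded tracks numerically than by eye, here is one possible approach using cross-correlation. It is only a sketch: it assumes the two mic tracks were exported as mono WAV files at the same sample rate, and the file names are placeholders.

```
# Estimate both the lag and the polarity between the two recorded mic tracks.
# Full cross-correlation is slow for long files, but fine for a short clip
# containing a single transient.
import numpy as np
from scipy.io import wavfile

rate1, desk_mic = wavfile.read("desk_mic.wav")        # mic 1" from the desk/pad
rate2, speaker_mic = wavfile.read("speaker_mic.wav")  # mic 1" from the monitor
assert rate1 == rate2

a = desk_mic.astype(np.float64)
b = speaker_mic.astype(np.float64)
corr = np.correlate(b, a, mode="full")                # slide b against a
best = int(np.argmax(np.abs(corr)))
lag_samples = best - (len(a) - 1)                     # positive = b lags a

print(f"lag: {lag_samples} samples = {lag_samples / rate1 * 1000:.2f} ms")
if corr[best] < 0:
    print("best alignment has NEGATIVE correlation -> polarity is flipped")
else:
    print("polarity looks normal")
```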

 


On 2/26/2023 at 2:30 PM, John Vere said:

We are now at USB C, which was developed because of the demand on USB systems for faster data as well as more power to charge stuff. I have a pair of heated socks that have USB micro B connections to charge them. If I had spent more money I could have got the ones with USB C.

Just an FYI. USB C is a connector type. At this time, it could be USB 3.2 Gen 1, USB 3.2 Gen 1x2, USB 3.2 Gen 2x1 or USB 3.2 Gen 2x2. USB 3.2 Gen 1 and USB 3.2 Gen 2x1 aren't necessarily USB C, they can have USB A connectors.

https://www.pcmag.com/how-to/what-is-usb-c-an-explainer


12 hours ago, John Vere said:

Update on the reverse phase issue. I would appreciate if anyone else could test this. I determined it only happens when I loop through Cakewalk with input echo on. 
The test is simple 

2 mikes. One to record a transient sound. I hit my desk with a drum stick. The other mike on your monitor speaker.

I have not done the test, but one "theoretical" note. Let's say you have a 100 Hz sine wave. That means its "turnaround" is 10 ms. If the interface input-to-output latency is 5 ms, recording input and output simultaneously should produce a 180° phase shift. I mean, the visual shift between waveforms depends on the frequency and the interface RTL.
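A tiny numerical illustration of that point (not the actual test, just a sketch assuming a 48 kHz rate): a 100 Hz sine delayed by half its 10 ms period lines up exactly with the inverted original, so on a narrow-band signal a plain time delay can look like a polarity flip.

```
# Delay a 100 Hz sine by 5 ms (half its period) and compare with the
# inverted original: they match sample for sample.
import numpy as np

rate = 48_000                       # assumed sample rate
t = np.arange(rate) / rate          # one second of time stamps
f = 100.0                           # Hz
delay = 0.005                       # 5 ms = half of the 10 ms period

original = np.sin(2 * np.pi * f * t)
delayed = np.sin(2 * np.pi * f * (t - delay))

print(np.allclose(delayed, -original))   # True: delayed == inverted original
```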

PS. I assume you have checked that the interface's reported latency is accurate, using loop-back recording. Good interfaces report it correctly, but if an external mic pre-amp with digital output is used, it is not (and can't be) automatically accounted for in the interface RTL. Also, while RTL is easy to measure with loop-back (in a DAW or with a special utility), its division into input and output parts is way trickier to deduce.


8 hours ago, rsinger said:

Just an FYI. USB C is a connector type. At this time, it could be USB 3.2 Gen 1, USB 3.2 Gen 1x2, USB 3.2 Gen 2x1 or USB 3.2 Gen 2x2. USB 3.2 Gen 1 and USB 3.2 Gen 2x1 aren't necessarily USB C, they can have USB A connectors.

https://www.pcmag.com/how-to/what-is-usb-c-an-explainer

Thank you. There is unfortunate confusion about the terms "USB C" and (less so) "USB 3." And then of course there's "Thunderbolt." They all relate to each other, but they also get confused for one another.

I didn't pay it much notice until last December, when I bought a Dell laptop that has a Thunderbolt 3 port. At that point I started wondering what exactly I could connect to such a port. I also wondered where the docking station connector was on my new computer; turns out the answers are related.

One way to remember it with USB is numbers=protocol, letters=connector. There's no such thing as a "USB 3 connector," nor is there such a thing as the "USB C protocol."

A USB C port on a given bit of hardware might only be for USB 2 protocol connections. A USB 3 port may use a USB A connector.

Thunderbolt is yet another protocol, and it's also a term for a certain type of port that always uses USB C connectors. A Thunderbolt 3 port is, physically, USB C. Protocol-wise, it can handle DisplayPort video, charging, and USB 2 and 3 and Thunderbolt data. The way to tell whether your system has a Thunderbolt 3 port is that it will be a USB C connector with the image of a lightning bolt (of course they would use lightning to muddy the waters even further) printed or molded next to it.

So rather than using a traditional docking station, if I want to connect my laptop to an external monitor, that will be done via its Thunderbolt 3 port. My understanding is that if I had a charger with enough current capacity, I could even charge the laptop via its Thunderbolt 3 port.

There's a LOT of confusion. The statement "USB 3 uses the USB C type connector" is only true some of the time. Any current USB protocol may use any type of USB connector. My Presonus Studio 2|4 uses USB 2 and has a USB C connector.

So the only way you know what's provided by a USB C connector (aside from the fact that it will include some kind of USB) before plugging something into it is if you see the lightning bolt indicating it's a Thunderbolt 3 port.

Going forward, it's a good idea when buying USB C-to-USB C cables to get only ones that are rated to work with Thunderbolt 3. This is because Thunderbolt 3 uses "smart" connectors that help the devices on each end know who they're talking to.

On 2/26/2023 at 10:42 AM, User 905133 said:

random midi latencies of the miniscule variety (jitter)

My understanding of the term "jitter" relates to variances in clock frequency. I assume that jitter on the MIDI clock results in slight variations in delay?

On 2/25/2023 at 2:06 PM, User 905133 said:

wait until everyone has upgraded their systems (including all gear) to MIDI 2.0

I suspect that will take a very long time. Speaking strictly for myself, I'm not dumping any of my MIDI 1 gear in favor of MIDI 2 gear. I mean, not just for the sake of being able to use the new protocol.

This discussion has made me wonder what we'll see 20 years from now, once MIDI 2.0 has really taken hold and CPUs have gotten even faster at executing instructions, with a resultant reduction in latency and timing variation. Will we start to see plug-ins that imitate the current imperfections? People will claim that the variations induced by timing inaccuracies were an important part of what made late 20th and early 21st century electronic music sound like it does. "That's why sequenced music that you hear today sounds so mechanical and robotic. Back then, since the tech wasn't as advanced, there were all these little imperfections."

Just like we have "analog" buttons on compressor and EQ and delay plug-ins that induce noise, roll off highs, add subtle compression, etc. we'll have "MIDI 1.0" buttons that induce latency and timing variations. And they'll do the same kind of tests that we do, put a dozen instances of the vintage-izer on a track just to see how their effect adds up and what they're really doing....

Seems funny, but we make music in a world where pretty much every producer and mix engineer has some sort of "bitcrusher" plug-in that they can use if they want to get the cool grainy sound of the earliest samplers. Not saying that we all use them, but I bet if we looked in our plug-in collections, somewhere in there, as part of a bundle or whatever, we could find a bitcrusher. That would have seemed hilarious to me 25 years ago. But bitcrushed sounds have become part of the musical language of some genres.

Turns out that we like it when music bangs up against technological limitations. Take away those limitations and some of them we want back.


 

On 2/28/2023 at 3:04 AM, azslow3 said:

I have not done the test, but one "theoretical" note. Let's say you have a 100 Hz sine wave. That means its "turnaround" is 10 ms. If the interface input-to-output latency is 5 ms, recording input and output simultaneously should produce a 180° phase shift. I mean, the visual shift between waveforms depends on the frequency and the interface RTL.

This is the special case of "phase" corresponding with a polarity flip. It can only happen with a purely symmetrical wave and only with a 180° time shift as determined by the period of that wave's fundamental frequency. So it would be possible to construct a case where transmitting a sine wave through a time delay of exactly 1/2 the wave's period would appear as polarity reversal.

 

(edit: some pointless material removed)

Edited by bvideo

On 2/27/2023 at 3:27 AM, azslow3 said:

That has happened several times, for some reason some (not all!) MIDI devices start to get latency over 30-40ms. Not DAW/audio interface/audio buffer dependent. Disappears with Windows restart... I still have no idea from where that comes.

Thanks for your input; I understand and agree with all you said. I wasn't initially trying to measure the latency of the audio output of the drum module, but it was connected, so what the heck. I was actually surprised at its higher reading. As you say, it would have to be due to internal processing in the brain. The above quote is another example of DAW weirdness that happens from time to time. For example, as I was doing the test I made the mistake of opening a few web pages to look things up. When I returned to Cakewalk, all the test results went crazy and made no sense anymore. There were very high numbers in the latency readings for everything. I restarted the computer and it all returned to normal.

On 2/27/2023 at 5:46 PM, rsinger said:

Just an FYI. USB C is a connector type. At this time, it could be USB 3.2 Gen 1, USB 3.2 Gen 1x2, USB 3.2 Gen 2x1 or USB 3.2 Gen 2x2. USB 3.2 Gen 1 and USB 3.2 Gen 2x1 aren't necessarily USB C, they can have USB A connectors.

Thanks, I found a good source of info on good old Wikipedia: https://en.wikipedia.org/wiki/USB_hardware  My mention of this was due to reading different discussions on the MIDI Association's pages about developing MIDI 2.0 to take advantage of USB C.

On 2/28/2023 at 2:04 AM, azslow3 said:

PS. I assume you have checked the interface reported latency is accurate, using loop-back recording. Good interfaces report it correctly, but if external  mic pre-amp with digital output is used,  it is not (can't be) auto-accounted into the interface RTL. Also while RTL is easy to measure with loop-back (in a DAW or with special utility), its division into input and output parts is way trickier to deduct.

Absolutely, that was part of troubleshooting the mystery of the phase reversal. The external loopback passed with flying colors, but the MOTU's internal loopback was early by 1.5 ms. I assume this is because Cakewalk sees it as an actual external audio source and it's not being calculated by the driver properly. I think I'll contact MOTU and ask. I was wondering why, when doing screen captures, if I accessed both my mike input and the loopback mix to record into Cakewalk, there was a latency echo. 1.5 ms should not do this, but things get complicated when running 3 apps that use audio simultaneously.

On 2/28/2023 at 3:45 AM, Starship Krupa said:

My understanding of the term "jitter" relates to variances in clock frequency. I assume that jitter on the MIDI clock results in slight variations in delay?

I'm glad you mentioned this as I kept finding old articles that said part of midi latency over USB was because USB midi had more jitter than DIN midi. I guess the reference to jitter is trying to describe a screwed up data stream.   Yet another deep topic with no solid info forthcoming.  

I'm sure the answers are out there, but the topic is not THAT important to me, as I have a very good MIDI setup with both old hardware and some newer USB stuff. Some of my stuff is real old, like the Roland 505, which I can still sync to Cakewalk. I get my money's worth out of most of my older gear because it was well made back then. Go figure, the only broken device I have is the newest: the Akai SynthStation. What a POS. Laggy response, bad drivers, and now a dead key right in the middle, and they don't make spare parts like Roland, Korg, and Yamaha do. I've fixed dead keys before; it's easy with new key sensors for $12 a strip. I'm glad I got it cheap.

Anyway, I've completely revised my plan for this video and I'm just in the process of filming (screen capturing) this morning. I decided to break the topic into 2 videos. The first is focused on MIDI latency: how people can "attempt" to measure their system's latency, what to look for, and how to fix it. The second video is on audio latency and how to measure that.

UPDATE: I posted the video at the top of the thread. Appreciate any feedback. 

 

Edited by John Vere

17 hours ago, John Vere said:

...

I'm glad you mentioned this as I kept finding old articles that said part of midi latency over USB was because USB midi had more jitter than DIN midi. I guess the reference to jitter is trying to describe a screwed up data stream.   Yet another deep topic with no solid info forthcoming. 

I think MIDI jitter and audio jitter are from a bit different "domains":

  • Audio is a continuous stream. Audio is sampled using a clock, and when this clock is not accurate, samples are taken at the "wrong" time. Audio jitter is inaccuracy in the timing of samples. If there is more than one clock and they are not synchronized, e.g. when two interfaces are used in parallel, there are "drift" and "jitter" between the audio streams. If the interfaces are synchronized, there is no drift, but there is still some jitter between streams (samples taken at the same world time are not put at the same time position in the audio streams). Note that the audio transfer method/speed/latency does not influence that jitter.
  • MIDI events are not sampled as a continuous stream. Jitter there is a deviation in latency: how late an event is delivered in comparison with the usual delivery time. Unlike audio, there is no predefined "sample rate". Obviously there is some "rate of reading hardware sensors and converting them to events", but it is unknown and device specific. The only known clock/delay is the MIDI hardware transfer clock (~31 kHz), so it takes ~1 ms to transfer one note. Hardware MIDI transfer uses a uni-directional continuous stream, so a note can be delivered as soon as it is prepared. In other words, ~1 ms is the full (and constant) delay between the note being ready and being delivered (important for comparison with USB).
  • USB-MIDI has a much higher transfer speed in comparison to MIDI. Even USB 1.1 is at least 1.5 MHz (up to 12 MHz), so transferring one (or even several) notes is way faster using any USB (one note in less than 0.02 ms). But USB uses host-driven packet delivery. And here comes the problem: in the "standard" mode used by computer keyboards, mice, and "cheap" USB-MIDI, delivery from the device happens every 5-12 ms ("polling rate", a device- and mode-specific fixed number, easy to see in Linux; I have not tried to look under Windows). So a single note, in the case of 10 ms quantization, will be delivered between 0.02 ms and 10.02 ms after it is ready for delivery. And so there will be "jitter" of up to 10 ms.
    USB-MIDI devices with their own drivers support (and use) shorter polling intervals. With a 1 kHz polling rate the maximum delivery jitter will be ~1 ms, for any number of simultaneous notes (USB 2+ can go higher, but I have not checked whether that is used in USB-MIDI). A small simulation sketch follows this list to illustrate the polling quantization.
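Here is that simulation sketch: it compares a DIN-style "send as soon as ready" link (~1 ms fixed transfer per note) against a host-polled USB link. The polling intervals (10 ms for a "cheap" class-compliant device, 1 ms for a vendor-driver device) are taken from the description above as assumptions, not measurements.

```
# Simulate the delivery delay of notes that become "ready" at random times.
import random

DIN_TRANSFER_MS = 1.0          # ~31.25 kbaud, 3-byte Note On

def usb_delivery_delay(ready_time_ms: float, poll_interval_ms: float) -> float:
    """The note waits for the next host poll, then transfers almost instantly."""
    next_poll = -(-ready_time_ms // poll_interval_ms) * poll_interval_ms
    return (next_poll - ready_time_ms) + 0.02      # ~0.02 ms on the wire

random.seed(1)
events = [random.uniform(0, 1000) for _ in range(10_000)]  # note-ready times (ms)

for label, poll in (("cheap USB (10 ms poll)", 10.0), ("vendor USB (1 ms poll)", 1.0)):
    delays = [usb_delivery_delay(t, poll) for t in events]
    print(f"{label}: avg {sum(delays) / len(delays):.2f} ms, worst {max(delays):.2f} ms")

print(f"DIN MIDI: avg {DIN_TRANSFER_MS:.2f} ms, worst {DIN_TRANSFER_MS:.2f} ms")
```

The point of the comparison is that the slow DIN link is constant, while the fast USB link is quantized: average delay is about half the polling interval and the worst case is nearly the whole interval, which is exactly the "jitter" being described.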

Yes, understood. If you Google the question "Is DIN MIDI faster than USB MIDI?" you will find no real answer. Therefore I tried my own tests.

As I said, I cleaned up the information in the video to make it easier to follow and keep it short. One of the tests was to answer the above question. My A49 has both MIDI and USB connections, so it was easy to test this. The USB latency with 1 note played was 3 ms and the MIDI pathway was 4 ms. No big deal really. Without a robot I couldn't test what happens with a 4-note chord. I thought about making a set of fingers out of wood but decided that was getting carried away.

But of note was the Roland MKB 200, which was made in the mid '80s and had a latency of 9 ms. My conclusion was that things have improved since 1985. An interesting feature of the MKB is that it has 3 MIDI outputs, which would have solved the daisy-chaining issue for many people.

