What is latency? (and why you don’t care)

Latency is a word you hear thrown around the digital audio world quite a bit. People use it to mean many different things, and they are often unclear on what it actually means or how it will affect their home studio workflow. You will often hear salesmen or so-called experts recommending this gear over that gear because you will get better latency. The truth of the matter is that latency does not matter at all for the recording needs of nearly every home recording enthusiast. Most of you literally should not care about latency at all.

What is latency?

Simply put, latency is the time it takes digital audio data to go through your soundcard’s software driver. Great, so what is latency? Latency is the amount of time between when your computer tells a sound to play and when your speakers actually make the sound, or the time between when a sound you want to record enters your computer and when the computer actually has the data to record it. Awesome, so what is latency? I thought you’d never ask…

There are two types of latency. More accurately there are two directions of latency. There is playback/output latency and recording/input latency. Often they are the same amount but sometimes they can differ. We are going to tackle them one at a time. Let us start by taking a look at what happens when you press play in your favorite home recording software.

Playback/output latency
[Figure: DAW playback latency]

  • A. Press “play”
    Represents the moment in time when you press play in your DAW.
  • B. Read data
    All the tracks in your recording are stored as files on your computer’s hard drive. Before playing them back, your computer needs to read a little bit of every file. Depending on your hard drive speed, DAW software, RAM, computer speed, track count, etc. this step can take around 5-100ms. This is not latency.
  • C. Process data
    All those tracks, plugin effects, mix automations, etc. require a bunch of math to do what they do. Your computer needs to crunch all the numbers to work out what the final sound should be. This can take anywhere from 1-100ms or more depending on some of the same factors from step B. This is not latency.
  • D. Send to driver
    The computer crunched all the numbers, but all it did was come up with more numbers. We can’t hear numbers. This is where the soundcard and its driver come into play. Your home studio computer needs to push all these numbers through the soundcard so they can be converted into a signal which will later be turned into physical sound. This is done by filling up pre-sized buckets (buffers) of data and sending them through the soundcard to be converted from digital to analog. The time it takes to push data through these buffers, convert it from digital to analog, and send it to the soundcard outputs is latency (there is a small arithmetic sketch of this just after the summary below).
  • E. Sound is heard
    This part is pretty straightforward. Your speakers take the analog signal and move air so you can hear your music. This is not latency.

You can see there are a lot of things happening in the computer between the time you press play and when you hear the sound. Most of these things are not latency. Only the processing time associated with moving data through the soundcard driver is considered latency. This is true for recording too, but the steps are just a bit different.
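
As a rough illustration of the buffer math from step D (these are back-of-the-envelope numbers, not measurements of any particular soundcard or driver), here is how buffer size and sample rate translate into the latency of one trip through the driver:

    # Rough sketch: one-way driver-buffer latency at a few common buffer sizes.
    # Real interfaces add a little extra for D/A conversion, so treat these as
    # approximations rather than specs for any particular soundcard.
    def buffer_latency_ms(buffer_size_frames, sample_rate_hz):
        """Time needed to fill (or drain) one driver buffer, in milliseconds."""
        return buffer_size_frames / sample_rate_hz * 1000.0

    for frames in (64, 256, 1024):
        print(f"{frames:5d} frames @ 44.1 kHz ~ {buffer_latency_ms(frames, 44100):.1f} ms")
    #    64 frames @ 44.1 kHz ~ 1.5 ms
    #   256 frames @ 44.1 kHz ~ 5.8 ms
    #  1024 frames @ 44.1 kHz ~ 23.2 ms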

Recording/input latency
[Figure: DAW recording latency]

  • A. Sound is captured
    This sound can be captured through a microphone, an amp modeler, a set of V-Drums, or any other creative way you can think of to get sound into your home studio soundcard. This is not latency.
  • B. Convert sound to data
    The analog sound gets converted to digital data by your soundcard. The driver fills input buffers with chunks of data to be retrieved by your computer. The time it takes to fill these input buffers to be retrieved by the computer is latency.
  • C. Computation
    The computer grabs the data in the buffers, does some mundane computation involving streaming and creating a file, and sends this data toward the hard drive for storage. This is not latency.
  • D. Store Data
    The data is stored on your computer hard drive. Depending on the speed of your hard drive and the number of tracks you are recording, this could take from 5-50ms or more. This is not latency.

Again we can see that many things happen when recording a file that take some time and that almost none of them are latency. The only thing that is latency is the time it takes the data to get through the soundcard driver.

What effect does latency have on recording?

  • “You can’t use a soundcard for recording unless the latency is below 10ms” – False!
  • “With my brand X soundcard I can record with zero latency” – Impossible!
  • “My home studio computer is better than yours because my latency is lower” – Funny, but wrong!
  • “Professional quality soundcards all have lower latency than amateur cards” – That’s fiction!

Let’s take a look at the timeline of events when we want to overdub a track in a home studio environment with 31ms of latency.

[Figure: DAW latency in the home studio]

  • A. Your computer gathers up the sound to start playing at 0:00.000. The sound data is sent to the soundcard buffers at this time.
  • B. By the time the data works its way through the soundcard and you hear it, 31 milliseconds have passed since the sound was played. At this point in time you are hearing the sound from 0:00.000, but the computer is already playing the sound from 0:00.031. That’s right, you are hearing sound that the computer played 31ms ago. When you play along with the track, what you are playing (the instrument in your hands) is already 31ms behind the music the computer is processing.
  • C. By the time your instrument’s signal has made its way back through the soundcard driver, another 31 milliseconds has passed. Yes, that means the sound you played is not heard by your computer until 31ms after you played it. Worse, the sound is not recorded until 62ms after the sound it was supposed to be played over.

This seems pretty bad. How could any of our home studio recordings sound any good if everything is off by 62 milliseconds every time we overdub? At 120 beats per minute that is about equivalent to a 32nd note. Not even the Rolling Stones are that loose. We still don’t care. For home studio owners, latency does not matter. Why?
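
If you want to check the “32nd note” arithmetic yourself, here is the whole calculation (just the numbers from the example above, nothing soundcard specific):

    # Checking the "about a 32nd note" claim for the 31ms example above.
    output_latency_ms = 31                        # playback trip through the driver
    input_latency_ms = 31                         # recording trip back through the driver
    round_trip_ms = output_latency_ms + input_latency_ms   # 62 ms total offset

    bpm = 120
    quarter_note_ms = 60_000 / bpm                # 500 ms per beat at 120 BPM
    thirty_second_note_ms = quarter_note_ms / 8   # 62.5 ms

    print(round_trip_ms, thirty_second_note_ms)   # 62 vs 62.5 -> roughly a 32nd note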

Why don’t we care about latency?

Computers can be the bane of our existence. They almost never seem to do what we want them to do, but always seem to have an agenda of their own. Even in light of the love/hate relationship most of us have with our computers, there is one thing we must never forget: computers are great at math! We can’t get around latency. It is impossible. Soundcard drivers have to do their thing. There will always be latency. Computer programmers know this and they build this knowledge into their software. It is a small matter for your home studio DAW to do a bit of math and correct all the wrongs latency introduces. Here is what your recorded track looks like in its raw form, according to the real-world timing of the events:

[Figure: Before home studio DAW correction]

Notice the overdubbed sound file doesn’t start until 62ms after the track we were playing over. The computer can easily calculate how long it will take the soundcard drivers to send sound out and bring sound back in. Once it does these calculations it knows the audio you played was meant to go over the audio that happened 62ms before you played it. Yes, the computer is like a time machine! It sends your recorded track back in time, using latency correction, to make sure it plays back right in sync with the prerecorded audio tracks. It ends up looking like this:

[Figure: After home studio DAW correction]

Notice the tracks are now perfectly aligned. The principles of latency correction are well defined and understood. All pro-level home studio software does this for you automatically. I can’t think of a single DAW package I’ve used that does not. Sometimes there are settings to turn this off or adjust it by a manual amount; you don’t really want to play with those settings under normal circumstances. You can see from this illustration that latency is not bad at all. In fact, you should never need to care about latency. The computer takes care of everything behind the scenes for you.
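
To make the “time machine” idea concrete, here is a minimal sketch of what automatic latency compensation boils down to: slide the freshly recorded audio earlier by the round-trip latency the driver reports. The function name and the use of NumPy are my own illustration, not code from any particular DAW.

    import numpy as np

    def compensate(recorded, round_trip_ms, sample_rate_hz):
        """Shift a freshly recorded track earlier by the reported round-trip latency."""
        offset = round(round_trip_ms / 1000.0 * sample_rate_hz)
        # Drop the first `offset` samples and pad the end with silence so the
        # track keeps its length but now lines up with the pre-recorded tracks.
        return np.concatenate([recorded[offset:], np.zeros(offset)])

    # Example: a take recorded with 62 ms of round-trip latency at 44.1 kHz
    # gets pulled back by round(0.062 * 44100) = 2734 samples.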

It should be noted that these principles hold no matter what your latency is. The DAW can correct for your latency just as well at 7ms as it can at 700ms. You should never notice the difference in your recorded tracks. The only thing you might notice at extremely high latency settings is a general feeling of sluggishness in the controls of your DAW. This should not concern you. It doesn’t matter if your song doesn’t start playing until a quarter second after you hit play. It doesn’t feel snappy but it will sound just as good.

Is high latency bad?

If 7ms and 700ms are no different in terms of sound, is one better than the other? That is a complex question but the general answer is no. For the majority of home recording enthusiasts there is no practical difference between higher or lower latency. Here is a comparison.

Low latency:

  • Pros: DAW is generally more responsive. Softsynths perform better (more on that in a bit). Bragging rights (useless).
  • Cons: Harder on your processor. Can’t run as many plugin effects live. Lower potential track count.

High latency:

  • Pros: Easier on your processor. More plugins can be used. Higher potential track count.
  • Cons: DAW can feel a touch sluggish. “Experts” will make fun of you (clueless).

The higher your latency, the more time your computer has to batch up data in large chunks and process it. This is primarily important when reading data from the hard drive because physical drives are notoriously slow (in comparison to computations done in memory). If you ever notice your song’s playback starting to experience dropouts or stuttering, this is a sure sign you are running out of horsepower. Increasing your latency can help you out. Higher latency translates to more live effects at mixdown and a higher potential track count.
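
One way to see why bigger buffers are easier on your computer: the driver wakes the computer up once per buffer, so larger buffers mean fewer interruptions per second. Again, these are illustrative numbers, not a benchmark of any particular system:

    # How often the computer has to service the soundcard at different buffer sizes.
    sample_rate_hz = 44100

    for frames in (64, 256, 1024):
        wakeups_per_second = sample_rate_hz / frames
        latency_ms = frames / sample_rate_hz * 1000
        print(f"{frames:5d}-frame buffer: ~{wakeups_per_second:4.0f} wakeups/sec, ~{latency_ms:.1f} ms")
    # Smaller buffers mean lower latency but far more frequent wakeups, which is
    # what eats into your plugin and track-count headroom.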

Who does care about latency?

Well, someone must care about low latency or everyone wouldn’t be making such a big deal about it. There are two situations where low latency matters. Latency is a big issue if you plan on using software monitoring or virtual instruments.

I truly believe software monitoring is one of the worst things to happen in the home studio space during the computer DAW revolution. Other than being a tool to keep you from playing with feel and enjoying your home studio experience, what is software monitoring? Software monitoring is when you use your DAW’s internal mixer to create your monitor mix. When do people feel the need to do this? If you are recording with something like a direct guitar box, you have no way to hear the guitar signal because its outputs are plugged into the computer for recording. DAW makers started allowing you to use the software mixer to echo this signal back out for monitoring purposes.

There is a big problem with this approach: all signals have to make a round trip through the soundcard drivers before you can hear them. Consider our previous example of recording with 31ms of latency. If you want to hear that guitar track through software monitoring, it first has to go through 31ms of latency on the way into the computer, then another 31ms of latency on the way back out of the computer before you hear it. You are trying to play in time with the song, but all your notes coming back out of the speakers are horribly lagging behind your fingers. If you want to experience this firsthand, set up a delay effect on your sound with 60ms or so of delay time and 100% wet, then try to play anything with a constant time. It will mess with your head.

Avoid direct software monitoring like the plague unless you enjoy spending a lot of money on the latest computer hardware and daydream about configuring every aspect of your computer for barebones, stripped-down performance. For as little as $30 you can have a perfect, latency-free, hardware monitoring solution (the techniques are outlined in this article).
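
If you would rather simulate that “60ms, 100% wet” experiment than wire up a delay pedal, here is a tiny sketch of what software monitoring does to your performance (assumes NumPy; the click-track signal is just a stand-in for whatever you would actually play):

    import numpy as np

    sample_rate_hz = 44100
    delay_ms = 60                                           # the "100% wet" delay suggested above
    delay_samples = int(sample_rate_hz * delay_ms / 1000)   # 2646 samples

    # A dry "performance": four clicks, one per beat at 120 BPM (0.5 s apart).
    dry = np.zeros(sample_rate_hz * 2)
    for beat in range(4):
        dry[int(beat * 0.5 * sample_rate_hz)] = 1.0

    # 100% wet means you hear only the delayed copy: everything you play arrives
    # 60 ms late, roughly what software monitoring feels like with ~30 ms of
    # driver latency in each direction.
    monitored = np.concatenate([np.zeros(delay_samples), dry])[:len(dry)]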

Virtual instruments are synthesizers that run on your computer, with no external hardware. You remember that Casio keyboard your sister had in the 9th grade? Virtual synthesizers are a lot like that, except without the $20 of plastic, and they sound a ton better. Synthesizer is actually too limited a term for the virtual instrument. Some virtual instruments are drum machines, guitar amp modelers, analog synth modelers, and even artificial band members. Virtual instruments are actually quite useful in the modern home recording studio. The only problem arises when you want to play them live with an instrument in your hands. This can take the form of a MIDI keyboard playing through a virtual synthesizer or trying to play your guitar through a virtual amplifier. The same situation applies that we saw previously while discussing software monitoring: first your computer has to get the signal of your playing into the computer, then it has to be played back to you. This essentially doubles your latency and makes everything you play come back to you with a lag.

You can set your rig up to successfully run live virtual instruments but it won’t come cheap. If your sense of rhythm and feel is highly developed, it is unlikely you will be able to tolerate anything higher than 3-4ms of latency.
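
For a sense of what a 3-4ms target implies, here is the buffer size it corresponds to (the inverse of the earlier buffer math; ballpark figures that ignore converter and plugin delay):

    # Buffer size implied by a given driver-latency target.
    sample_rate_hz = 44100

    for target_ms in (3, 4, 10):
        frames = sample_rate_hz * target_ms / 1000
        print(f"{target_ms} ms of driver latency ~ a buffer of about {frames:.0f} frames")
    # 3 ms -> ~132 frames, 4 ms -> ~176 frames, 10 ms -> ~441 frames.
    # Running buffers that small is exactly where the CPU cost discussed
    # earlier really starts to bite.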

I hope this explanation of latency has been both informative and enlightening.

This article has been fueled by my love of gourmet ramen. Shout out to WorldRamen.net!

30 Responses to “What is latency? (and why you don’t care)”

  1. Smith says:

    Mr. Vesco, how can you tell people that latency is not important?! This is the worst of your articles and that statement is just RIDICULOUS. People wouldn’t spend hundreds of euros on RME interfaces that allow for as low as 14 samples of latency if it weren’t important. You cannot underestimate soft synths these days. Maybe you only record drums and guitars, but believe me – most pro musicians these days use software processing, and for such people latency is kinda crucial. So I’m sorry but WE DO CARE ABOUT LATENCY, and you should finally notice that it isn’t 1985 anymore.

  2. bvesco says:

    I certainly agree that a person who has a requirement that they play “live” through a soft synth must have a low latency environment. I disagree that every person using a computer to record since 1986 has relied on soft synths as the core of their workflow. I know plenty of people mixing on small to very large projects that are not using soft synths to play “live” in the studio. Any projects I need a soft synth on are served perfectly by tracking and monitoring with a hardware synth, recording the midi, then wiring up and tweaking the soft synth after tracking is complete. Latency does not matter in this situation. You are entitled to your opinion for certain, but it is a fact that low latency is really only important to people who insist on playing “live” through a soft synth. If you are one of those people then you need low latency. The rest of us do not. To the low latency kool-aid I say, “pass!”

  3. roy says:

    Hi, very nice article. That gives me hope. By the way, do you know if Adobe Audition CS5 or CS3 does this “automatic latency compensation”? From what I could gather, it does not. I use a Lexicon Alpha interface and I always find myself moving the track I just recorded a little bit to the left in Adobe Audition, even with the minimum latency configuration. I’m starting to think that I should use Cubase. In Adobe Audition, if I make a 4/4 metronome track and record it with a mic, the recorded track always gets a delay. I understand the speed of sound adds some, but I think it is too much. Should the recorded track match the original metronome track perfectly?

  4. Riley says:

    Really solid article, good explanation, and very easy to understand!

    Thank you sir!

  5. andkon says:

    to experience this first hand, set up a delay effect on your sound with 60ms or so of delay time and 100% wet. Try to play anything with a constant time.

  6. rich says:

    @roy : yes, it should match the metronome. There seems to be a known bug in Adobe Audition. Search the Adobe CS5.5 forums for “How to make latency compensation work?”. You should join us there and help us fix this.

  7. Miloš says:

    Very nice article (sorry for my English!). I work with a USB MIDI keyboard and a USB mic with kids for Junior Eurovision and it works for us. Simply, I record without the kids hearing themselves during recording, and after that I add the vocal in the mix. It doesn’t matter whether latency exists or not.
    I think that in music the idea is most important.

  8. D says:

    Thank you for writing this article. It explained and helped to solidify some things I have been thinking about. I have a couple of questions that I think you would be able to help me with.

    I record in Reaper, and I have noticed that I occasionally get clicks/glitches in my audio if I have the buffer set very low in Reaper. I don’t have problems if I set the buffer to 512 samples. Is this because the higher buffer is easier on the CPU? There isn’t any particular delay or problem using a buffer of 512 that I have noticed, since I am not using Live Monitoring. My audio interface doesn’t have a dedicated buffer setting section, other than selecting your Firewire Latency as Short, Medium, or Long. I have it on Short.

    I have tried layering guitar tones by running a previously-recorded guitar signal through a reamp box to a pedal, and back into my DI. I noticed comb filtering in the signal when I blend the pedal tone with the previous guitar tone, which tells me this new pedal pass is somehow recorded with a delay. After some testing, I discovered a 3.5ms delay in this pedal overdub. It is unaffected by Reaper’s buffer setting. Is this 3.5ms delay a result of bad coding in Reaper, bad audio interface drivers, or the unavoidable delay of D/A + A/D? Are drivers written to compensate for the natural delay of the converters?

    My tests essentially told me that all of my overdubs are being placed about 3.5ms late. This may not be noticeable/perceivable when overdubbing a different instrument, but it is troubling to me in theory, and is definitely noticeable as comb filtering if I am overdubbing a very similar signal.

    Thanks!

    • bvesco says:

      Yes, clicks/pops can be caused by setting the buffers too low, because the CPU gets overtaxed.

      Regarding the comb filtering issue, it’s an unavoidable side effect of re-amping. What I do in this situation is add a click or two at the beginning of the track. Then when I get the re-amp track back, I can line up the sharp transients on the clicks to correct for the delay. Modern DAWs tend to include latency compensation for recording, but when you re-amp you are doubling the latency and the DAW only corrects for it once.
