Why are there no TVs or monitors that have 0ms input lag?

2022 UPDATE

Since writing this blog post, I have written more on the topic at AV Latency.com. That website contains a number of definitions that may help explain the difference between Input Lag and Video Latency.

Original Blog Post

The following history is based on an email discussion I had with a display review author back in 2013, which I thought would be valuable to share:

Input lag is commonly understood as the latency (or delay) between the moment a video signal is transmitted and the moment it is displayed on a TV or monitor. Back in the days of CRT displays, this delay was less than a millisecond. But when looking at input lag measurements, one finds that no modern TV or monitor has an input lag of less than 8ms when running at 60Hz. So why is this?

The reason dates back to decisions made in 2012 or 2013 regarding how to interpret the results from the Leo Bodnar 1080p 60Hz Input Lag Tester. This testing device would measure the time between the start of a video frame's transmission and the moment it was displayed on the screen, at three different points on the screen:

Image of the Leo Bodnar Input Lag Tester from the Leo Bodnar online store page

The results would differ depending on which point of the screen the device was held against. Here’s an example of what you would see on a display that has zero video latency and instant response time, like a CRT:

  • Top of screen: 0ms
  • Middle of screen: 8ms
  • Bottom of screen: 16ms

This makes sense because it takes 16.67ms to transmit a video frame at 60Hz. So the 8ms and 16ms readings are showing the transmission time of a frame of video rather than the video latency of the TV/monitor.
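
To make this arithmetic concrete, here is a minimal Python sketch (my own illustration, not anything from the tester itself) that computes the expected reading at each point on the screen for a display with zero video latency:

```python
# For a hypothetical zero-latency display, the tester's reading at each
# point on the screen is simply the time needed to transmit the frame
# from the top of the screen down to that point.

REFRESH_HZ = 60
FRAME_TIME_MS = 1000 / REFRESH_HZ  # ~16.67ms per frame at 60Hz

for label, position in [("Top", 0.0), ("Middle", 0.5), ("Bottom", 1.0)]:
    reading_ms = position * FRAME_TIME_MS  # transmission time to this point
    print(f"{label} of screen: {reading_ms:.1f}ms")

# Output:
#   Top of screen: 0.0ms
#   Middle of screen: 8.3ms
#   Bottom of screen: 16.7ms
```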

Initially, “input lag” results from this device were reported using a measurement at the bottom of the screen. This would include the video latency of the TV/monitor plus the full 16ms of a 60Hz frame transmission time. Unfortunately, it seemed that this input lag testing device would show confusing results with some displays, where the bottom reading would be less than the top reading. It’s possible that these displays would buffer an entire video frame and then present it bottom to top, rather than top to bottom. Or, this could have been an error in the testing device. Either way, it seemed like there wasn’t a way to simply report the measurement at a single point on the screen.

To address the issue that some displays were giving these abnormal results from the Leo Bodnar Input Lag Tester, a number of review sites decided to report input lag as follows:

Input lag = (average video latency at the top, middle, and bottom of the screen) + (average frame transmission time at the top, middle, and bottom of the screen)

Compared to measuring a single point on the screen, this reporting method has the benefit of accounting for displays that might present a frame of video with a different latency at different points on the screen, rather than presenting each pixel immediately as it is transmitted to the display. A primary example of this in practice is Black Frame Insertion, where pixels at the top of the screen are delayed more than pixels at the bottom of the screen. Unfortunately, this approach has the downside of including the frame transmission time in the input lag measurement.
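
As a worked example, here is a small Python sketch of that averaging method (the three readings are hypothetical measurements for a display with 5ms of video latency at 60Hz, not real results):

```python
# Hypothetical Leo Bodnar tester readings (in ms) at the top, middle, and
# bottom of the screen for a display with 5ms of video latency at 60Hz:
# each reading is 5ms of latency plus the transmission time to that point.
top_ms, middle_ms, bottom_ms = 5.0, 13.3, 21.7

# The reported "input lag" is the average of the three readings, which
# works out to the video latency plus half a frame of transmission time.
input_lag_ms = (top_ms + middle_ms + bottom_ms) / 3
print(f"Reported input lag: {input_lag_ms:.1f}ms")  # 13.3ms = 5ms + 8.3ms
```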

It is worth noting that there are very few displays that present from bottom to top. I am personally unaware of any that have this sort of behaviour. This leads me to believe that the errors in the testing device might have been the primary reason for results to be smaller at the bottom of the screen than at the top of the screen. If you are aware of any TVs or monitors that present this way, please let me know so I can amend this comment!

Determining Video Latency without Frame Transmission Time

So long as your video signal does not make use of Quick Frame Transport (QFT) and is not operating in Variable Refresh Rate (VRR) mode, the average frame transmission time across the screen that is included in input lag measurements is a constant value determined by the refresh rate of the video signal. This means it can simply be subtracted from an input lag measurement to determine the “pure” video latency of the display.

Refresh Rate | Frame Transmission Time | Average Frame Transmission Time Across the Screen
30Hz         | 33.3ms                  | 16.7ms
60Hz         | 16.7ms                  | 8.3ms
120Hz        | 8.3ms                   | 4.2ms
144Hz        | 6.9ms                   | 3.5ms
240Hz        | 4.2ms                   | 2.1ms
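
For reference, here is a short Python sketch that reproduces the values in this table; the frame transmission time is the reciprocal of the refresh rate, and the average across the screen is half of that:

```python
# Reproduce the table above: frame transmission time is 1000ms divided by
# the refresh rate, and the average transmission time across the screen
# (averaged over the top, middle, and bottom readings) is half a frame.

for refresh_hz in (30, 60, 120, 144, 240):
    frame_ms = 1000 / refresh_hz
    average_ms = frame_ms / 2
    print(f"{refresh_hz}Hz: frame {frame_ms:.1f}ms, average {average_ms:.1f}ms")
```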

Here’s a chart that shows how to calculate the average video latency from an “input lag” measurement, without including this transmission time. Again, this only works if your video signal does not use QFT or VRR:

Refresh Rate | Average Video Latency
30Hz         | subtract 16.7ms from “input lag”
60Hz         | subtract 8.3ms from “input lag”
120Hz        | subtract 4.2ms from “input lag”
144Hz        | subtract 3.5ms from “input lag”
240Hz        | subtract 2.1ms from “input lag”
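
Here is a small Python sketch of that subtraction (the 10ms “input lag” figure is hypothetical, used only to illustrate the conversion):

```python
# Convert a reported "input lag" figure into average video latency by
# removing the average frame transmission time. Only valid when the
# signal uses neither QFT nor VRR.

def video_latency_ms(input_lag_ms: float, refresh_hz: float) -> float:
    return input_lag_ms - (1000 / refresh_hz) / 2

# e.g. a (hypothetical) 10ms input lag measurement at 60Hz:
print(f"{video_latency_ms(10.0, 60):.1f}ms")  # ~1.7ms of video latency
```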

Using this final chart, we can discover that there are many modern LCD/OLED TVs and monitors that have close to 0ms of video latency, just like the old CRT monitors of the past!

