Why are there no TVs or monitors that have 0ms input lag?

November 12th, 2020

The following history is based on an email discussion I had with a display review author back in 2013 that I thought would be valuable to share:

Input lag is commonly understood as the delay between the moment a video signal is transmitted and when that video signal is displayed on a TV or monitor. Back in the days of CRT displays, this would be less than a millisecond. But when looking at input lag measurements, one finds that no modern TVs or monitors have an input lag of less than 8ms when running at 60Hz. So why is this?

The reason dates back to decisions made in 2012 or 2013 regarding how to interpret the results from the Leo Bodnar 1080p 60Hz Input Lag Tester. This testing device would measure the time between the start of a frame of video and the time that it was displayed on the screen, at three different points on the screen:

Image of the Leo Bodnar Input Lag Tester from the Leo Bodnar online store page

The results would be different based on which point of the screen the device was held to. Here’s an example of what you would see on a display that has zero video delay and instant response time, like what a CRT would have:

  • Top of screen: 0ms
  • Middle of screen: 8ms
  • Bottom of screen: 16ms

This makes sense because it takes 16.67ms to transmit a video frame at 60Hz. So the 8ms and 16ms readings are showing the transmission time of a frame of video rather than the video delay of the TV/monitor.

Initially, “input lag” results from this device were reported using a measurement at the bottom of the screen. This would include the video delay of the TV/monitor plus the full 16.7ms of a 60Hz frame transmission time. Unfortunately, it seemed that this input lag testing device would show confusing results with some displays, where the bottom reading would be less than the top reading. It’s possible that these displays would buffer an entire video frame and then present it bottom to top, rather than top to bottom. Or, this could have been an error in the testing device. Either way, it seemed like there wasn’t a way to simply report the measurement at a single point on the screen.

To address the issue that some displays were giving these abnormal results from the Leo Bodnar Input Lag Tester, a number of review sites decided to report input lag as follows:

Input lag = (average video delay at the top, middle, and bottom of the screen) + (average frame transmission time at the top, middle, and bottom of the screen)

This reporting method has the benefit of accounting for displays that might present a frame of video with a different delay for different points on the screen rather than presenting each pixel immediately as it is transmitted to the display. A primary example of when this happens in practice is Black Frame Insertion, where pixels at the top of the screen are delayed more than pixels at the bottom of the screen. Unfortunately, this approach has the downside of including the frame transmission time in the input lag measurement.

It is worth noting that there are very few displays that present from bottom to top. I am personally unaware of any that have this sort of behaviour. This leads me to believe that the errors in the testing device might have been the primary reason for results to be smaller at the bottom of the screen than at the top of the screen. If you are aware of any TVs or monitors that present this way, please let me know so I can amend this comment!

Determining Video Delay without Frame Transmission Time

So long as your video signal does not make use of Quick Frame Transport [QFT] and is not operating in Variable Refresh Rate [VRR] mode, the average frame transmission time across the screen that is included in input lag measurements is a constant value based on the refresh rate of the video signal. This means it can be easily subtracted from an input lag measurement to determine the “pure” video delay of the display.

Refresh Rate | Frame Transmission Time | Average Frame Transmission Time Across the Screen
30Hz  | 33.3ms | 16.7ms
60Hz  | 16.7ms | 8.3ms
120Hz | 8.3ms  | 4.2ms
144Hz | 6.9ms  | 3.5ms
240Hz | 4.2ms  | 2.1ms

Here’s a chart that shows how to calculate the average video delay from an “input lag” measurement, without including this transmission time. Again, this only works if your video signal does not use QFT or VRR:

Refresh Rate | Average Video Delay
30Hz  | subtract 16.7ms from “input lag”
60Hz  | subtract 8.3ms from “input lag”
120Hz | subtract 4.2ms from “input lag”
144Hz | subtract 3.5ms from “input lag”
240Hz | subtract 2.1ms from “input lag”
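This subtraction is simple enough to express in code. Here is a minimal Python sketch (the function name is my own; reviewers just do this arithmetic by hand), valid only for fixed-refresh signals without QFT or VRR:

```python
def average_video_delay_ms(input_lag_ms, refresh_rate_hz):
    """Estimate a display's "pure" video delay by removing the average
    frame transmission time (half of one refresh period) from a reported
    "input lag" measurement. Only valid without QFT or VRR."""
    frame_time_ms = 1000.0 / refresh_rate_hz
    return input_lag_ms - frame_time_ms / 2.0

# A 60Hz display reporting 13ms of "input lag" has roughly
# 13 - 8.3 = 4.7ms of actual video delay.
delay = average_video_delay_ms(13.0, 60)
```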

Using this final chart, we can discover that there are many modern LCD/OLED TVs and monitors that have close to 0ms of video delay, just like the old CRT monitors of the past!

Quirks and Limitations of the PreSonus Studio 26c

April 28th, 2020

Today I got a new oscilloscope! This new scope has a DC-coupled z-input that modulates the electron beam intensity and can be used for blanking (turning off the electron beam while it moves between shapes that it’s drawing).

When I hooked it up, I started to notice some strange quirks and limitations to my PreSonus Studio 26c audio DAC that I am using to drive the XY display mode of my oscilloscope: It seems that some of the audio outputs on this audio interface behave very differently than other outputs!

First off, I noticed that although output channels 1 & 2 have the benefit of being adjustable by the volume knob on the front of the device and can have mic inputs live-mixed into their output, they are much noisier than the output from channels 3 & 4:


The second thing I noticed was that there is actually a delay on channels 3 & 4 compared to 1 & 2! I haven’t calculated how much of a delay yet, but judging from the image, I would guess it’s around 20 samples at 192 kHz, or 0.1ms. This is obviously a big deal when trying to use them in tandem:

I intend to use three channels of this PreSonus DAC: two for X/Y control and one for blanking (brightness/intensity). This delay obviously has an impact on blanking:

You can see in this image that there are gaps in the shapes and the blanking lines are still visible because the intensity is not being modulated at the correct time due to the delay.

To address this issue, it won’t be too hard to put a software delay on the brightness output stream to line it up with the X/Y output streams.
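As a rough sketch of that software delay (in Python rather than the engine's C#, with a hypothetical function name), the fix is just to prepend silence to whichever stream arrives early, shifting every sample later by the measured channel offset:

```python
def delay_stream(stream, delay_samples):
    """Shift an output stream later by `delay_samples` samples by
    prepending silence, so it lines up with the slower channels.
    The ~20-sample figure at 192 kHz is my eyeball estimate from
    the scope image (about 0.1ms)."""
    return [0.0] * delay_samples + stream

# Delay the brightness stream by 20 samples to match the X/Y streams.
brightness = [1.0, 0.5, 0.25]
aligned = delay_stream(brightness, 20)
```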

Overshooting, Oscillations, and Blanking on a Vector Display

March 31st, 2020

Before I started work on creating a vector game engine to be used on displays like oscilloscopes, I had watched a really fantastic explanation of how to create graphics for laser displays by Seb Lee-Delisle:

In this video, Seb talked about the challenges of working with a laser that takes time to accelerate and decelerate as it moves around the screen. Draw sorting and gradually changing the laser’s position as it moves between shapes are two techniques that he used when recreating Asteroids for a laser projector. When approaching my new oscilloscope project, I kept these ideas in my back pocket as solutions to problems I expected to have.

Overshooting and Oscillations

Sure enough, I did have some problems with controlling the electron beam of my oscilloscope vector display. When moving from one position to another which is very far away, the electron beam would overshoot and oscillate before finally settling into the place I wanted it to land:

For context, my current setup and game engine uses an audio DAC running at 192 kHz with a variable video game framerate. My current project targets 80 fps, but it could go much higher. At 80 fps with 192,000 samples per second, I can move the electron beam 2400 times per frame. If I need to move the electron beam more times per frame, that frame will take longer to draw and the framerate will drop. If it drops too low, you can start to see flickering, like an old CRT monitor running at a low refresh rate. The electron beam itself moves extremely quickly: it can jump between positions in far less than a single sample period (1/192,000th of a second). So I need to gradually step along the path I am drawing, which effectively “slows down” the electron beam’s movement when drawing a shape. This means that 2400 samples per frame is not very much to work with!
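The per-frame sample budget is just division; here it is as a tiny Python sketch (names are mine, the real engine is C#):

```python
SAMPLE_RATE = 192_000  # DAC output rate, in samples per second

def samples_per_frame(fps):
    """How many beam positions (DAC samples) are available per video frame."""
    return SAMPLE_RATE // fps

# At the 80 fps target, the beam can be moved 2400 times per frame.
budget = samples_per_frame(80)
```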

Draw Sorting

Getting back to the problem of the electron beam overshooting and oscillating, I started first with the solution of draw sorting. I figured that if I could make the electron beam move along a more optimized path, it could help reduce the distance that the beam needed to travel between shapes and thus reduce the overshooting and oscillation at the start of drawing a new shape.

I implemented a very quick prototype algorithm that simply looked for the next shape closest to the current electron beam position every time it finished drawing a shape. This had some immediate drawbacks that I didn’t expect! Although it’s possible this method could help with the overshooting, I found that it caused a much bigger problem: the scene started flickering and jittering as I moved the camera around!
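For reference, here is roughly what that prototype looks like as a Python sketch (the shape representation and names are my own simplification; shapes are reduced to their (x, y) start points):

```python
import math

def greedy_draw_order(shapes, beam_pos):
    """Greedy nearest-neighbour draw sort: repeatedly pick the undrawn
    shape whose start point is closest to the current beam position.
    This is the prototype described above, which turned out to cause
    flicker because the order can change between frames."""
    remaining = list(shapes)
    order = []
    pos = beam_pos
    while remaining:
        nearest = min(remaining, key=lambda p: math.dist(pos, p))
        remaining.remove(nearest)
        order.append(nearest)
        pos = nearest
    return order
```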

The reason for this flicker and jitter was immediately obvious to me — the refresh rate of each shape was varying between frames! Here’s an example of a worst case scenario that would cause this problem: Let’s say, after sorting all the shapes, the draw order of objects is: [A, B, C, D, E, F, G]. But then the player moves the camera and the scene changes such that the draw order after sorting is something like [B, C, D, E, F, G, A]. While object A was drawn first in one frame, it was drawn last in the next frame. Without sorting, object A would have been drawn once every 1/80th of a second, but now it needed to wait a full 1/40th of a second before being re-drawn to the screen. This behaviour causes flicker and can make objects look like they are jittering as a camera pans across the scene.

This experiment has made it very clear that I must always draw shapes in the same order with a vector display to ensure that their refresh rate is kept somewhat constant!

Blanking

Blanking is a term used to describe the short period when a display will “blank” (stop drawing an image) while it prepares to draw at a different position, usually on the opposite side of the screen. The idea of blanking is important and valuable to both raster and vector displays. With my current oscilloscope, I do not have the ability to turn off the electron beam, but I can dedicate some additional time to give the electron beam a chance to settle in its new position before starting to draw. Here’s what it looks like if I pause on the new draw position for 7 samples (7/192,000th of a second) before continuing to draw the new shape:

You can see that this definitely helps. Now each line is being fully drawn, though the initial oscillations as the beam settles on the new location can still be seen.

Exponential Deceleration During Blanking

There was one last trick that I kept in my back pocket: changing the electron beam’s acceleration between points during blanking. I fiddled with some different tween functions, and settled on a 7-sample blanking with a quadratic ease out:

Ta-da! This is now looking pretty crisp. In the future, I plan to change the number of blanking samples based on the distance between the two shapes. If it’s a small distance, there’s no point in spending a full 7 samples during this blanking time.
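Here is a Python sketch of that blanking path with a quadratic ease-out (function names and the (x, y) point representation are my own; the real engine is C#). The beam covers most of the distance in the first few samples and decelerates into the target, which is what lets it settle without overshooting:

```python
def ease_out_quad(t):
    """Quadratic ease-out: fast at the start, decelerating into 1.0."""
    return 1.0 - (1.0 - t) ** 2

def blanking_path(start, end, samples=7):
    """Generate `samples` intermediate beam positions between two shapes,
    decelerating into the new position. Points are (x, y) tuples; the
    default of 7 samples matches the blanking time described above."""
    path = []
    for i in range(1, samples + 1):
        t = ease_out_quad(i / samples)
        path.append((start[0] + (end[0] - start[0]) * t,
                     start[1] + (end[1] - start[1]) * t))
    return path
```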

Oscilloscope Z-Input & Blanking

I mentioned previously that I was unable to turn the electron beam off and on while moving between shapes. This isn’t entirely true: my oscilloscope does have a “z-input” that can be used for blanking. But unfortunately, I found that it is AC-coupled. This means that the z-input is only able to detect changes in voltage, rather than read the DC voltage directly like my x and y-inputs. My game engine has actually supported changing the brightness of samples through a “z-input” and a third audio channel since the beginning, but I will need to get a new oscilloscope with a DC-coupled z-input that I can use for full-featured blanking.

Further Progress

My vector game engine’s source code is available on Github under the MIT license. It’s pretty rough and I intend to use it primarily for my own experiments, but you’re welcome to peek around and check out the progress. I plan to continue to post here from time to time with updates, but for the latest news on the project, check out my Twitter feed.

Making a Vector Game Engine with an Oscilloscope

March 10th, 2020

A couple of weeks ago I started work on a new video game engine for vector displays. For a long time I have been enamoured by the uniquely high contrast of vector displays. A local bar named House of Targ hosted an original Asteroids arcade cabinet that could only be appreciated in person: the super bright effect of the weapon shots was something that couldn’t be reproduced on any other type of display.

Later I was introduced to Oscilloscope Music, where I discovered how easy it was to control a vector display using an audio signal. The creator of this music also produced a series of tutorials that described the audio equipment needed. This allowed me to get up and running very quickly with my own vector game engine.

My vector game engine is being developed in C# with its primary output being the ASIO audio interface. I’ve started off by using a number of math and input classes from MonoGame. After less than a week of blind programming with only a debugger to show me the buffer states, I managed to create a rotating cube that displayed first try on an old oscilloscope. After another week, I’ve got to the point where I have a very basic 3D scene that you can navigate using an Xbox One controller or the Xbox Adaptive Controller. My plan is to make alternative control video games with this as my output.

I’m not the first to be doing these types of experiments. And vector display games were some of the first video games. There was even a home all-in-one console called Vectrex! Here are some links to other cool articles and projects:

…And some music-focused links:

My vector game engine’s source code is available on Github under the MIT license. It’s pretty rough and I intend to use it primarily for my own experiments, but you’re welcome to peek around and check out the progress. I plan to continue to post here from time to time with updates, but for the latest news on the project, check out my Twitter feed.

Disabling Frustum Culling on a Game Object in Unity

December 19th, 2013

You should never disable frustum culling in a release build.

But sometimes it can be useful to do so for debugging or when dealing with a really wacky vertex shader where mesh bounds don’t make sense anymore. Here’s an easy way to disable frustum culling on a game object by moving its bounds into the center of the camera’s frustum:

// boundsTarget is the center of the camera's frustum, in world coordinates:
Vector3 camPosition = camera.transform.position;
Vector3 normCamForward = Vector3.Normalize(camera.transform.forward);
float boundsDistance = (camera.farClipPlane - camera.nearClipPlane) / 2 + camera.nearClipPlane;
Vector3 boundsTarget = camPosition + (normCamForward * boundsDistance);

// The game object's transform will be applied to the mesh's bounds for frustum culling checking.
// We need to "undo" this transform by making the boundsTarget relative to the game object's transform:
Vector3 relativeBoundsTarget = this.transform.InverseTransformPoint(boundsTarget);

// Set the bounds of the mesh to be a 1x1x1 cube (actually doesn't matter what the size is)
Mesh mesh = GetComponent<MeshFilter>().mesh;
mesh.bounds = new Bounds(relativeBoundsTarget, Vector3.one);

[Download C# Unity Script Component]