
29.97 Things You Didn’t Know About Frame Rates

By Jason Gyorog, Quality Assurance Engineer at Studio Network Solutions

Fri 7 May 2021

As video editors, cinematographers, and content creators, we spend a remarkable amount of time adjusting our settings. From resolution to codecs to frame rates, there is a lot of history, design, and mathematical precision behind these settings, and learning more about them might help you in your next big project.

So strap on your thinking caps, it’s time to talk frame rates.

Origins on the Silver Screen

There is no single standard for video frame rates (also written as framerates, frames per second, or fps). While the early film studios of Thomas Edison’s era often shot their motion pictures at between 16 and 24 fps with hand-cranked cameras, the films would be played back for audiences at anywhere between 20 and 26 fps. Because of this inconsistency, many films of the era seem frantically sped up, often with comically fast character movement.

Frame rate: the number of frames (consecutive still images) in one second of video.

With the addition of synchronized sound and worldwide film distribution, motion picture frame rates were standardized at 24 fps in the late 1920s. As we all know, the rest of the video world is far less well defined.

While there are common frame rate choices for different types of projects, it’s important to know why certain rates became standard, how this affects the way your video plays back, and how editing platforms convert between different frame rates.

Standardizing the Video Frame Rate

In the early days of broadcast television, huge increases in viewership created demand for more high-quality, standardized television programs. The earliest television sets used cathode ray tube (CRT) technology to draw the picture as a series of scan lines, creating the first standard video resolutions.

Interlaced video is a standard method in which the display first updates all even-numbered lines, then updates the odd-numbered lines. Interlacing splits a 30 fps signal into 60 half-frames (or “fields”) per second, creating images with smoother motion. In true interlaced video, each field is captured at a slightly different moment in time. This means a ball moving across the screen would be in a different position in each field, or two different positions per frame.
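To make the timing concrete, here is a minimal sketch (in Python, purely illustrative) of when each field of a 30 fps interlaced signal reaches the screen; real systems differ in field order and exact timing:

```python
# Illustrative only: field display times for 30 fps interlaced video.
# Interlacing splits each frame into two fields, so fields arrive at
# 60 per second -- one every ~16.7 ms.

FRAME_RATE = 30
FIELDS_PER_FRAME = 2

field_interval_ms = 1000 / (FRAME_RATE * FIELDS_PER_FRAME)   # ~16.7 ms

for field in range(6):
    frame = field // FIELDS_PER_FRAME
    lines = "even lines" if field % 2 == 0 else "odd lines"
    print(f"field {field}: frame {frame} ({lines}) at {field * field_interval_ms:.1f} ms")
```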

In order to synchronize—and therefore standardize—frame rates for each television set, early systems used the AC power system. The American power grid is a 60 Hz system, which created a standard broadcast television frame rate of 30 fps. In Europe and other places with 50 Hz power systems, television broadcasts use a standard frame rate of 25 fps.

With the addition of color television came new challenges. Standard black and white television signals, known as luminance signals, could be broadcast without issue. A separate chrominance signal, carrying all the information necessary to display a program’s color data on a television screen, required extra bandwidth to transmit.

Building on earlier research, industry leaders developed a system in which chrominance information could be encoded into the same signal as the luminance information. While this allowed compatible devices to display color, the chrominance information often interfered with the luminance, creating visible dots on non-compatible black and white televisions.

By slightly adjusting the standard television frame rate, engineers ensured the dots would no longer appear in the same place on the screen from one moment to the next. The dots were far less noticeable when they were moving around. For this reason, the standard broadcast frame rate in the United States is approximately 29.97 fps (technically 30,000/1,001), just slightly lower than the commonly used 30 fps.
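If you want to see exactly how close (but not equal) those two figures are, a quick check with Python’s built-in fractions module does the job:

```python
from fractions import Fraction

# The NTSC color frame rate is defined as an exact ratio, not as 29.97.
ntsc = Fraction(30000, 1001)

print(ntsc)                  # 30000/1001
print(float(ntsc))           # 29.97002997002997
print(float(ntsc) == 29.97)  # False -- 29.97 is only an approximation
```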

Quantifying the Moving Picture

This history is the reason we have so many standards for frame rates and video formats. Because 29.97 fps is so close to 30, many people (and some software) will conflate the two, using the integer for any frame rate close to 30 fps.

Video playback that is slightly too slow or too fast is usually imperceptible, except when synchronizing audio. If a video is two hours long and was recorded at 30 fps, it contains 216,000 static images. If that video is played back at 29.97 fps, it will run for two hours and 7.2 seconds. By the end, the audio and video will have drifted 7.2 seconds out of sync, which would obviously be very noticeable.
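Those figures are easy to verify; here is a quick back-of-the-envelope sketch in Python, using the numbers from the example above:

```python
# Back-of-the-envelope drift calculation for the two-hour example.

RECORDED_FPS = 30
PLAYBACK_FPS = 30000 / 1001          # ~29.97 fps
DURATION_S = 2 * 60 * 60             # two hours of recording

total_frames = RECORDED_FPS * DURATION_S        # 216,000 frames
playback_s = total_frames / PLAYBACK_FPS        # 7,207.2 seconds

print(f"frames recorded:   {total_frames:,}")
print(f"playback duration: {playback_s:.1f} s")
print(f"drift by the end:  {playback_s - DURATION_S:.1f} s")   # 7.2 s
```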

Another way of looking at it is to count the number of frames in a given video length. For example, a 33.333 (repeating) second video at 30 fps will have 1,000 frames, while the same duration at 29.97 fps would have only 999 frames.

This effect also appears in the difference between 30,000/1,001 fps and 29.97 fps, although it takes a much longer video to show up. For a video that is 33,366.666 (repeating) seconds long (over 9 hours), the 30,000/1,001 fps version would contain exactly 1,000,000 frames, while the 29.97 fps version would contain only 999,999 frames.
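Working with exact fractions keeps this kind of comparison clean; here is a small sketch of the calculation, with the durations expressed as exact fractions of a second:

```python
from fractions import Fraction

# Frame counts for a given duration at the exact NTSC rate vs. the 29.97 approximation.
duration = Fraction(100100, 3)        # 33,366.666... seconds (over 9 hours)

exact_ntsc = Fraction(30000, 1001)    # true NTSC rate
approx     = Fraction(2997, 100)      # 29.97 fps

print(duration * exact_ntsc)          # 1000000 frames
print(duration * approx)              # 999999 frames -- one frame fewer

# The shorter example from earlier:
short = Fraction(100, 3)              # 33.333... seconds
print(short * 30)                     # 1000 frames
print(short * approx)                 # 999 frames
```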

Converting Frame Rates

What if your project’s raw footage was filmed at 24 fps, but the video needs to be displayed in a 30 fps television broadcast? You would need an additional 6 frames every second. As shown in the images below, more frames per second, with less time between them, make for a smoother moving image.

24 fps with a delay of 200ms.

30 fps with a delay of 160ms.

Duplicating 6 of every 24 frames in the source video would cause jerky motion in the final export. In the image below, the loop uses frames from the 24 fps video but duplicates frame numbers 0, 4, 8, 12, 16, and 20 (every fourth frame). As you can see, the ball pauses slightly at these intervals.

Here you can see the jerky motion of the ball as it duplicates six frames.
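One way to see where those duplicates come from is to map each output frame’s display time back to the most recent source frame. A minimal sketch, with frames numbered from 0 as in the example above:

```python
# Naive 24 -> 30 fps conversion: each output frame shows the most recent
# source frame available at its display time.

SOURCE_FPS = 24
TARGET_FPS = 30

# Which source frame is on screen for each of the 30 output frames in one second?
schedule = [out * SOURCE_FPS // TARGET_FPS for out in range(TARGET_FPS)]
print(schedule)     # [0, 0, 1, 2, 3, 4, 4, 5, ...]

# Source frames that end up shown twice:
duplicated = [f for i, f in enumerate(schedule) if i and f == schedule[i - 1]]
print(duplicated)   # [0, 4, 8, 12, 16, 20] -- every fourth source frame repeats
```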

The problem is, there is no perfect way to synthesize the in-between frames. For a 30 fps video, frame number 3 would be displayed at 100ms. For a 24 fps video, however, there is no frame that represents this timecode. The closest times are frame number 2 at 83ms and frame number 3 at 125ms.
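Counting frames from 0, those display times fall straight out of the frame interval, as this small sketch shows:

```python
# Display times for the first few frames at each rate (frames numbered from 0).

for fps in (30, 24):
    times_ms = [round(frame / fps * 1000) for frame in range(4)]
    print(f"{fps} fps: frames 0-3 shown at {times_ms} ms")

# 30 fps: frames 0-3 shown at [0, 33, 67, 100] ms
# 24 fps: frames 0-3 shown at [0, 42, 83, 125] ms
# No 24 fps frame lands at 100 ms; the closest are frame 2 (~83 ms) and frame 3 (125 ms).
```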

As illustrated below, the missing frames can be approximated using adjacent ones, but it’s not always exact.

Frame 2 of the 24 fps video (displayed at 83ms).

Frame 3 of the 24 fps video (125ms).

By blending the two frames together, you can approximate what should be displayed at 100ms. When compared to the actual frame 3 from the 30 fps video below, it’s still not perfectly accurate.

The approximated display of the 24 fps video at 100ms.

The true placement of frame 3 in the 30 fps video at 100ms.
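One simple way to do that blend is a weighted average, weighting each source frame by how close it sits in time to the moment being approximated. The sketch below uses a single made-up pixel value per frame to keep things readable; real frame-rate converters typically use far more sophisticated, motion-compensated interpolation:

```python
# Approximate the missing 100 ms frame by blending the two nearest 24 fps frames.

TARGET_MS  = 100.0
FRAME_2_MS = 2 / 24 * 1000     # ~83.3 ms
FRAME_3_MS = 3 / 24 * 1000     # 125 ms

# Weight each source frame by how close it is to the target time.
span     = FRAME_3_MS - FRAME_2_MS
weight_3 = (TARGET_MS - FRAME_2_MS) / span    # 0.4
weight_2 = 1.0 - weight_3                     # 0.6

print(f"frame 2 weight: {weight_2:.2f}, frame 3 weight: {weight_3:.2f}")

# Blend one (made-up) pixel value from each frame:
pixel_2, pixel_3 = 120, 180
blended = weight_2 * pixel_2 + weight_3 * pixel_3
print(f"blended pixel value: {blended:.0f}")   # 144
```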

Now that you understand the standard frame rates, how they came to be, and how seemingly small differences (29.97 fps vs. 30 fps, for example) can impact your videos, it’s time to put your new knowledge to use. Try it out for yourself and see if you can recreate some of the sample videos above.

Years of history, calculations, and trial and error have gone into the standardized frame rates we use repeatedly in our work. By learning more about the reasons why these standards exist, we can better understand why our media acts and looks the way that it does. And by understanding the facts and figures behind frame rate conversions, we can easily plan for tricky post-production situations that could stall an otherwise smooth video editing workflow.

Learn more about video frame rate and its effect on timecode in Decoding Timecode Standards in Video Production.
