Behind The Screens: TV Color

Color is a massive part of a TV's picture quality. Here's why.

Originally introduced in 1953, color television was not an immediate success. Nowadays, though, most of us couldn't imagine watching our favorite TV shows or movies without it. The means of digital color production have changed over the years, but the basic makeup of what is called "additive color" has remained the same.

As professional TV reviewers, we make it our business to know how televisions create color on-screen, and how that color relates to what we see in real life. By the end of this article, expect to have a pretty good idea of how digital color works, and why there are standards for it.

What Is Digital Color?

The color produced by TVs, monitors, tablets, and smartphones is called digital or "additive" color, distinguished from real-world or print color by the way it is produced. In the real world, our eyes use tiny, photosensitive receptors called cones to perceive color. Oddly enough, we only have cones for three colors: red, green, and blue. This is because human vision, like the other four senses, is tuned to our everyday environments, our repeated habits, and, ultimately, our survival.

Real color is a phenomenon of the universe that, like a tree falling in the forest, only exists for the observer. Color falls along a spectrum of longer and shorter wavelengths of light; our eyes are tuned to some wavelengths better than others, and there are some we can't see at all. A red apple, for instance, absorbs most of light's wavelengths and reflects the red ones back to our eyes. That is why, to human beings, the apple appears red.

Digital color is called "additive" color because colors are added together, rather than subtracted from one another as in print, to create every possible hue and intensity. When a TV displays red, it lights a single type of sub-pixel. To display yellow, it combines red and green sub-pixels, and adding blue to the mix, using all three sub-pixels, creates white. This is how almost all TVs work, save for a few rare exceptions.
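
To put that in concrete terms, here's a minimal sketch in Python (our choice for illustration, not anything a TV actually runs) that treats each sub-pixel as one channel of an RGB value and mixes them the way the paragraph above describes.

```python
# Minimal sketch of additive color: each sub-pixel contributes one channel,
# and lighting channels together produces the secondaries and white.
# Values are 8-bit (0-255), a common convention for digital color.

def mix(red=0, green=0, blue=0):
    """Combine sub-pixel intensities into a single additive RGB color."""
    return (red, green, blue)

print(mix(red=255))                        # (255, 0, 0)     -> red alone
print(mix(red=255, green=255))             # (255, 255, 0)   -> red + green = yellow
print(mix(red=255, green=255, blue=255))   # (255, 255, 255) -> all three = white
```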

Color Saturation

Have you ever wondered why high-definition television looks better than the old boxy sets of yesteryear? Sure, a lot of credit goes to expanded resolution, higher quality cameras, digital video versus film—but let's not forget about color!

Color saturation is what truly sets modern TVs apart from the old and busted. HDTVs can produce more highly saturated colors than standard-definition sets. While no display can yet match the vivid saturation of real life, they're getting closer every year.

Color saturation refers to how "colorful" a given color is. Shifts in saturation cause the same hue to appear more or less filled with color, taking it from a mild "carnation" pink to a vibrant "hot rod" red, for example. The two are identical in hue; only the amount of saturation changes.

As you can imagine, improper saturation can make for some very bizarre, flat-out wrong images on your tablet or smartphone. Errors in saturation (how colorful it is), hue (which color it is), and brightness (how luminous it is) can make for images that are unnaturally severe, or flat and lifeless in appearance.
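
To see saturation in isolation, here's a short sketch using Python's standard-library colorsys module: it holds the hue and brightness fixed and dials saturation up, moving from a pale pink toward a fully saturated red. The exact RGB values are just illustrations of the idea, not calibration targets.

```python
import colorsys

# Same hue, different saturation: hue and brightness stay fixed while
# saturation climbs from 25% to 100%.
hue = 0.0  # 0.0 corresponds to red in colorsys's HSV model
for saturation in (0.25, 0.5, 0.75, 1.0):
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, 1.0)  # full brightness
    print(f"saturation {saturation:.2f} -> RGB "
          f"({round(r * 255)}, {round(g * 255)}, {round(b * 255)})")

# saturation 0.25 -> RGB (255, 191, 191)   a pale, "carnation"-like pink
# saturation 1.00 -> RGB (255, 0, 0)       a fully saturated red
```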

To avoid this conundrum, color scientists have set color requirements for standard definition, high definition, and, coming soon, ultra-high definition displays. Within the industry, the colors a display can produce are charted, for easy reference, into what is called a color gamut.

Color Gamut

A color gamut is a visual illustration of the colors a display can produce. It's usually portrayed as a triangle, charted against a "color space." Shaped like a shark's fin, the color space represents all of the color we, as humans, can see. Thus, the color gamut maps the colors a TV is capable of displaying against the colors we are capable of seeing.

The size and shape of a TV's color gamut should adhere to an international standard. For standard definition, that standard is called Rec. 601; for high definition, Rec. 709; and for ultra-high definition, Rec. 2020. Each standard specifies an ideal color gamut that TVs of that type should be capable of reproducing.
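
As a rough illustration of how much wider the newer standard's gamut is, here's a short Python sketch comparing the published chromaticity coordinates of the Rec. 709 and Rec. 2020 red, green, and blue primaries. It simply measures the area of each gamut triangle with the shoelace formula; real gamut comparisons are done more carefully than this, so treat the percentage as a ballpark figure.

```python
# Published CIE 1931 x/y coordinates for each standard's R, G, B primaries.
REC_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # HD
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # UHD

def triangle_area(points):
    """Area of a triangle from three (x, y) chromaticity points (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

hd, uhd = triangle_area(REC_709), triangle_area(REC_2020)
print(f"Rec. 709 triangle:  {hd:.3f}")
print(f"Rec. 2020 triangle: {uhd:.3f}")
print(f"Rec. 709 spans about {hd / uhd:.0%} of the Rec. 2020 triangle")
# -> roughly 0.112 vs 0.212, or about 53%
```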

For all displays, red, green, and blue are primary colors; cyan, magenta, and yellow are secondary colors. Each of a TV's primary and secondary colors falls at a specific gamut coordinate, most often expressed as "x" and "y" positions. Determining a TV's color gamut is thus the simple task of checking the x/y coordinates of its colors against the standard it's meant to follow.
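
That check is easy to sketch in code. The example below compares a set of hypothetical measurements (made-up numbers, not real lab data) against the Rec. 709 targets for the three primaries; the same idea extends to the secondary colors.

```python
# Rec. 709's published targets for the red, green, and blue primaries.
REC_709_TARGETS = {
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
}

# Hypothetical readings from a colorimeter, for illustration only.
measured = {
    "red":   (0.645, 0.332),
    "green": (0.290, 0.610),
    "blue":  (0.152, 0.058),
}

for name, (tx, ty) in REC_709_TARGETS.items():
    mx, my = measured[name]
    error = ((mx - tx) ** 2 + (my - ty) ** 2) ** 0.5  # distance in the xy plane
    print(f"{name:5s} target ({tx:.3f}, {ty:.3f}) "
          f"measured ({mx:.3f}, {my:.3f}) off by {error:.4f}")
```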

Why Color Matters

Adhering properly to the ideal color gamut is a very big deal for TVs and computer monitors.

If a professional wants to use a computer monitor for graphic design, photo touch-ups, or computer-aided engineering, it needs to produce the right colors. Sending Vanity Fair off to the printer, only to discover your monitor was showing you the wrong colors during design, is a mistake no one wants to make.

Color is just as important for televisions. International color standards are the only way films, video games, and TV shows can manage to look the same on different displays around the world. Further, proper color saturation ensures that all of the finer details mastered in production studios are preserved during playback.

Finally, having set standards for color is one of the best ways to ensure display technology keeps evolving—scientists could hardly design bigger, fancier TVs if they had no idea how older content was going to look on them. That's why we rigorously test and grade the color integrity of every TV that hits our labs.
