OLED vs LCD Display Guide: How To Pick The Best TVs For Your Commercial Project

June 30, 2022 | Reviewed by Sam Scott

Are you confused about the latest in TV technology? What’s the difference between LCD, Mini LED and OLED TVs and displays? Their names can be misleading: technologies that sound alike may not be similar at all.

Despite an abundance of acronyms, there are really only two types of flat-panel display technology on the market today: backlit displays (LCD) and self-emissive displays (OLED).

Read on to learn the pros and cons of each, how to pick the right TVs for your commercial project, and how the other terms you’ve heard fit into the picture.

Backlit Displays (LCD, a.k.a. LED)

The main difference between LCD and OLED TVs is in how they illuminate and colour the images you see. But before we get into that, let’s clear up some confusing terminology:

LCD TVs are also called “LED TVs.”

That’s right. The term “LED display” does not indicate a type of OLED display.

LCD stands for liquid crystal display, a technology used to shape and colour images on a screen. Liquid crystals rotate polarized light to illuminate tiny red, green, and blue subpixels. A set of RGB subpixels makes up a single “pixel,” and the number of pixels determines the display’s resolution, which we’ll discuss later.
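
For a sense of scale, a common Full HD panel (1920 x 1080, used here purely as an example) works out to roughly two million pixels and six million subpixels. This quick Python sketch shows the arithmetic:

```python
# Pixel and subpixel counts for a Full HD (1920 x 1080) panel, used as an example.
width, height = 1920, 1080

pixels = width * height   # each pixel is one addressable picture element
subpixels = pixels * 3    # one red, one green, and one blue subpixel per pixel

print(f"Pixels:    {pixels:,}")     # 2,073,600
print(f"Subpixels: {subpixels:,}")  # 6,220,800
```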

Since liquid crystals can’t emit light, they need to be backlit. So, LCD displays are backlit by—you guessed it—LEDs (light emitting diodes).

Now, this wasn’t always the case. Early LCDs used fluorescent tubes to light the LCD panel from the sides before switching to LEDs. Eventually, LEDs became small enough to be moved to the back of the display, allowing for thinner bezels, more even lighting across the screen, and higher contrast ratios (the LEDs could be dimmed for deeper blacks).

This was a welcome development in picture quality, and the thin bezels made backlit (or “Direct Lit”) LCDs much better for many commercial and video-wall applications. However, backlit LCDs generate more heat and cost more to produce than their edge-lit alternatives. And with the LEDs at the back, they are not as slim as modern edge-lit displays.

Product designer Dan Hollick wrote an excellent thread on the development of LCD technology. He has granted permission to share his images illustrating these changes.

First were the edge-lit displays with large bezels and uneven lighting:

Edge-lit LCD backlight

Image by Dan Hollick via Twitter

Followed by backlit displays with improved image quality:

Direct LCD backlight

Image by Dan Hollick via Twitter

Today, both Direct Lit and Edge Lit LEDs (as you will see them called on the market) create very bright and colourful images. Direct Lit TVs hold an advantage in image quality, though Edge Lit TVs are cheaper to purchase and come in slimmer profiles.

Compared to OLEDs, both LCD styles suffer from the need to house and power an external light source inside the display. They also show less image detail in the dark, whether that means dark on-screen content or a dark viewing room. They remain prevalent in the market due to their affordability, their availability in virtually any size, and advancements such as Mini LED and QLED technology.

So What Does Mini LED Mean?

Mini LEDs are pretty much self-explanatory. LEDs are getting smaller, which means thinner-profile displays can use more LEDs to enhance dimming control, enabling rich colours and deeper blacks.

miniLED backlight

Image by Dan Hollick via Twitter

Again, we stress that OLEDs are the odd man out. Mini LED displays are an LCD subtype.

And What About QLEDs?

QLED stands for quantum-dot light emitting diode, another advancement in LED technology. Like Mini LEDs, QLEDs improve how LCDs are lit. The core technology in QLED TVs is still LCD.

“Quantum dot” technology is as neat as it sounds. QLED displays use nanocrystals to transform the wavelength of the light behind the LCD panel. This means QLEDs can be edge-lit and slim while offering lighting control similar to a backlit LCD, providing better contrast and a better image at off-centre viewing angles.

However, premium QLEDs again opt for a backlit configuration. Thus far, these backlit QLEDs provide the closest competition—in terms of image quality—to OLED displays, which we will describe next.

Self-Emissive Displays (OLED)

OLED displays are a breakthrough image technology commonly referred to as “the picture quality king.” OLED stands for organic light emitting diode, and its critical development is self-illumination. Whereas every LCD panel needs some form of backlight, each OLED pixel can essentially light itself.

This means that OLED TVs control images at the individual pixel level. And how does this improve picture quality? Consider how an LCD creates a black image: light shines through a “black” pixel, but the LCD panel doesn’t block all of the light. With an OLED display, the pixel simply turns off. No light. Just black.

Further, this organic emission is enabled by placing a thin layer of light-emitting film between two conductive layers, a cathode and an anode. So, with a layer of film essentially replacing a diffuser and light source, the entire assembly can be incredibly slim compared to even an edge-lit LCD.

OLED layers

Image by Dan Hollick via Twitter

OLED displays have several advantages over LCDs, including superior contrast, thinner profiles, and faster image refresh rates. Plus, they consume less power, and their production is more environmentally friendly than LCDs’.

However, there are tradeoffs. OLED displays are more susceptible to image “burn-in” after prolonged use, and they tend to burn out more quickly, too: the brighter they run, the shorter their lifespan.

These tradeoffs aren’t always acceptable for commercial applications, though commercial OLEDs are manufactured with these limitations in mind. For example, modern OLED signage displays protect against burn-in with self-healing technology. As you might expect, though, their price point is much higher than that of their LCD alternatives.

Resolution: An Important Consideration

You may be wondering about some other terms we haven’t discussed yet, such as 4K, HiDPI, and Retina display. These describe screen resolutions, and while the subject of resolution could warrant an entire article, there are a few things you should know when selecting (or being sold) displays for your project.

Screen resolutions describe the number of pixels available to represent an image. Some common standards are SD, HD, and 4K, and resolutions can also be defined by horizontal and vertical pixel counts.

For example, a 720 x 480 resolution represents 720 pixels horizontally and 480 vertically. You can think of these like the measurements you would use to calculate a room’s square footage: 720 x 480 = 345,600 total pixels to make up an image. 720 x 480 is a typical SD or standard definition resolution.
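
If you’d rather let the computer do the multiplication, the calculation is a one-liner (a minimal Python sketch):

```python
def total_pixels(width: int, height: int) -> int:
    """Total pixel count is width times height, just like floor area in square feet."""
    return width * height

print(f"{total_pixels(720, 480):,}")  # 345,600 pixels for a typical SD frame
```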

SD is easy to spot because it appears as that almost-square 4:3 image on older TVs instead of the widescreen, rectangular shape of modern displays. When you’re watching TV and an old show comes on with black bars on each side, it’s because the show was produced in SD.

SD was the standard until the early 2000s when it gave way to HD or high definition. HD has a broad scope and may describe any resolution greater than SD, though there are a few common HD resolutions such as 1080i or 720p.

1080i refers to a 1920 x 1080 resolution on a display that creates images by illuminating alternating rows of pixels (_interlaced_ scan). If a 1080 display lights its rows in order from top to bottom, the resolution is called 1080p (_progressive_ scan). The same naming applies to 720p, which specifies a 1280 x 720 resolution drawn with progressive scan.
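
The interlaced/progressive difference is easiest to see as data. The toy Python sketch below (plain NumPy, nothing display-specific) splits a frame into the two alternating fields an interlaced signal delivers, then weaves them back together the way a deinterlacer would:

```python
import numpy as np

# A toy "frame" of 6 rows x 8 columns, numbered so individual rows are easy to track.
frame = np.arange(6 * 8).reshape(6, 8)

# Progressive scan: every row of the frame is drawn top to bottom in a single pass.
progressive = frame

# Interlaced scan: the frame is split into two fields that arrive on alternate passes.
odd_field = frame[0::2]   # rows 0, 2, 4, ...
even_field = frame[1::2]  # rows 1, 3, 5, ...

# Weaving the two fields back together reproduces the full frame.
rebuilt = np.empty_like(frame)
rebuilt[0::2] = odd_field
rebuilt[1::2] = even_field

assert (rebuilt == frame).all()
```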

4K indicates a resolution with approximately 4,000 horizontal pixels or approximately four times the pixel count of a 1080p HDTV. 4K displays have at least 8 million active pixels, though the term is not as exact as 1080p. 4K has picked up steam as a marketable buzzword, and 8K displays have emerged as well.

You may also see the term UHD or Ultra High Definition used to describe similar resolutions to 4K. In fact, many consumer displays that would be more accurately described as UHD or 2160p are marketed as 4K. There are some technical differences between 4K and UHD, but they aren’t differences you’d be likely to notice.
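
To put rough numbers on that overlap: consumer “4K” sets are generally 3840 x 2160 (UHD), while the DCI 4K standard used in cinema is 4096 x 2160. A quick pixel-count comparison shows why the terms get used interchangeably:

```python
# Comparing common resolutions by raw pixel count.
FULL_HD = 1920 * 1080

resolutions = {
    "1080p (Full HD)": (1920, 1080),
    "UHD / 2160p":     (3840, 2160),
    "DCI 4K (cinema)": (4096, 2160),
}

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:>9,} pixels ({px / FULL_HD:.2f}x Full HD)")

# 1080p (Full HD): 2,073,600 pixels (1.00x Full HD)
# UHD / 2160p:     8,294,400 pixels (4.00x Full HD)
# DCI 4K (cinema): 8,847,360 pixels (4.27x Full HD)
```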

Screen resolutions used to be standardized and easy to understand. But with the proliferation of tablets, phones, gaming devices, and laptops, there are now too many to count.

Size Matters

So, now we can determine the number of pixels in a display from its resolution. But are those pixels compressed into a 40” TV or spread out across a 15’ video wall? The resolution alone does not tell us how good an image will look. The pixel density and the viewers’ distance from the screen do.

DPI (dots per inch) and PPI (pixels per inch) are measures of density that represent the number of pixels per inch of screen. HiDPI is a term used for displays with very high pixel density—usually at least 200 DPI. High-density displays came on our radar with iPhones. Since then, tablets, notebooks, and other HiDPI personal devices have followed.

Today, smartphones have DPIs climbing into the 500s, while an 8K TV might have a PPI of only 117. The reason for this disparity is perspective: the closer the viewer is to the screen, the higher DPI you’ll need for a crisp, seamless image. Though, the opposite is also true. You might not need to splurge on those 8K displays if the viewing area is far away. Most scoreboards and video walls have lower resolutions than a 4K TV—and much lower DPI.
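
Pixel density follows directly from a panel’s resolution and diagonal size: the diagonal pixel count (via Pythagoras) divided by the diagonal in inches. Here’s a minimal sketch using a 75-inch 8K TV and a 6.8-inch, 3088 x 1440 phone as representative (assumed) specs:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: diagonal pixel count divided by the diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(7680, 4320, 75)))   # ~117 PPI for a 75-inch 8K TV
print(round(ppi(3088, 1440, 6.8)))  # ~501 PPI for a 6.8-inch flagship phone
```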

What About Retina Displays?

“Retina display” is a proprietary term used by Apple. It refers to displays with a pixel density so high that the human eye cannot perceive individual pixels. Apple introduced the term with the iPhone 4, which had 326 DPI, and Steve Jobs described its pixels as imperceptible at a viewing distance of about 12 inches.

Since then, Retina displays have lacked a concrete definition, though they tend to have more than 300 DPI for phones and sometimes less for tablets—the justification being that people tend to hold tablets further away than their phones.
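
Apple has never published an exact formula, but a common rule of thumb is that 20/20 vision resolves detail down to about one arcminute; once a pixel subtends less than that, it blends into its neighbours. Under that assumption (ours, not Apple’s), you can estimate the distance at which any pixel density becomes “Retina-like”:

```python
import math

ARCMINUTE = math.radians(1 / 60)  # ~1 arcminute, a common 20/20 visual-acuity rule of thumb

def retina_distance_inches(ppi: float) -> float:
    """Distance beyond which a single pixel subtends less than one arcminute."""
    pixel_size_inches = 1 / ppi
    return pixel_size_inches / math.tan(ARCMINUTE)

print(round(retina_distance_inches(326), 1))  # ~10.5 inches for the iPhone 4's 326 PPI
print(round(retina_distance_inches(117), 1))  # ~29.4 inches for a ~117 PPI 8K TV
```

That second number is another way of seeing why resolution requirements drop quickly as viewing distance grows.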

Do You Need To Buy “Commercial” Displays?

The primary difference between commercial and consumer TVs is that commercial displays are designed to withstand a business environment.

Consider a “day in the life” of a commercial display. It is constantly on, perhaps 24 hours a day. If used for signage, it might show the same images over and over again, making it susceptible to burn-in. Depending on its location, it may be at risk of being bumped around by staff or patrons, and aesthetically it might require a perfectly rectangular frame to match up with adjacent screens or windows.

Consumer TVs are not built with these issues in mind. Their chassis are not as sturdy, and their components do not last as long as commercial displays’. Further, consumer TV designs are updated frequently, so you might have a hard time finding a replacement that matches the building’s other displays when the need arises.

Lastly, consumer displays may lack external control functionality. If your facility has multiple displays, you shouldn’t need someone to walk around with a TV remote to turn them on every day. Commercial displays come with standard control ports, so they can be tied into several types of control systems, enabling touchpanel control, scheduled operation, and more. An AV consultant can ensure you get the models you need for seamless integration.
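
To make “external control” concrete, here’s a rough sketch of what centralized or scheduled control can look like. It assumes a display that accepts commands over a simple TCP control port; the IP address, port, and command bytes are placeholders, since every manufacturer documents its own control protocol:

```python
import socket

DISPLAY_IP = "192.168.1.50"       # placeholder address for a networked commercial display
CONTROL_PORT = 4352               # placeholder port; the real value depends on the manufacturer
POWER_ON_COMMAND = b"POWER ON\r"  # placeholder command bytes from a hypothetical protocol manual

def send_command(ip: str, port: int, command: bytes) -> bytes:
    """Open a TCP connection, send one control command, and return the display's reply."""
    with socket.create_connection((ip, port), timeout=5) as conn:
        conn.sendall(command)
        return conn.recv(1024)

if __name__ == "__main__":
    print(send_command(DISPLAY_IP, CONTROL_PORT, POWER_ON_COMMAND))
```

A control processor or scheduler would issue commands like this to every display in the building at opening and closing time, with no remote controls involved.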

When buying TVs for your commercial project, it’s best to understand your needs clearly and avoid getting distracted by buzzwords and tech hype. The battle between LCD and OLED technology is ongoing as manufacturers work to improve the image quality of LCDs and make OLEDs more durable and affordable.

Either display type may be the right choice for your project, and luckily we live in a time when there is a display available for almost every application.

We hope this guide helps you make informed decisions and demystifies the growing TV vernacular. You can contact Chroma today for an expert’s opinion on the right displays for your project.

 

FAQs


What’s the difference between an OLED and an LCD display?

LCD TVs are backlit displays that use liquid crystals to shape and colour images on a screen, while OLED displays are self-emissive and control the image at the individual pixel level.

Is an LED TV a type of OLED display?

No, an LED TV is not a type of OLED display. LED TVs are actually LCD displays that use light-emitting diodes (LEDs) as a backlight source.

What is a QLED display?

QLED stands for quantum-dot light emitting diode, another advancement in LED technology. QLED displays use nanocrystals to transform the light’s wavelength behind the LCD panel.

What does display resolution mean?

Display resolution refers to the number of pixels that a display can show, measured in width and height. The higher the resolution, the more pixels are displayed, resulting in a clearer and more detailed image.

What is a Retina display?

“Retina” or “Super Retina” displays are proprietary terms used by Apple. They are displays that have a pixel density so high it makes individual pixels indistinguishable to the human eye from a normal viewing distance.

Do I need to buy commercially-rated displays for my facility?

It depends on your needs. Commercially-rated displays are designed to withstand more rigorous use than consumer-grade displays, making them ideal for high-traffic areas or 24/7 operation. Consumer displays are not typically warrantied for commercial use.