TV and computer screen jargon explained

HDR, UHD, 4K, OLED – what does it all mean?

Last updated: 29 January 2019

How do you know what to choose when shopping for a new TV or computer monitor? We've put together this brief guide on what you need to know, so you can screen-shop with confidence.

The industry jargon you face can be bewildering. It can seem like you have to wade through an alphabet soup of acronyms and terms such as HDR, UHD, 4K, OLED, resolution, aspect ratio, pixels and panel technology. And storefront sales staff might not know any more than you do.

Screen resolution

Resolution is the total number of pixels (dots) on a screen, usually expressed as horizontal (width) times vertical (height), or W x H. For example, 1920 x 1080 means the screen is 1920 pixels wide and 1080 pixels high.
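If you're curious, multiplying width by height gives the total pixel count, which is a handy way to compare resolutions. Here's a quick Python sketch (the resolutions are common standards, but the code itself is just our illustration):

```python
# Total pixels for some common screen resolutions (width x height)
for width, height in [(1280, 720), (1920, 1080), (3840, 2160)]:
    print(f"{width} x {height} = {width * height:,} pixels")

# 1280 x 720 = 921,600 pixels
# 1920 x 1080 = 2,073,600 pixels
# 3840 x 2160 = 8,294,400 pixels
```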

TV screen resolution

A TV's resolution affects the quality of video you can watch, though it's just one of many factors. Most modern TVs will be at least 720p and more likely 1080p (that's 720 or 1080 pixels high). Some go even higher, such as 4K UHD (2160 pixels high).

Computer monitor screen resolution

For a computer monitor it's different. Many programs, such as web browsers and word processors, are designed so that parts of their user interface (UI) are a certain number of pixels wide or deep.

Dropdown and pop-up menus, toolboxes and alerts, for example, are designed to be a particular size in pixels.

The higher the resolution of your screen, the more pixels it has and the smaller they look. On a very high resolution screen, these items can look too small, unless you can tell the program to make them larger. Some modern programs (and your Windows or macOS operating system) allow you to increase just the size of text, icons and menus to your liking, while keeping the rest of the screen at high resolution.

Lower resolutions (fewer pixels) make everything look bigger, no matter what. If you view a computer screen at low resolution, you might end up with a menu bar at the top of a web page or other program that takes up a third or more of your screen.

Screen resolution names and numbers

A single type of screen resolution can have more than one label. So 1920 x 1080 can also be called 1080p or full HD, for example.

Here are some of the more common resolutions for computer monitors and TV screens, along with the names they're known by:

  • 1280 x 720: 720p, HD
  • 1920 x 1080: 1080p, full HD (FHD)
  • 2560 x 1440: 1440p, quad HD (QHD)
  • 3840 x 2160: 2160p, 4K, ultra HD (UHD)
  • 7680 x 4320: 4320p, 8K

Pixel density

Pixel density compares a screen's resolution to its physical size, giving the number of pixels per inch (ppi), usually measured along the screen's diagonal.

The closer the screen is to your eyes when you use it, the higher you want the pixel density to be. A 1080p TV screen that measures 55 inches diagonally (big, but not huge) has a pixel density of around 40ppi. But a smartphone with the same resolution and a 5.5-inch screen has around 400ppi.

Both are acceptable for how you use them, but you wouldn't want a smartphone with 40 pixels per inch, and a TV with 400ppi is unheard of.
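If you want to check these figures yourself, pixel density is calculated by dividing the diagonal resolution (in pixels) by the diagonal screen size (in inches). Here's a short Python sketch of that standard formula (the function name is just our own):

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    """Diagonal resolution in pixels divided by diagonal size in inches."""
    diagonal_px = math.hypot(width_px, height_px)  # sqrt(w^2 + h^2)
    return diagonal_px / diagonal_inches

print(round(pixels_per_inch(1920, 1080, 55)))   # 55-inch 1080p TV: 40ppi
print(round(pixels_per_inch(1920, 1080, 5.5)))  # 5.5-inch 1080p phone: 401ppi
```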

It all comes down to viewing distance. You may have heard of Retina Display, a marketing term registered by Apple for a screen with a pixel density so high (approximately 300ppi or more) that the human eye can't make out individual pixels at a normal viewing distance. Many modern screens (particularly smartphones) have the pixel density of a Retina Display, or higher, but without the catchy name.

Aspect ratio

Aspect ratio is how square or rectangular the screen is. The most common is 16:9, which means for every 16 pixels along the width, there are nine along the height. This is commonly used for widescreen TVs. All the display resolutions mentioned above are 16:9.

Older, squarer TV screens and computer monitors were usually a 4:3 ratio.
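If you know a screen's resolution, you can work out its aspect ratio by reducing the width and height to their simplest ratio. Here's a small Python sketch (again, just our illustration):

```python
from math import gcd

def aspect_ratio(width_px, height_px):
    """Reduce a resolution to its simplest width:height ratio."""
    divisor = gcd(width_px, height_px)
    return f"{width_px // divisor}:{height_px // divisor}"

print(aspect_ratio(1920, 1080))  # 16:9 (widescreen)
print(aspect_ratio(1024, 768))   # 4:3 (older screens)
```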

Hertz (Hz)

For screens, hertz (Hz) denotes how many times the screen is refreshed per second. While 60Hz is commonly acceptable for TVs and computer screens, some utilise 120Hz to reduce motion blur for fast-moving scenes. On a computer monitor you're unlikely to need more than 60Hz unless you're an avid gamer.
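Put another way, a 60Hz screen draws a new image every 60th of a second, and a 120Hz screen every 120th. A quick Python sketch makes the timing concrete (our illustration only):

```python
# Time between screen refreshes at common refresh rates
for hz in (60, 120):
    print(f"{hz}Hz: a new image every {1000 / hz:.1f} milliseconds")

# 60Hz: a new image every 16.7 milliseconds
# 120Hz: a new image every 8.3 milliseconds
```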

Display technologies

The technology your screen uses to create an image makes a big difference, but it can be hard to tell the technologies apart by name, or to see the difference between them in a brightly lit retail store.

Plasma

Plasma screens were among the first to popularise large-screen TVs and may still be found in people's homes, but they're no longer sold, having been pushed out of the market by lower-cost LCD screens.

LCD

Liquid crystal displays (LCDs) are the most common for both TVs and computer monitors. However, they're rarely advertised as just LCD anymore and currently come in two main variants:

  • LED: LCDs need a lighting panel behind or beside the screen. This used to be done with fluorescent lighting, but LED (light emitting diode) lighting panels eventually took over, which is why most LCD screens are now sold as LED.
  • QLED: exclusive to Samsung, quantum light emitting diode (QLED) screens also have an LCD panel with LED lighting, but add "quantum dot" panel technology to the mix to produce better colours. These high-quality screens compete directly with OLED displays.

OLED

Organic light emitting diode (OLED) screens use a different technology to LCD. They're usually more expensive and can provide superior contrast and viewing angles, as well as being slimmer. While OLED isn't quite as bright as QLED or LED, it's particularly good at providing deep, inky black tones for higher contrast.

HDR

High dynamic range (HDR) is increasingly built into new screens and provides vivid, lifelike images. It's not a display technology in the sense of LCD or OLED, but significantly increases the brightness and colour range of a video or image. However, whatever you're watching also needs to have been shot and recorded in HDR for the screen to benefit.

We care about accuracy. See something that's not quite right in this article? Let us know or read more about fact-checking at CHOICE.

Stock images: Getty, unless otherwise stated.