Flat-panel displays, whether LED, OLED, or plasma, have a grid of physical pixels. When an image has at least as many pixels' worth of data as the screen can physically display, you're getting the maximum sharpness and clarity possible on that screen. This is different from older CRT (Cathode Ray Tube) displays, which use a charged beam to draw an image on the phosphorescent layer on the back of the screen. Images on a CRT look good at any resolution because the beam can simply draw the number of pixels it needs, up to a certain limit at least.

On flat-panel screens, when the number of pixels in your content (for example, a game, photograph, or video) is less than the native resolution of the display, the image has to be "scaled." A 4K display has four times the number of pixels compared to a Full HD display, so to scale a Full HD image to 4K you can use four 4K pixels to represent one Full HD pixel.

When we have a perfectly divisible scaling factor, the groups of pixels that represent one low-res pixel all have the same color and brightness value as the original. The trouble starts when lower-resolution images don't divide so neatly into the native pixel grid of the display.

With an imperfect scaling factor, some pixels have to represent the color and brightness values of different original pixels. Sadly, this generally makes for an ugly image. This is where we move into the realm of estimating pixel values. There are various approaches to solving this, such as averaging the values of the split pixels. This works quite well, and in general the picture will still look good, just not as sharp as a native image.

Native Resolution and Gaming Performance

Conventional wisdom has been to only use native resolution content or, at the very least, content that scales perfectly to the higher resolution. For gaming, this effectively means that the game has to render at the native resolution of the display for the best image quality. Unfortunately, this puts a heavier load on the GPU (Graphics Processing Unit), which takes longer to draw each frame of the game since there are more pixels. If the additional burden is too much, the GPU may not be able to draw frames fast enough to make the game smooth and playable.

If we can't lower the resolution, then the only way to reduce the load on the GPU and increase the frame rate is to dial back other visual features. For example, you might lower the quality of lighting, texture detail, draw distance, etc. In effect, you are forced to trade visual quality for visual clarity. If you choose to simply accept the lower frame rate instead, you're trading motion clarity and responsiveness for better quality in each frame. There's no correct answer here, since different games and different players assign different values to these, but there's a tradeoff no matter what.

What about falling back on perfect scaling? While this does deal with the worst scaling artifacts, it presents the opposite problem. For example, the next lower resolution that scales perfectly to 3840×2160 (UHD 4K) is 1920×1080 (Full HD). As we mentioned before, that's four times fewer pixels. On modern hardware, there's a good chance you won't be using all of your GPU power at this resolution. You can use that extra headroom to increase other visual quality settings, or you can take advantage of faster frame rates, especially if you have a display capable of displaying them thanks to fast refresh rate support.

Neither of these solutions is optimal, and there's a vast gulf between these two points where you could strike the perfect balance of resolution, frame rate, and rendering quality, if only an arbitrary in-between resolution would look good. For most of flat-panel history, it didn't.
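The perfect-scaling case described above, where four 4K pixels stand in for one Full HD pixel, can be sketched in a few lines of Python. This is a toy illustration, assuming a grayscale image stored as a list of rows of brightness values; the function name `integer_scale` is ours, not from any library:

```python
# Toy sketch of perfect integer scaling: each low-res pixel becomes an
# exact NxN block of identical pixels, so no color values are estimated.
def integer_scale(image, factor):
    """Scale a 2D grid of pixel values by an integer factor
    by replicating each pixel into a factor x factor block."""
    scaled = []
    for row in image:
        # Repeat each pixel 'factor' times across the row...
        wide_row = [pixel for pixel in row for _ in range(factor)]
        # ...then repeat the widened row 'factor' times down the image.
        scaled.extend([list(wide_row) for _ in range(factor)])
    return scaled

# A tiny 2x2 stand-in for a Full HD image, scaled 2x: every original
# pixel is represented by four identical pixels, as with 1920x1080
# content shown on a 3840x2160 panel.
small = [[10, 20],
         [30, 40]]
big = integer_scale(small, 2)
# big is:
# [[10, 10, 20, 20],
#  [10, 10, 20, 20],
#  [30, 30, 40, 40],
#  [30, 30, 40, 40]]
```

With an integer factor, every output pixel is an exact copy of some source pixel, which is why perfectly divisible scaling introduces no estimation and no blur.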
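The averaging approach for imperfect scaling factors can also be sketched. The following is a simplified one-dimensional illustration, assuming area-weighted averaging over a single row of grayscale values; real scalers work in two dimensions and typically use more sophisticated filters:

```python
# Toy sketch of imperfect scaling on a single row of pixels: each
# output pixel covers a fractional span of input pixels, and its value
# is the average of the inputs it overlaps, weighted by coverage.
def scale_row(row, out_width):
    in_width = len(row)
    result = []
    for i in range(out_width):
        # Span of source pixels (in source coordinates) that this
        # output pixel covers.
        start = i * in_width / out_width
        end = (i + 1) * in_width / out_width
        total, weight = 0.0, 0.0
        for j in range(int(start), min(in_width, int(end) + 1)):
            # Overlap between source pixel j and the output span.
            overlap = min(end, j + 1) - max(start, j)
            if overlap > 0:
                total += row[j] * overlap
                weight += overlap
        result.append(total / weight)
    return result

# Scaling a 2-pixel row to 3 pixels: the middle output pixel straddles
# both sources, so its value is an estimate blending the two.
blended = scale_row([0, 100], 3)  # approximately [0.0, 50.0, 100.0]
```

The middle pixel's value of roughly 50 exists in neither source pixel; this blending is exactly why a non-integer-scaled image looks softer than a native one.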