“Will the PS4 Pro deliver native 4K?” “GTX 1080 can run games at 4K/60 FPS!” “Damn, I want HDR-enabled gaming.” “Consoles can probably do upscaled 4K at best right now.” – These are the kinds of statements you might be hearing every now and then these days. With the advancements in gaming and display technology, gamers want better graphics and detail, and rightly so. So, for those who are still not familiar with these phrases, or who want to know more because they’re thinking about investing in this tech, here’s our little guide.
What is 4K?
Basically, 4K describes the resolution of your screen. By definition, it refers to a horizontal resolution of roughly 4,000 pixels, a departure from the convention of describing resolutions by their vertical pixel count, like 480p or 1080p. By that older convention, 4K would be called 2160p.
Now, 4K has two standards:
- The one used in film production and theaters, which is 4096 x 2160, making it “true” 4K
- The standard for TV screens and monitors, which is a little lower (3840 x 2160). For this reason, some brands prefer not to use the 4K label, referring to it instead as Ultra HD, or UHD for short.
4K resolutions deliver four times the detail of Full HD, or 1080p (about 8.3 million pixels to FHD’s 2.1 million). The result? Amazing detail and finer textures that can improve your gaming experience to a great extent.
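The “four times” figure is just pixel arithmetic. A quick sketch, using the standard Full HD and UHD resolutions mentioned above:

```python
# Pixel counts behind the "four times the detail" claim,
# using the TV/monitor standards: Full HD (1920 x 1080)
# and UHD "4K" (3840 x 2160).
fhd = 1920 * 1080   # 2,073,600 pixels (~2.1 million)
uhd = 3840 * 2160   # 8,294,400 pixels (~8.3 million)

print(uhd / fhd)    # 4.0 -- exactly four times as many pixels
```

Doubling both the width and the height is what quadruples the total pixel count.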
Alright, got it. 4K = better graphics. But what about native and upscaled?
Here’s the interesting part. You have a 4K screen, but can your device run 4K natively, or will it be upscaled? And if upscaled, is it really worth it?
Simply put, upscaling is a process that converts a lower-resolution image into a higher-resolution one. It has been around ever since we started playing DVDs on Full HD screens. It works like interpolation, filling in the blanks based on what the surrounding pixels display. And while it may sound cool, like getting 4K quality out of a 1080p source, it’s actually not. There is no increase in quality when an image is upscaled, because the amount of information in the input signal stays the same. It may or may not add a bit of smoothness to the textures, but the overall result is somewhat subjective: the upscaled 1080p image will look more or less the way it would on a 1080p screen. You’re basically forcing an image to duplicate its pixels to stretch to a higher resolution, so you can’t expect it to be perfect.
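The “duplicating pixels” idea above can be sketched in a few lines. This is a minimal, illustrative nearest-neighbour upscaler (real TVs and consoles use fancier interpolation, but the information-theoretic point is the same): the function name and the tiny 2x2 “image” are made up for the example.

```python
def upscale_nearest(image, factor):
    """Upscale a 2D grayscale image by an integer factor using
    nearest-neighbour interpolation: each source pixel is simply
    duplicated into a factor x factor block, so no new detail is
    created -- only the same information spread over more pixels."""
    out = []
    for row in image:
        # Stretch the row horizontally by repeating each pixel.
        stretched = [px for px in row for _ in range(factor)]
        # Stretch vertically by repeating the whole row.
        out.extend([stretched[:] for _ in range(factor)])
    return out

# A 2x2 "image" stretched to 4x4: four pixels become sixteen,
# but there are still only four distinct values of information.
small = [[10, 20],
         [30, 40]]
big = upscale_nearest(small, 2)
# big == [[10, 10, 20, 20],
#         [10, 10, 20, 20],
#         [30, 30, 40, 40],
#         [30, 30, 40, 40]]
```

Smarter algorithms blend neighbouring values instead of copying them, which is where the slight smoothing mentioned above comes from, but they can’t invent detail that was never in the signal.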
Gee, thanks man! That clears it up. One more thing, what’s this HDR people keep talking about?
4K has gained immense popularity in recent times, and it won’t be long before the next big thing in display tech, HDR, takes off. HDR, or High Dynamic Range, is a term photographers will be familiar with, but why should TV screens and monitors care about it? Well, HDR refers to the capability of a screen to display a much richer and wider range of colours, making your whites whiter and your blacks blacker than you can imagine. Overall, it gives the image depth and a “dynamic” look.
Conventional screens are not able to preserve detail in the most extreme colour areas (the darkest shadows and the brightest highlights), but HDR screens handle them successfully. HDR also produces much more natural colours, closer to how we really see them in real life. Colour and contrast, then, become the deciding factors when going for an HDR screen.
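One way to put a rough number on that “wider range of colours”: many HDR panels use 10 bits per colour channel where standard screens use 8. The article doesn’t specify bit depths, so treat this pairing as an illustrative assumption rather than a rule:

```python
# Shades per colour channel at a given bit depth (illustrative:
# assuming 8-bit for standard screens, 10-bit for HDR panels).
sdr_shades = 2 ** 8    # 256 shades per channel
hdr_shades = 2 ** 10   # 1,024 shades per channel

# Total colours = shades^3 (one value each for red, green, blue).
sdr_colors = sdr_shades ** 3   # ~16.8 million colours
hdr_colors = hdr_shades ** 3   # ~1.07 billion colours
```

More shades per channel means smoother gradients between the darkest and brightest areas, which is exactly where standard screens tend to fall apart.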
Nowadays, some brands sell TVs and monitors that are both 4K and HDR-enabled, delivering a “premium 4K experience”. These carry the Ultra HD Premium logo, which differentiates them from regular 4K TVs and monitors, which simply carry a 4K/Ultra HD logo.
So there you go. All the basics you need to know about 4K and HDR. Choose well and keep gaming!
For more news and reviews, keep checking back at Gaming Central.