Having a laptop with a 4K UHD (3840 x 2160) screen has made us realise that you really do need to have a 4K screen to work with 4K video.
We’ve been sent a Dell M3800 laptop for a long-term test. We wanted to see what it’s like living and working with a workstation-class, super-thin laptop with a 4K UHD screen (3840 x 2160). In some ways this computer is Dell’s response to the MacBook Pro with Retina display, but it has an even better screen.
We will be writing about our experience with the Dell over the next six months, but first I just wanted to talk about my first impressions of using a 4K screen on a laptop.
Most people, including me, would be sceptical about the need for such a high-resolution screen on a 15” laptop. Apple’s Retina 15” screens are pretty amazing, and you certainly can’t see the individual pixels on them. So why would you want even more pixels?
Well, the difference between Retina and 4K is subtle, but it is definitely noticeable. The 4K screen is, obviously, extremely sharp. But it does go slightly beyond that, particularly with text. And I think the reason for this is that when text is displayed on a computer screen, most of the time it is not a bitmap with a fixed resolution. Instead, it is rendered from a mathematical description of the individual characters, at the precise resolution of the screen. So it will use all the resolution available. If you had an 8K screen, it would render out at 8K.
And it shows. The hard part of rendering text is not the verticals and horizontals, but the diagonals and curves. Pixels on a screen are square, and diagonals always look jagged if you look closely enough. Having more pixels simply means that this blockiness is minimised. It’s already very good on a Retina screen, and even better on the Dell’s.
But for me, working in the video production business, the biggest reason for having a 4K screen is that you can watch 4K video at its native resolution. You sit much closer to a laptop screen than you do to your television, so you see more detail. 4K video looks glorious on the M3800’s screen. I’ve already noticed something on this screen that I didn’t see on a Retina screen: motion blur.
Motion blur, for me, is one of the biggest issues with very high resolutions like 4K and (especially) 8K. You only need a tiny amount of motion blur to relegate 4K back to HD or even less. If you don’t watch your 4K material on a 4K screen, you can’t make accurate judgements about motion blur.
To be honest, there isn’t much you can do about it in post: if it’s there, it’s there. But checking your shots on set or on location lets you see whether you need to do them again. The same applies to focus and overall sharpness.
So, much of this is stating the obvious, but the point is that it is genuinely worthwhile having a 4K screen on a 15” laptop, even though it’s easy to dismiss it as overkill.
Finally, if you are shooting in 4K DCI (4096 x 2160), then none of the above really applies, because you will ideally need to view your material on a screen with a genuine 4096 x 2160 resolution. Downscaling to fit a lower-resolution screen will involve a loss of resolution, especially when trying to map 4096 horizontal pixels onto 3840, because there is no way to map each pixel in the image exactly to a corresponding pixel on the screen. The reduced image will inevitably be softer.
A better way might be to crop each end of the 4096-wide image by 128 pixels (4096 minus 3840 is 256, split between the two sides), which would leave the one-to-one pixel mapping in place. The problem wouldn’t arise with a 4096 x 2160 screen, because a 3840-wide image would show correctly with just a small (128-pixel) black bar at each side.
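To make the arithmetic concrete, here is a minimal sketch (not from the article, and purely illustrative) of that crop, assuming a DCI 4K frame held as a NumPy array: trim 128 pixels from each side so the remaining 3840 x 2160 maps one-to-one onto a UHD screen, with no rescaling and therefore no softening.

```python
import numpy as np

def crop_dci_to_uhd(frame: np.ndarray) -> np.ndarray:
    """Crop a 4096 x 2160 DCI 4K frame to 3840 x 2160 UHD, pixel for pixel."""
    height, width = frame.shape[:2]
    assert (width, height) == (4096, 2160), "expected a DCI 4K frame"
    trim = (width - 3840) // 2          # (4096 - 3840) / 2 = 128 pixels per side
    return frame[:, trim:width - trim]  # 3840 x 2160, no rescaling

# Example with a synthetic three-channel frame:
dci_frame = np.zeros((2160, 4096, 3), dtype=np.uint8)
uhd_frame = crop_dci_to_uhd(dci_frame)
print(uhd_frame.shape)  # (2160, 3840, 3)
```

The trade-off, of course, is that you lose the outer 128 pixels on each side of the image rather than the fine detail across the whole frame.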
If you’re doing critical work with 4K, you absolutely have to have a 4K screen. You wouldn’t dream of editing or post-producing HD on an SD screen. So it’s very good indeed to see computers like the M3800 making this possible.