- cross-posted to:
- technology@lemmy.world
A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)
I think people, and this paper, miss a few elements.
4K encoded content often has a significantly higher bitrate (well, duh, there’s more content), often higher than the simple increase in pixel density would suggest. So content with heavy movement (flocks of birds, water, crowds, etc.) still looks better than 1080p, not because of the increase in pixel density, but because of the decrease in compression artefacts.
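If you want to eyeball that for your own sources, a quick back-of-the-envelope number is bits per pixel per frame, which is roughly what the encoder has to spend on all that movement. A minimal Python sketch; the bitrates below are made-up placeholders, not measured values, so plug in whatever your player or disc actually reports:

```python
# Rough bits-per-pixel-per-frame comparison. The bitrates here are
# illustrative placeholders, NOT measured values -- substitute whatever
# your stream or disc actually reports.

def bits_per_pixel(bitrate_mbps, width, height, fps=24):
    """Average bits available per pixel per frame."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

examples = {
    "1080p stream @ 5 Mbps": bits_per_pixel(5, 1920, 1080),
    "4K stream @ 16 Mbps":   bits_per_pixel(16, 3840, 2160),
    "1080p disc @ 30 Mbps":  bits_per_pixel(30, 1920, 1080),
    "4K disc @ 80 Mbps":     bits_per_pixel(80, 3840, 2160),
}
for label, bpp in examples.items():
    print(f"{label}: {bpp:.3f} bits/pixel/frame")
```

Whichever way that comparison lands for your service tells you whether the 4K tier is winning on compression headroom rather than on pixels.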
Second, high dynamic range, yo! On a still picture on my TV it’s hard to see the difference between 1080p and 4K, but it isn’t hard to see the difference between SDR and HDR.
So I still vastly prefer 4K content, but not because of the resolution.
I always keep one 24" 1080p monitor at my desk, alongside the larger and wider gaming monitor, because that size and resolution is (to me) perfect for text (and side viewing of old films that I don’t want to over-stretch).
I got my first 4K, widescreen monitor recently, and it’s a hugely noticeable difference from 1080p, but depending on what I’m doing, it’s often not an improvement.
@t3rmit3 @artifex Agreed - I recently went from a pair of 24" 1080p monitors to a 32" 4K and a 24" 1080p for precisely this reason. I had to upgrade my graphics (Intel UHD 730 to a GT 1030; I’m not a gamer, and had to find something that would fit in this case and didn’t need additional power) to get DP, but now I have so much screen real estate, it’s unreal!
Still got a 1080p 55" TV on the wall downstairs, though I’ve been tempted a few times to update it, but it’s working fine, so it’s still chugging along. TV is definitely different to a monitor because of what it’s used for and viewing distance, though.
24" with 1080p sucks. The pixels are clearly visible at a normal viewing distance on a desk.
The actual paper presents the findings differently. To quote:
Our results clearly indicate that the resolution limit of the eye is higher than broadly assumed in the industry
They go on to use the iPhone 15 (461ppi) as an example, saying that at 35cm (1.15 feet) it has an effective “pixels per degree” of 65, compared to “individual values as high as 120 ppd” in their human perception measurements. You’d need the equivalent of an iPhone 15 at 850ppi to hit that, which would be a tiny bit over 2160p/UHD.
Honestly, that seems reasonable to me. It matches my intuition and experience that for smartphones, 8K would be overkill, and 4K is a marginal but noticeable upgrade from 1440p.
If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish
Three paragraphs in and they’ve moved the goalposts from HD (1080p) to 1440p. :/ Anyway, I agree that 2.5 meters is generally too far from a 44" 4K TV. At that distance you should think about stepping up a size or two. Especially if you’re a gamer. You don’t want to deal with tiny UI text.
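The geometry behind those numbers is easy to check, for what it’s worth. A rough sketch, assuming a flat 16:9 panel and measuring at the centre of the screen (the paper’s actual methodology is more involved than this):

```python
import math

def ppd(diagonal_in, horiz_px, distance_m, aspect=(16, 9)):
    """Approximate pixels per degree at the centre of a flat panel."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)   # physical width (inches)
    ppi = horiz_px / width_in                          # horizontal pixel density
    distance_in = distance_m / 0.0254                  # metres -> inches
    return ppi * distance_in * math.tan(math.radians(1))

# The article's example: a 44" set viewed from 2.5 m
for name, px in [("1080p", 1920), ("1440p", 2560), ("4K", 3840)]:
    print(f'{name} on a 44" panel at 2.5 m: ~{ppd(44, px, 2.5):.0f} ppd')
```

That puts 1440p on a 44" set at 2.5 m around 115 ppd, above the classic 60 ppd (one pixel per arcminute) rule of thumb but still a bit under the “individual values as high as 120 ppd” mentioned earlier in the thread, while 4K lands around 172 ppd and overshoots even the sharpest-eyed observers.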
It’s also worth noting that for film, contrast is typically not that high, so the difference between resolutions will be less noticeable — if you are comparing videos with similar bitrates. If we’re talking about Netflix or YouTube or whatever, they compress the hell out of their streams, so you will definitely notice the difference, if only by virtue of the different bitrates. You’d be much harder-pressed to spot the difference between a 1080p Blu-ray and a 4K Blu-ray, because 1080p Blu-rays already use a sufficiently high bitrate.
A link to the study, because I don’t think I see one in this very clickbait-feeling source.
Depends on the source material, the distance you view from, and the quality of the panel. 4K is just more resolution than HD. This doesn’t really need a study.
It sounds like the study actually did include display distance, and gave different requirements depending.
My point is, I could do a study too and claim that 4K/8K TVs are much better than HD to your eyes. It’s just the setup and source that make the difference.
Now that I’ve actually looked at the study, what they did was build an apparatus with a continuously adjustable distance to the display and have people try to distinguish fairly similar clips until they couldn’t anymore.
Actual maximum pixels-per-visual-degree values varied quite a bit depending on the colours involved and the like. And like @GenderNeutralBro@lemmy.sdf.org said, they framed the results the opposite way to the article: human vision can distinguish more than previously thought.
Seconded. It really depends, and it varies a lot with the person and the source.
I notice it quite a bit with my OLED TV, sitting about 7 feet away from it. My wife doesn’t care and would say it doesn’t make a difference, at least until we watched something that was truly done well, like Dune.
There are too many variables here to say “doesn’t make a difference”. We can safely say “diminishing returns”, which is universally true, but not that there is no change.
Came to say it depends on how far away you sit.
1080p to 4K was a big improvement in my opinion. I still have a mix of 1080p and 4K equipment. 4K to 8K is really minor to me, because 4K TVs now come with varying degrees of HDR: better brightness range per zone/pixel and a wider color gamut. I’ll go 8K someday, because someday the only TVs worth buying at the price will be 8K, but in terms of picture quality versus 4K with quality HDR (brightness, contrast, color gamut), it’s a minor difference. You just need high-quality sources. When I encode something, I use fairly high-bitrate AV1. Another 5-7 years and I expect to be encoding everything new in AV2.
Maybe a TV across the room, but for monitors, or content that is close enough, it sure does make a difference! I’ve even started to find 1080p a bit “blurry” feeling, like 720p before it when 1080p came out. Especially when it’s YouTube’s shitty bitrate vs a game rendering in 4k. Night and day.
For me it’s mostly for text and content with fine lines where the difference is the most obvious. I use a 1440p monitor at work and a 4k one at home. Text is noticeably sharper and easier to read on my 4k monitor and remains readable at smaller font sizes.
I went to 4K monitors many years ago and agree there is a big difference, that said there are a lot of issues with HiDPI monitors on desktop OSs, plus it requires a strong GPU for gaming. Since then I’ve gone to 1440p and think it is an ideal middle ground for performance while looking almost just as sharp.
It really depends. I managed to play Cyberpunk 2077 at 4K on my old 1080 Ti at 30-50 fps, and thanks to having a nice monitor and knowing how to tune graphics settings, my roomie with an RTX 2080 was complaining that mine looked better. lol
Anything that’s not pushing crazy levels of fidelity can be tuned to run totally fine at 4K on modern graphics cards. Every generation since the 1080 (that has more than 8 GB of VRAM, anyway…) has been more than capable of 4K gaming with some settings tweaks.
I have a 55 inch plasma TV and that thing is massive. When I bought it, it was larger than I wanted to get. I don’t get why people are shoving these massive screens in their living room to watch the nightly news.
If you aren’t sitting pretty close to a giant screen, there isn’t a discernible difference. People are just buying into marketing because there has to be something new to make you buy all your stuff over again.
Plasma TVs are beasts. They also get hot and use a lot of power. They look great, though.
I have a 77" OLED and while it’s large, it’s also very flat and sits flush to the wall. Really doesn’t take up much space at all. The things that take up more space than the actual screen are the 12 speakers you need for the surround sound system that goes with it.
Depends on the physical size of the screen, which is why only wealthy people with 90-inch-plus screens have ever really cared about 8K. 4K is a noticeable improvement on a 60- or 70-inch screen, but the extra cost of content isn’t anywhere near worth it for most people.
I like 4K; I personally think it’s better than 1080p, even without data to back that up. Having said that, I’m all for 8K TVs being produced if that makes 4K TVs cheaper, so I can buy a bigger 4K TV at a lower price.
The biggest improvement for me, even viewing at a distance, is that I can disable anti-aliasing. If it’s rendered at a high enough resolution, jaggies aren’t an issue.