120Hz is absolutely a noticeable improvement over 60Hz. I have a 60Hz iPhone and a 120Hz iPhone and the 60Hz one is just annoying to use. Everything feels so choppy.
I can't tell at all when my mbp is in 120hz or 60hz. I tried to set up a good test too by scrolling really fast while plugging and unplugging the power adapter (which kicks it into high power 120hz or low power 60hz).
One of those things that some people notice and some people don't. I'm definitely in the camp that feels the difference between 120hz and 60hz, but I don't find 60hz choppy, and beyond 120hz I can't notice any difference at all, though others seemingly can. Maybe it's our biology?
I would bet most people would fail a blind test.
Basically everyone who has played videogames on pc will notice the difference. I easily notice a drop from 360Hz to 240Hz.
I also use 60Hz screens just fine, saying that getting used to 120Hz ruins slower displays is being dramatic. You can readjust to 60Hz again within 5 minutes. But I can still instantly tell which is higher refresh rate, at least up to 360Hz.
Videogames also poll input every loop, so there's a big difference there. It would have to be evaluated with video playback only.
We're talking about monitors here, which usually have a mouse cursor on them for input. Of course it would be hard to tell a 60Hz screen from a 120Hz one if you used both to play a 30FPS video.
Lots of games don't do input on every loop. StarCraft 2 polls input at 24 Hz.
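To make that concrete, here's a minimal fixed-timestep sketch of the decoupling (hypothetical names, and 24 Hz chosen just to match the number above, not pulled from any real engine): input and simulation tick at a fixed low rate while rendering runs at whatever the display can do.

    import time

    SIM_HZ = 24          # assumed input/simulation rate, far below the display's refresh
    SIM_DT = 1.0 / SIM_HZ

    def sample_input():
        # placeholder: read keyboard/mouse state here
        return {}

    def simulate(inputs, dt):
        # placeholder: advance the game state by dt using the sampled inputs
        pass

    def render(alpha):
        # placeholder: draw the latest state, interpolated by 'alpha' for smoothness
        pass

    def game_loop():
        accumulator = 0.0
        previous = time.perf_counter()
        while True:
            now = time.perf_counter()
            accumulator += now - previous
            previous = now

            # input is sampled and the simulation stepped only at SIM_HZ...
            while accumulator >= SIM_DT:
                simulate(sample_input(), SIM_DT)
                accumulator -= SIM_DT

            # ...while rendering runs as often as the display allows (60/120/360 Hz)
            render(accumulator / SIM_DT)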
60 to 120? Generally there are telltale signs. If I quickly drag a window around, it's clear as day at 120.
Most people who’ve used both 60 and 120 could tell, definitely if a game is running. Unless you’re asking me to distinguish between like 110 and 120, but that’s like asking someone to distinguish between roughly 30 and 32.
North of 120 it gets trickier to notice no matter what IMO.
I can live with 60 but 85+ is where I’m happy.
It's super easy: put your finger on a touchpad and move it fast in a circle so that the cursor also moves in a circle. Since the eye is not that fast, you will see multiple faint mouse cursor images. With 120 Hz there will be twice as many cursors as with 60 Hz.
On a perfect display you should see just a faint grey circle.
Another test is moving the cursor fast across a white page and tracking it with your eyes. On a perfect display it should stay perfectly crisp; on my display it blurs and moves in steps.
So basically on a perfect display you can track fast-moving things, and when you're not tracking them, they're blurred. On a bad display, things blur when you track them, and you see several instances otherwise. For example, if you scroll a page with a black box up and down, on a bad display you would see several faint boxes overlaid, and on a perfect display one box with blurred edges.
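Rough numbers for why the circle test works, as a sketch, assuming a ~100 ms visual integration window (a round figure, not a measured constant): a sample-and-hold display draws the cursor at one discrete position per refresh, so the count of faint images scales with the refresh rate.

    # Back-of-the-envelope for the circling-cursor test.
    PERSISTENCE_S = 0.1  # assumed visual persistence window, for illustration only

    for refresh_hz in (60, 120, 240, 360):
        images = refresh_hz * PERSISTENCE_S
        gap_ms = 1000 / refresh_hz
        print(f"{refresh_hz:3d} Hz: ~{images:.0f} cursor images, drawn {gap_ms:.1f} ms apart")

At 60 Hz that's roughly 6 faint cursors in view at once, at 120 Hz roughly 12, which is the "twice as many" effect above.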
You could replicate a "perfect display" by analytically implementing motion blur (which is really just a kind of temporal anti-aliasing) in software. This wouldn't let you track moving objects across the screen without blur, but that's a pretty niche scenario anyway. Where 120hz really helps is in slashing total latency from user input to the screen. A 60hz screen adds up to 16.667ms of latency, which is easily enough to be perceived by the user.
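A minimal sketch of what that could look like, under my own assumptions (render_at, the resolution, and the subsample count are placeholders, not any particular engine's API): average several sub-frame samples per displayed frame, which is the accumulation form of temporal anti-aliasing, plus the latency arithmetic from the last sentence.

    import numpy as np

    REFRESH_HZ = 60
    SUBSAMPLES = 8   # instants averaged per displayed frame

    def render_at(t):
        """Hypothetical renderer: return the scene at time t (seconds) as an array."""
        frame = np.zeros((1080, 1920), dtype=np.float32)
        # draw the moving scene into 'frame' here
        return frame

    def blurred_frame(frame_index):
        # Approximate analytic motion blur by averaging several instants spread
        # across the frame's display interval (crude temporal anti-aliasing).
        t0 = frame_index / REFRESH_HZ
        dt = 1.0 / (REFRESH_HZ * SUBSAMPLES)
        return np.mean([render_at(t0 + i * dt) for i in range(SUBSAMPLES)], axis=0)

    # The other half of the argument: worst-case latency added by the display
    # alone is one full refresh interval.
    print(f"max added latency at {REFRESH_HZ} Hz: {1000 / REFRESH_HZ:.3f} ms")  # 16.667 ms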
Display quality should be measured in eye-strain and fatigue after your intended workload.
I think it's more noticeable if you are touch-interacting with your screen during a drag. If you are scrolling with a mouse, you might not notice it at all the way you would if you were scrolling with your finger.
I believe refresh rate/FPS is one of those things that doesn't really matter, but human eyes get spoiled by the higher standard, making it hard to go back. I never saw issues with 30 FPS until going to 60, etc. Hopefully I never get a glimpse of 120 or 144Hz, which would require me to throw out all my existing devices.
Best take in this thread.
The jump forward doesn't even necessarily feel that huge but the step backward is (annoyingly) noticeable.
Quality of life adjustments are all like this.
Especially wellness.
I'm not convinced. I have an iPhone 14 Pro, which has a 120 Hz screen. I can absolutely see the difference when scrolling compared to my older iPhone 11 or my computer screens.
However, I'm typing this on my Dell monitor, which only does 60 Hz. It honestly doesn't bother me at all. Sure, when I scroll long pages I see the difference: the text isn't legible while it's moving. But, in practice, I never read moving text.
However, one thing on which I can't go back is resolution. A 32" 4K screen is the minimum for me. I was thinking about getting a wider screen, but they usually have less vertical resolution than my current one. A 14" MBP is much more comfortable for looking at text all day than my 14" HP with an FHD screen. And it's not just because the colors and contrast are better; it's because the text is sharper.
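For the sharpness point, the pixel densities work out roughly like this (panel sizes and resolutions are my assumptions for the comparison; the MacBook figure assumes the 14.2", 3024x1964 panel):

    from math import hypot

    panels = {
        '32" 4K monitor':  (3840, 2160, 32.0),
        '14" FHD laptop':  (1920, 1080, 14.0),
        '14" MacBook Pro': (3024, 1964, 14.2),
    }

    for name, (w, h, diag_in) in panels.items():
        ppi = hypot(w, h) / diag_in  # pixels along the diagonal / diagonal length
        print(f"{name}: ~{ppi:.0f} PPI")

That comes out to roughly 138, 157, and 254 PPI respectively, which is why text on the MBP looks so much crisper than on a 14" FHD panel at the same distance.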
4K too, at anything over 15” or so.
I’m always baffled people insist otherwise.
At the distance I look at my TV screen (about 7 feet from the couch) I can't make out the pixels of the 1080p screen. 4K is lost on me. 20/20 vision, but I guess that's not enough.
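That roughly checks out geometrically, at least as a back-of-the-envelope sketch (the 55" screen size is my assumption; the comment doesn't say): 20/20 vision resolves about one arcminute, and at 7 feet a 1080p pixel subtends right about that.

    from math import atan, degrees

    DIAG_IN, DISTANCE_IN = 55.0, 7 * 12                # assumed 55" TV, viewed from 7 ft
    width_in = DIAG_IN * 16 / (16**2 + 9**2) ** 0.5    # width of a 16:9 panel

    for name, horizontal_px in (("1080p", 1920), ("4K", 3840)):
        pitch_in = width_in / horizontal_px            # size of one pixel
        arcmin = degrees(atan(pitch_in / DISTANCE_IN)) * 60
        print(f"{name}: pixel pitch subtends ~{arcmin:.2f} arcmin at {DISTANCE_IN/12:.0f} ft")

Under those assumptions a 1080p pixel is about 1.0 arcminute (right at the 20/20 limit) and a 4K pixel about 0.5, so the extra detail falls below what the eye can resolve at that distance.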
Resolution is much less important for video than it is for text and user interfaces.
This is exactly why I went to 4K.
Used to have a 27" 2560x1440 monitor at home. Got a 4K 27" at work, and when I got home, the difference was big enough that I (eventually) decided to upgrade the home monitor.
Unless the screen is right in front of your face, video codecs and their parameters matter more than FHD vs UHD, IMO.
At least to me, with corrected vision, a high quality 1080p video looks better than streaming quality 4k at the same distance.
Compare apples to apples, e.g. gaming, and the difference is glaring.
I’m 3m from my TV and I can absolutely tell 4K from 1080p, but it is indeed subtle.
But a fraction of that distance to my monitor makes even 4K barely good enough. I’d need a much smaller 4K monitor to not notice pixels.
I also have perfect vision in terms of focal length, but it turns out I have astigmatism on opposite axes in both eyes.
Glasses make a huge difference when watching TV, and are the dividing line between being able to tell the difference between 4K and 1080p and not being able to discern any.
I agree with this, but I use a 43" 4K TV as my monitor... which probably isn't what you meant.
I notice it on my 27” monitor. I’ve seen 15” 4K displays and that’s about the limit where I can see the difference.
My eyesight isn’t perfect, either.