I can't tell at all when my MBP is at 120 Hz or 60 Hz. I tried to set up a good test by scrolling really fast while plugging and unplugging the power adapter (which kicks it into high-power 120 Hz or low-power 60 Hz).

One of those things some people notice and some people don't. I'm definitely in the camp that feels the difference between 120 Hz and 60 Hz, but I don't find 60 Hz choppy, and beyond 120 Hz I can't notice any difference, though others seemingly can. Maybe it's our biology?

I would bet most people would fail a blind test.

Basically everyone who has played video games on a PC will notice the difference. I easily notice a drop from 360 Hz to 240 Hz.

I also use 60 Hz screens just fine; saying that getting used to 120 Hz ruins slower displays is being dramatic. You can readjust to 60 Hz within five minutes. But I can still instantly tell which of two displays has the higher refresh rate, at least up to 360 Hz.

Video games also sample input every loop, so there's a big difference there. A fair comparison has to use video playback only.

We're talking about monitors here, which usually have a mouse cursor on them for input. Of course it would be hard to tell 60 Hz from 120 Hz if you used both to play a 30 FPS video.

Lots of games don't do input on every loop. StarCraft 2 has 24 Hz input.

60 to 120? Generally there are telltale signs. If I quickly drag a window around, it's clear as day at 120.

Most people who've used both 60 and 120 could tell, especially if a game is running. Unless you're asking me to distinguish between, say, 110 and 120, but that's like asking someone to distinguish between roughly 30 and 32.

North of 120 it gets trickier to notice no matter what IMO.
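One way to see why gains get harder to notice north of 120: what you actually get back from each step up is frame *time*, and every doubling returns half as many milliseconds as the last. A throwaway sketch (pure arithmetic, nothing measured):

```python
def frame_time_ms(hz: float) -> float:
    """Duration of one refresh interval in milliseconds."""
    return 1000.0 / hz

# Each step up buys fewer milliseconds than the one before it.
for lo, hi in ((60, 120), (120, 240), (240, 360)):
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} Hz shaves {saved:.2f} ms off every frame")
    # 8.33 ms, then 4.17 ms, then 1.39 ms
```

So 60 to 120 buys roughly six times as many milliseconds per frame as 240 to 360 does.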

I can live with 60 but 85+ is where I’m happy.

It's super easy: put your finger on a touchpad and move it fast in a circle so that the cursor also moves in a circle. Since the eye is not that fast, you will see multiple faint cursor images. At 120 Hz there will be twice as many cursors as at 60 Hz.

On a perfect display you should see just a faint grey circle.
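The cursor-circle trick can be sanity-checked with arithmetic: if the eye integrates over some fixed window (the ~100 ms below is an assumed, illustrative figure, not a vision model), the number of distinct cursor images inside that window scales linearly with refresh rate:

```python
PERSISTENCE_MS = 100.0  # assumed eye-integration window, illustrative only

def ghost_images(refresh_hz: float, window_ms: float = PERSISTENCE_MS) -> int:
    """How many discrete cursor positions get drawn inside the window."""
    return round(window_ms * refresh_hz / 1000.0)

print(ghost_images(60))   # 6 faint cursors
print(ghost_images(120))  # 12 -- twice as many, matching the test above
```

A truly continuous display would draw infinitely many overlapping positions, which is why the ghosts merge into a single faint grey circle.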

Another test is moving the cursor fast across a white page and tracking it with your eyes. On a perfect display it should stay perfectly crisp; on my display it blurs and moves in steps.

So basically, on a perfect display you can track fast-moving things, and when you're not tracking them, they are blurred. On a bad display, things blur when you track them, and you see several instances otherwise. For example, if you scroll a page containing a black box up and down, on a bad display you would see several faint boxes overlaid, while on a perfect display you'd see one box with blurred edges.
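The "several faint boxes vs. one smeared box" contrast can be sketched in one dimension: accumulate a moving box into an eye-integration buffer, sampling either at the display's frame times (sample-and-hold ghosts) or in very fine steps (approximating a continuous display). All parameters here are made up purely for illustration:

```python
def perceived_profile(speed_px_per_ms, window_ms, refresh_hz,
                      box_px=10, width=200):
    """1-D eye-integration sketch: sum every drawn box position over the window.

    refresh_hz=None uses very fine time steps to approximate a 'perfect'
    (effectively continuous) display.
    """
    buf = [0.0] * width
    step_ms = 1000.0 / refresh_hz if refresh_hz else 0.1
    t = 0.0
    while t < window_ms:
        x = int(speed_px_per_ms * t)  # box's left edge at time t
        for i in range(x, min(x + box_px, width)):
            buf[i] += step_ms  # each drawn position lingers for one step
        t += step_ms
    return buf

ghosted = perceived_profile(1.0, 50, 60)    # 60 Hz: separated faint copies
smooth = perceived_profile(1.0, 50, None)   # continuous: one smeared box

print(any(v == 0 for v in ghosted[:43]))  # True: dark gaps between ghosts
print(any(v == 0 for v in smooth[:58]))   # False: one continuous blur
```

At 60 Hz the box jumps ~17 px per frame, farther than its own 10 px width, so distinct copies appear with gaps between them; with fine steps the copies overlap into a single blur.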

You could replicate a "perfect display" by analytically implementing motion blur (which is really just a kind of temporal anti-aliasing) in software. This wouldn't let you track moving objects across the screen without blur, but that's a fairly niche scenario anyway. Where 120 Hz really helps is in slashing total latency from user input to the screen: a 60 Hz screen adds up to 16.67 ms of latency, which is easily enough to be perceived.
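That latency figure is just the refresh interval: in the worst case, an input change lands right after a refresh and has to wait a full frame before it can be shown. A trivial sketch:

```python
def max_added_latency_ms(refresh_hz: float) -> float:
    """Worst-case wait before the next refresh can show an input change."""
    return 1000.0 / refresh_hz

print(f"{max_added_latency_ms(60):.3f} ms")   # 16.667 ms
print(f"{max_added_latency_ms(120):.3f} ms")  # 8.333 ms
```

Doubling the refresh rate halves this worst-case wait, which is the part of the pipeline a faster panel directly controls.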

Display quality should be measured in eye-strain and fatigue after your intended workload.

I think it's more noticeable if you are touch-interacting with your screen during a drag. If you're scrolling with a mouse, you might not notice it at all the way you would when scrolling with your finger.