I can’t buy this:
> I've also learnt I do benefit from the 8 kHz setting of my mouse, as even at 3200 DPI with fast & smooth motion, some frames still miss a pointer update
It may be true that pointer updates were being missed. But does that really affect anything?
It turns out that there’s a way to test this experimentally. Do a double-blind experiment, just like in science. If you can tell which monitor is the 240 Hz one more often than chance, then it matters. Ditto for the pointer updates.
The corollary is that if you can’t tell with better than random chance, then none of this matters, no matter how much you think it does.
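For concreteness, here's a minimal sketch (my own, not anything from the thread) of how you'd score such a blinded test: under pure guessing the number of correct calls is binomial, so you can check how unlikely a given score would be by chance.

```python
# One-sided binomial test: how likely is it to get >= k correct out of n
# trials by pure guessing (p = 0.5)? Small values mean "better than chance".
from math import comb

def p_value(k: int, n: int) -> float:
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# Example: 16 correct calls out of 20 would happen by luck ~0.6% of the time.
print(f"{p_value(16, 20):.4f}")
```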
Experiments like this have decisively settled the “Does higher sampling rate matter when listening to music?” debate, among other questions. People still swear that they can tell that there’s a difference, but it’s expectation bias. They’re mistaken.
(10ms drops every few seconds would definitely be noticeable though; that wasn’t the point.)
There are videos on YouTube showing people perceive differences at much higher framerates, e.g. https://www.youtube.com/watch?v=OX31kZbAXsA (long video, so you can skip to the end; they found that even casual players were performing measurably more consistently at 240 Hz than at 144 Hz).
Anecdotally, I recently switched to playing racing games at 165FPS and the difference is massive!
I'm game for a randomized blinded test on 120 Hz refresh rate vs 240 Hz refresh rate. I would indeed be very curious to confirm I can tell the difference with a proper protocol.
Many years back (we were on CRTs), I was in similar shoes, convinced my friend couldn't tell the difference between 60 Hz and 90 Hz when playing video games.
Turns out he only needed to look at the pointer through one push of the mouse to tell right away, successful 100% of the time in a blinded experiment.
That’s a silly experiment. I could look at a CRT with a completely static image and tell almost immediately whether it was at 60 Hz, 90 Hz or 120 Hz. Flicker at 60 Hz was awful, 90 Hz was clearly perceptible, and even 120 Hz was often somewhat noticeable. And most CRT/graphics card combos would become perceptibly blurry in the horizontal direction at 120 Hz at any reasonable desktop resolution, so you could never truly win. Interlaced modes made the flicker much less visible, but the crawling effect was easy to see and distracting.
It's like lightning strokes of tens of microseconds making a lasting impression on your perception of the scene. You don't "count" strokes over time, but in space.
When you make circles fast and large enough on screen, you can evaluate the number of cursors that appear before your eyes. At 4 circles per second, is each circle made of ~60 pointers (240 Hz) or ~30 (120 Hz)? Belief, not fact: it's not hard to guess.
“If anyone wants to implement this, I think the way to do it is to put the mouse cursor randomly on the edge of a circle whose radius is a few hundred pixels. The randomness is important, though I’m not sure it would be possible to count how many cursors there are.”
And then I realized that doesn’t work, for a few reasons.
One is that you won’t be able to count how many cursors appear during one second. It’ll all look like a jumble.
That leads to the argument that you should place the cursors at a consistent spacing, and the spacing needs to make it so that the cursors stay at the same spot on the screen each loop around the circle.
Unfortunately that doesn’t work either, because you’ll end up seeing a trail of cursors going around a circle once per second, and counting the cursors is hopeless.
So I think you’d need to make a list of the spots on the circle where the cursors should go, then randomly select from them as quickly as possible. That will let each cursor be perceptible because they’ll be spread out over time; the next cursor won’t be just one pixel apart, so this eliminates the “trail of cursors” problem.
I’m still a bit skeptical this could work, but I admit I can’t think of a reason it wouldn’t. You’ll need to be careful, because it’s really easy to fool yourself that you’ve done it correctly when you haven’t.
It would be interesting to make a WebGL canvas and try this out for real. Or maybe just reposition the mouse cursor with Python instead of doing anything graphical.
It seems important to reposition the mouse cursor rather than use WebGL to draw frames, but I think both could work. Actually, the WebGL route would be more faithful to the question of whether gamers specifically can notice 240Hz; there are all kinds of reasons why repositioning the mouse cursor wouldn’t really tell you that. Vice-versa too, because it might be possible to notice when repositioning cursors but not when using WebGL, though I can’t think of why that would be the case.
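A minimal sketch of the Python route, under my own assumptions (pyautogui for cursor positioning; the spot count, radius, and duration are arbitrary illustration, not from the thread):

```python
import math
import random
import time

import pyautogui  # any cursor-positioning API would do; this is just one option

pyautogui.PAUSE = 0          # drop the library's default delay between calls
pyautogui.FAILSAFE = False   # don't abort if the cursor hits a screen corner

CENTER_X, CENTER_Y = 960, 540   # circle centre in pixels; adjust to your screen
RADIUS = 300                    # "a few hundred pixels"
N_SPOTS = 60                    # fixed set of allowed positions on the circle

# Precompute the spots so each cursor lands in a repeatable place on the circle.
spots = [
    (CENTER_X + RADIUS * math.cos(2 * math.pi * i / N_SPOTS),
     CENTER_Y + RADIUS * math.sin(2 * math.pi * i / N_SPOTS))
    for i in range(N_SPOTS)
]

deadline = time.time() + 10
while time.time() < deadline:
    # Random order keeps consecutive cursors far apart, avoiding the
    # "trail of cursors" problem described above.
    x, y = random.choice(spots)
    pyautogui.moveTo(int(x), int(y))
```

Whether repositioning like this can actually keep up with the display's refresh rate is exactly the kind of thing you'd want to measure rather than assume.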
Neat idea. Thanks.
Presumably the 3200 Hz is needed for a combination of reasons:
- Under ideal conditions, if you want less than 10% variation in the number of samples per frame at 240 Hz, you may need ~2400 Hz: that works out to ~10 polls per frame, so a frame catching one poll more or fewer is a 10% swing (see the sketch below this list). This effect is visible even to human eyeballs; you can see multiple cursor images across your field of view, and uneven spacing is noticeable.
- The mouse itself may work less well at a lower sampling rate.
- The OS and input stack may be poorly designed and work better at higher rates.
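A back-of-the-envelope check of that first bullet (my own arithmetic, not from the comment): with unsynchronized clocks, each frame catches either one poll more or one poll fewer than its neighbours, so the frame-to-frame spread is roughly one poll out of the polls-per-frame average.

```python
# Relative variation in polls captured per frame: ~1 poll of jitter spread
# over poll_hz / frame_hz polls per frame.
def sample_spread(poll_hz: float, frame_hz: float) -> float:
    return frame_hz / poll_hz

for poll_hz in (1000, 2400, 8000):
    polls_per_frame = poll_hz / 240
    print(f"{poll_hz:>4} Hz polling at 240 Hz: {polls_per_frame:.1f} polls/frame, "
          f"~{sample_spread(poll_hz, 240):.0%} variation")
```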
In any case, the application and cursor implementation are unlikely to ask for a mouse location more than once per frame, so the user is not really using 3200 updates per second, but that’s irrelevant.
Second, 3200 was DPI, not Hz. I can trivially tell how much I have to move at 3200 DPI (my sweet spot with two 4K monitors), 4800 DPI, and 6400 DPI.
For Hz, it was the polling rate. With a configured 8000 Hz polling rate (a peak figure at best, not sustained), I still see stalls in the 4 ms range with my hardware.
As for acceleration, I disable it. To truly lose it at high DPIs I've had to install RawAccel on Microsoft Windows.
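To put numbers on the DPI part (my own rough arithmetic, assuming standard 3840-pixel-wide 4K panels): the hand travel needed to cross one screen shrinks noticeably between those settings.

```python
# Mouse travel needed to cross one 4K monitor at each DPI setting; two
# monitors side by side double these figures.
SCREEN_PX = 3840  # horizontal resolution of one 4K display

for dpi in (3200, 4800, 6400):
    inches = SCREEN_PX / dpi
    print(f"{dpi} DPI: {inches:.2f} in ({inches * 2.54:.1f} cm) to cross one screen")
```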
Imagine 2 identical gaming setups with 2 players that have the same skill set. In an FPS game, you'd expect each of those players to win 50% of the games.
Now switch one monitor from 120 Hz to 240 Hz. The player on the 240 Hz monitor will see their adversary up to ~4 ms earlier (about 2 ms earlier on average) than the player on the 120 Hz monitor, and thus be able to push the mouse button earlier too.
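A quick check of those numbers (my arithmetic): new game state has to wait, on average, half a refresh interval before it can appear on screen.

```python
# Average added display latency is half the frame period; the worst case is a
# full frame period.
for hz in (120, 240):
    period_ms = 1000 / hz
    print(f"{hz} Hz: frame period {period_ms:.2f} ms, "
          f"avg wait {period_ms / 2:.2f} ms, worst case {period_ms:.2f} ms")
# Difference between the two players: ~2 ms on average, ~4 ms worst case.
```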
Further if your network has more than 4ms of jitter then I don't think you can make any concrete claim in either direction.
You can film the screen in slow motion and visually see more fluid motion (and see how it reacts to player input).
Games also use predictive methods and client side hit detection to mitigate most of the effects of network latency in the common cases.
You can present the game state statistically earlier to the player with the higher refresh rate display.
A pro FPS player might notice that they lose contests peeking around corners more often. Obviously network latency in online games will be a factor as well, but since it likely averages out for both players over time, I would guess you can mostly discount it, along with alternating who’s doing the peeking.
I don’t think anyone could look at a scene on a 120 Hz vs 240 Hz display and tell the difference; there needs to be some indirect clue.
If I’m just watching, I’m not sure I could even tell the difference between 60 Hz and 144 Hz.
From this one paper alone, humans can perceive information from a single frame at 2000 Hz.
https://doi.org/10.1080/00223980.1945.9917254
Humans can read a 5-digit number displayed for a single frame at 400 fps and reproduce it immediately. This is a single exposure; it is not a looping thing relying on persistence of vision or anything like that. 7-digit numbers required the frame rate to be 333 fps. Another student reproduced a 9-digit number from a single frame at 300 fps. Those were the average results. The record was a correct reproduction of a 7-digit number from a single viewing of a single frame at 2000 Hz, which was the limit (within 2% accuracy) of the tachistoscopic equipment in question. From the progression of the students chasing records, no slowing of their improvement was ever in sight. The later papers from this author involve considerable engineering difficulty in constructing an even faster tachistoscope and are limited by 1930s-1940s technology.
This research led the US Navy in WW2 to adopt tachistoscopic training methods for aircraft recognition, replacing the WEFT paradigm (which had approximately a 0% success rate) with a one-frame-at-75-fps paradigm, which led to 95% of cadets reaching 80% accuracy on recognition, and 100% of cadets reaching 62.5% accuracy, after just 50 sessions.
Yes, humans can see 2000 fps. Yes, humans can see well beyond 2000 fps in later work from this researcher.
Yes, humans can detect flicker well above 1000 fps in daily life at the periphery of vision, where cone cells can fire from a single photon of light and our edge-detection circuits operate at a far higher frequency than our luminance and flicker-fusion circuits. Here's flicker being discriminated from steady light at an average of 2 kHz for 40-degree saccades, with an upper limit above 5 kHz during 20-degree saccades, which would be much more typical for eyes on a computer monitor.
There is no known upper limit to the frequency of human vision that is detectable. As far as I know, all studies (such as this one I link) have always been able to measure up to the reliable detection limit of their equipment, never up to a human limit.
Not really relevant. Music is experienced after a Fourier transform, in frequency space.
The more telling example is that experienced drummers get frustrated by lag of 2 ms from computer-generated effects. That's 500 Hz.
Neat tool, though. I'm also very sensitive towards latency.
This delay wasn't present on the Logitech gaming mouse I previously used, probably a combination of a high polling rate (500Hz) and a much longer idle delay. The battery life was also much shorter, only 250 hours on high-performance mode, but I just recharged a set of AA batteries every week so it was never an issue.
I ended up returning the Marathon mouse.
At one point I had a Razer wireless mouse (Mamba, I think?) which had no discernible latency and a nice dock for recharging; I was very happy with it until one evening it just stopped working. While alone in my flat, I stepped away from my computer for about an hour, didn’t even put it to sleep, came back, and the mouse would no longer move the cursor but would still register clicks. I tried contacting customer support asking if there was a way to reset it or reflash the firmware or something, and they were just like “nope”. Last piece of Razer hardware I ever bought.
Also, it's interesting that with ProMotion enabled it reports 16.67 ms per frame (indicating a 60 Hz redraw rate) in Safari, but in Chrome it's 8.33 ms (120 Hz).
Although it's for gamepads, it's pretty much indispensable in debugging gamepad-related latency issues. For example, I found that my supposedly 1000 Hz controller can only do 500 Hz in ideal conditions, and it starts to drop at a much shorter distance from the computer than advertised. Neat stuff.
I’m curious whether there’s a higher-quality USB hub I could buy, as my Mac doesn’t have much I/O.
I’d love to be wrong on this but haven’t been so far.
There are other differences between the tools; mine was designed for what I wanted to understand, so I'm biased toward it.