…but these go to 11
Very cool, now find me a decently priced rig that can actually run games consistently above that refresh rate. This is future tech. Barely anyone will be able to take advantage of it rn
It’s a $900 monitor for professional esports players. It’s like you’re complaining about F1 cars being too fast because roads are too slow for them.
Having played near that level for several years, I highly doubt 99% of esports players could tell the difference between 240 and this. 160+ is hard enough for most of the semi-pro players I've played with, and 99% of the pro players I've talked to or played with say 144 is plenty fine in 99% of situations.
CRT revival when?
lol where do these people get their statistics?
Yeah I bet I could run Notepad at 540Hz
Solitaire endings are gonna look sickkkkk
The Minesweeper explosions are gonna be off the hoooook
With frame generation! Let the AI fill in the text for you xD
I play cs2 at 120hz and it’s not the monitor holding me back. :/
Yup city design skills are what hold most people back in that game.
I like that everyone else is assuming they meant Counter Strike 2 and you’re talking about Cities Skylines 2
Just be happy you’re getting 120 faps on cities skylines 2.
There there
Does CS2 have humans vs bots mode yet? I’d play that.
I don’t care to play against 13 year olds with instant reflexes who have thousands of hours in the game.
There’s community servers now, so maybe you could find one that has that. Don’t think it’s in official yet though.
We all know it’s the team mates you get. Keep grinding, Ace!
Really don't understand this trend; you will never need that. Even 240Hz is hardly noticeable over 165. Just need OLEDs to come down in price, but most gamers who don't have 40xx GPUs can't really run much over 1440p/240Hz anyway.
The diminishing returns after 144Hz are insane. I have a 240Hz and a 144Hz and it's hard to tell the difference. It's there, but you gotta have them side by side or be playing a really competitive shooter to notice.
540hz?! Isn’t that overkill? That is The Marvels level cringy.
For most people it is. This product isn’t for most people though.
Why would you ever want a refresh rate that high?
I did my primary coursework on this exact topic in college, so I feel rather aptly in the know here: VDUs with refresh rates above 100Hz have diminishing returns, and anything above 200Hz is completely indistinguishable to the human eye.
Anyone who says otherwise either has superhuman eyesight or is just lying.
To spot a sweaty DCS player going mach 3 over your port view, maybe?
This youtuber says it’s a really noticeable jump from 240 to 500.
He has tested every single new high refresh rate panel and uses scientific methodologies. Have you actually tried some of these technologies? I own a 240Hz and have never seen above that, so I can't talk. I can say that motion blur reduction technologies influence perceived smoothness a lot, tho.
use scientific methodologies
such as?
measuring pixel response times, total system latency, slow-motion footage of different displays
Watch the video?
And I’m just sitting here with my old & busted eyes thinking 60Hz is enough…
Don't get me wrong, I can see a difference between my 60Hz simracing screens and my 144Hz desktop display (both pushing north of 120fps), but it just doesn't matter when I'm hammering around Spa in a GT3 car or blasting scavs in Night City.
Have you actually tried some of these technologies?
Extensively. He is sponsored to sell you a product.
You are VERY correct in your deduction that motion blur tech is a far bigger hitter than just upping the refresh rate.
Have you written a paper on it? I would like to read more about it.
Frame rate snobs.
Idk, I can’t distinguish after 144hz. Hell I wouldn’t even know if it was 120 or 144
My brother has a 240Hz monitor, so I decided to play around with some CS on it at different refresh rates to see if I could notice any difference. And yeah, back to back, 144Hz to 240Hz, you notice *something*, but it's really hard to say I can make use of that extra information. Every step above 144Hz is hard to notice at all; it's only barely noticeable when you jump from 144Hz to 240Hz in one step. In fact, even taking 10Hz steps, it already doesn't feel like a big improvement past 90Hz, and it rapidly becomes less noticeable after 100Hz.
It’s very possible your brother’s 240Hz monitor isn’t particularly good. Just because a display can refresh at a certain number, it doesn’t actually mean that it’s better than a display that might have a lower refresh rate, but has faster pixel response, etc.
A 144Hz OLED, for example, will be significantly better than a much higher refresh rate LCD, because OLEDs have pixel response times below 0.1ms. That's more than an order of magnitude faster than the fastest LCD monitor.
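To see why pixel response matters as much as the refresh number, it helps to compare response time against the frame time at each refresh rate. A quick sketch (the response-time figures are illustrative assumptions, not measured specs for any particular panel):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

# Illustrative assumptions, not measured specs:
oled_response_ms = 0.1      # commonly claimed for OLED panels
fast_lcd_response_ms = 2.0  # optimistic grey-to-grey figure for a fast LCD

for hz in (144, 240, 360, 540):
    ft = frame_time_ms(hz)
    # When pixel response is a large fraction of the frame time,
    # successive frames smear together (ghosting).
    print(f"{hz:3d} Hz: frame time {ft:.2f} ms | "
          f"LCD response = {fast_lcd_response_ms / ft:.0%} of frame | "
          f"OLED response = {oled_response_ms / ft:.0%} of frame")
```

Under these assumptions, a 2ms LCD response eats most of the 1.85ms frame at 540Hz, while a 0.1ms OLED response stays a tiny fraction of the frame even at the highest rates, which is the point the comment above is making.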
90 is definitely some kind of threshold where I notice the performance dropping; as long as the fps lows are around 95, I'm happy with the performance.
I think I might slap someone if I ever hear them say anything under 240 is unplayable. And I just know someone will find their way to this comment to argue why that’s not a ridiculous thing to say with some sort of special scenario.
I have a functional neurological disorder and will have a seizure if subjected to a screen that is under 240hz.
The only special scenario I can think of is playing a game at 144Hz while using a monitor that's 240Hz. Going from playing at 280Hz on my monitor to 144Hz makes games visually stutter to a noticeable degree, which makes them frustrating to play when you're used to the super ultra™ smooth feel of 240Hz. Even in that case, which I am familiar with, I wouldn't go so far as to say that anything is unplayable, just noticeably more visually frustrating than usual.
Going from playing a given game on a 240Hz display to a 144Hz display does make a noticeable difference, but nowhere near significant enough to be unplayable. I could imagine even pro FPS players being able to tolerate it, despite possibly taking a hit to their performance. Point being, I completely agree it's stupid to suggest it's unplayable, though the special scenario in my case, I could imagine, is what most people are experiencing when they say it's unplayable (I hope lol); even then I wouldn't really agree.
I can only tell when playing FPS going from 60 to 144. After 144 it pseudo-felt better, but really I couldn't tell.
In automotive parlance we call that the Butt Dynamometer. Your ass feels extra speed in the seat, but the numbers don't lie.
144 to 240+ is extremely noticeable. Can't speak on 240 to 360 since I don't have a 360Hz monitor.
I've found that beyond 144Hz you need a serious investment to actually get pixel response times that keep up without ghosting.
oled helps there.
It’s not usually about seeing the actual fps or refreshes, it’s usually about how the game reacts to the high FPS and refresh rate.
CSGO behaved noticeably differently at 120 vs 300+ fps.
Didn't people do a lot of research on why anything above 120Hz wasn't really necessary? Because I can absolutely tell the difference between 200Hz and 240Hz.
Absolutely not true. Play a high speed FPS with extremely high FPS (think 1000+ fps, with some games being played at 2k or more fps), and current 240, 280, 360hz monitors are not fast enough to eliminate the lack of smoothness. Ghosting is still plainly visible. These games will absolutely benefit from 500+hz and your eye will be able to tell.
There’s def a difference between 144hz and 360hz.
It’s 216
this is assuming displays don't have any issues such as overshoot, which the majority of 240Hz and 360Hz panels on the market do - a lot of the 240Hz panels (ahem, Samsung) are worse than even budget 144Hz panels.
anything above 200hz is completely indistinguishable to the Human eye
First off that is just not true, the human eye can perceive the difference, but the change may not be noticeable to those who are not familiar. Dropping from 240hz to 200hz is absolutely noticeable to competitive FPS players, though it primarily has to do with feeling the response time of your inputs rather than the way it looks.
First off that is just not true, the human eye can perceive the difference, but the change may not be noticeable to those who are not familiar
Pretty much. The theoretical results basically amount to seeing light pulses. To see an actual difference in a game requires not only a very good monitor (most high refresh rate monitors are trash, with worst-case response times way longer than a frame time) but also a very fast moving object. Past a certain refresh rate these conditions just become unlikely, so you don't really see a difference unless it's a synthetic test.
All those idiot pro gamers using 240+ they’ve been swindled! /s
You can’t see it but you can feel it in some games
Human hubris
Anyone who says otherwise either has superhuman eyesight or is just lying.
Y’ever listen to audiophiles argue over quality? Argue that -yes- they can tell the difference between $10 HDMI cables and $300 HDMI cables. Or how their $5000 “power filter” cleans the dirty pedestrian AC current before it gets to their sacred audio gear? Or that their diamond/iridium/angel-eyelash record needle gives them a FAR better sound reproduction… YOU just can’t hear it.
There will be someone out there that will tell you that yes, they CAN tell the difference between 200hz and 250hz.
But gosh, they’ll never take a double-blind test to prove their golden ears/eyes.
I can’t comment on $10000 power cleaners, but based on your comment I’m assuming you’re not an electrical engineer with a background in amplifiers and high frequency noise (neither am I)
Completely false equivalence btw
I have a 360Hz display, and sometimes when I play Halo MCC there's this random thing where it's like my eyes realize the refresh rate and everything looks really strange; the movement and speed look weird. And if I play for a while and get up and move around, it's like my eyes are still seeing at that refresh rate for a few minutes. Even with this experience I can't go back below 360Hz. There's just something about it I enjoy that I can't even explain.
I think it is well understood now that this is incorrect. I can tell the difference between 360 and 240, and it is even more jarring when going from 360 to 144 or 120.
I did quite a few years on a 60hz monitor. The change to 144hz was massive. Did quite a few years on that 144hz monitor. The change to 240hz was practically unnoticeable.
One problem a lot of those studies have is it doesn’t factor in how you can get used to something and then notice a change. I remember when I was a kid that happening with headphones. I thought my good headphones weren’t that large of an improvement until I had to borrow somebody else’s years later and thought they sounded like absolute junk.
So somebody might not be able to tell much of a difference between 144 and 240 if they’re sitting side by side, but they might be able to tell the difference very easily if they’re used to one and then switch to the other. People are also more sensitive to negative changes, so you’re more likely to notice the difference between the two if you go from 240 to 144 versus 144 to 240.
Somebody below me posted a link to a study showing that humans can detect differences up to about 1000, and most can easily tell the difference between 240 and 360.
I did a blind test between two monitor screens.
Maybe to the eye you can’t tell the difference, but whenever I am gaming, I guessed the correct refresh rate 10/10 times.
I tried with my eyes and it was just a 50% guess rate… but whenever I had a controller in my hands, I could instantly tell.
Not sure how I can explain this 🤔
Input lag is how you could tell when gaming. It wasn’t your eyes, but what your brain expected to see.
It's less about seeing 540 frames per second and more that each frame takes less time to reach the screen, meaning what you're seeing is a more recent representation of the game state. I'm a casual gamer who can't appreciate anything past 144Hz, but that's what I've heard about ultra high refresh rate screens.
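The "more recent game state" point is just frame-time arithmetic. A quick sketch under a deliberately simplified model (it ignores the render queue, scanout, and pixel response, so the numbers are a lower bound on real latency):

```python
# Simplified model: the image on screen is, on average, half a refresh
# interval old by the time you see it (worst case: one full interval).

def frame_time_ms(refresh_hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240, 540):
    ft = frame_time_ms(hz)
    print(f"{hz:3d} Hz: new frame every {ft:.2f} ms, "
          f"image on average {ft / 2:.2f} ms stale")
```

At 540Hz the on-screen image is never more than about 1.85ms behind the game state, versus up to ~16.7ms at 60Hz; that staleness gap, not literally "seeing" 540 distinct frames, is the argument for very high refresh rates.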
Okay, but isn't this like the classic tale of the mouse sensor that can do 24000 dpi? People say those mice are good because, even though you won't ever go that high, the fact that the sensor can go that high means it's much more accurate than ones that only go up to 6000 or whatever dpi you'll actually use. Doesn't that logic also apply to these displays?
Mainly only useful for professional gamers. Also, there have definitely been people that can distinguish above 200Hz.
PC Gamer did a good deep dive into this.
While there's no set answer and everyone's vision is different, generally there's zero benefit above 200Hz, as you said.
I game on my couch at 120 hz. I somehow keep beating all of these multi-thousand-dollar-setup competitors.
Thank you. My background is biology. Frankly neurons and cells just aren’t that fast. You know how your hand can feel cool when you touch a hot plate for a second? It’s because it takes like 200ms for the pain signal to register in your brain. And here these people are acting like a monitor with microsecond refresh rates is somehow something you can even detect.
I’m a 120 or 144hz man myself. Seems plenty snappy and smooth. 60 for office work is even fine. I can for sure sense 30 though. Like when a driver gets updated and something goes wrong and it sets your main display to 24 or 30fps. That shit is obvious.
My background is biology. Frankly neurons and cells just aren’t that fast.
The way we're able to see high refresh rates isn't based on neuron speed (ignoring input lag stuff). The eyes continuously sample as light hits them, sending signals. Because there's no global shutter, we're able to perceive a display's refresh as it's sending discrete images of a scene.
It’s far easier to notice an issue when there is a large change between two frames. The easiest way to reproduce this is in 3rd person games where you quickly rotate the camera as the large change between frames is very obvious to the eye. (There’s a limit obviously, but I can tell between 120Hz and roughly 240Hz, but I’m fine with 120Hz as I don’t tend to rotate the camera fast in games).
Because what you are saying is absolutely not true. Trained eyes and pros can definitely tell between 240hz and 360hz. There is a difference, it’s pretty marginal but there is a difference. Yes this is absolute nonsense for anyone but FPS pros, but there is still a market for it and it’s more of alpha consumer funded research than anything else
He did his coursework on it though!! OP prolly can't perceive lightning strikes either, since many of them last less than 1 millisecond.
According to the US Air Force, training allows people to process at and above the 220 fps level after having started at only being able to process at around 40 fps; some pilots even ended the study processing at 300 fps.
Anything above that is probably far past extraneous, though, especially for the average consumer. So a 540hz monitor is a little bit silly.
I mean, my subjective experience is that it's less about distinguishing one frame from another and more about a perception of fluidity in motion.
Like, I can see a noticeable judder between 60-100Hz and 144Hz. This has been further trained because my personal setup has a cabling issue where EMI from my chair piston disrupts the display cable, which sometimes resets my monitor to 100Hz. It's very annoying because in some of my games I notice a considerable drop in my play performance when this happens, and I didn't notice it in the past.
So I can definitely see the difference there. Then moving on from 144 to my integrated 240hz display in my laptop, I can absolutely notice a difference with specifically the edges of objects and which is extra-noticeable when moving the mouse around in circles and such.
This also translates to gaming where I’ve genuinely gone and checked my settings because something felt different from normal when I get so used to using the external display that I forget the internal one is the higher spec. Not bad, mind you, just different. I’m super vigilant about any change, though, since I’ve previously had issues that were indicated by performance changes beforehand.
That being said, I don’t think your understanding or coursework is wrong, but just that my subjective experience indicates otherwise and that the research potentially fails to translate well onto an exact Hz scale. My understanding is that the eye doesn’t really work on “frames” and is a bit more “stream of information” about it, meaning that distinguishing change even on minuscule time scales is relatively easy but distinguishing individual objects or the substance of an image flashed at that speed is very hard. If you know what to expect, though, I’d be willing to bet the process becomes significantly easier.
Perhaps there is already a plethora of studies on this exact stuff, but I’d be willing to bet that many existing studies fail to reasonably account for expectation when evaluating comprehension as well.
I have a monitor with a 165Hz refresh rate. From 60 to 100 there is a difference; from 100 to 144 I see nothing, and from 100 to 165 I see nothing, so I turned it down and left it at 100Hz. I really could not see a difference. 540 is ridiculous.
Sweet, all those gamers can ignore their own lives even more realistically now!
(Yeah I’m a salty ex-gamer)
Hardware Unboxed just made a video about Why Higher Refresh Rates Matter - 30Hz vs 60Hz vs 120Hz vs 240Hz vs 540Hz
tldr: higher refresh rates result in better motion clarity and lower input latency
690HZ or bust
Anything above like 90-100 FPS is barely perceptible to me and provides close to no benefit, even my 165hz monitor is currently locked at 100 for more consistent frames rather than more frames.
So what's the point of 540Hz exactly? And don't tell me it's a competitive advantage; it's not. No human will be able to utilize the single-digit millisecond improvements it provides.
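For scale, the frame-time gain being argued over is easy to put a number on. A quick back-of-the-envelope check:

```python
# Frame-time delta between 240 Hz and 540 Hz, in milliseconds.
ft_240 = 1000.0 / 240  # ms per frame at 240 Hz
ft_540 = 1000.0 / 540  # ms per frame at 540 Hz

print(f"240 Hz: {ft_240:.2f} ms/frame")    # prints 4.17
print(f"540 Hz: {ft_540:.2f} ms/frame")    # prints 1.85
print(f"gain:   {ft_240 - ft_540:.2f} ms") # prints 2.31
```

So the step from 240Hz to 540Hz buys roughly 2.3ms of frame time, which is exactly the single-digit-millisecond territory the thread is debating: measurable on paper, and the disagreement is over whether anyone can feel it.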