Posted by yoyoman @ 08:05 CST, 23 February 2013 - iMsg
I was just curious: is it technically possible that a CRT at 240Hz will be smoother than a CRT at 125Hz in Quake Live, where the fps is capped at 125?
Above 120Hz I can't see a difference. Even at 160Hz, the only way I can tell is to point my DSLR at the screen and watch how the refresh-rate lines have slowed down dramatically.
I wonder what Hz you would get if you set it to 320x240, which is Quake Live's smallest resolution. I know it won't be fullscreen, but you could use your driver to set a custom 320x240 resolution and scale it to the exact width and height of the windowed Quake to make it fill the screen.
I mean, 240Hz isn't necessary to negate frame tearing; you can use v-sync on a CRT even at lower refresh rates and still not suffer from input lag.
If there is any input lag at 125fps@125Hz on a CRT, it's a lesser problem than the tearing at those settings, at least in my opinion.
I had 3 CRTs that ran at 200Hz at 800x600; the last one even ran at 255Hz at 640x480. The difference between 200Hz and 255Hz is really noticeable: the frame tearing is substantially lower.
I have to say I would have had a hard time believing it myself if I hadn't tested it, so I can understand people's misconception about running a refresh rate higher than the frames per second.
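To see why a refresh rate above the frame cap can still reduce tearing, here is a rough back-of-the-envelope model (my own simplification, not anything from the engine): with v-sync off, a tear shows up on any refresh during which a new frame arrived mid-scanout, so the fraction of torn refreshes is roughly min(1, fps/Hz), and each torn image only stays on screen for one refresh interval.

```python
def tearing_stats(fps: float, hz: float) -> tuple[float, float]:
    """Rough model of v-sync-off tearing.

    A tear appears on any refresh during which a buffer swap landed
    mid-scanout, so about min(1, fps/hz) of refreshes are torn, and a
    torn image is overwritten after one refresh interval (1/hz s).
    """
    torn_fraction = min(1.0, fps / hz)
    tear_visible_ms = 1000.0 / hz
    return torn_fraction, tear_visible_ms

# 125fps on a 125Hz CRT: nearly every refresh tears, and each torn
# image stays up for 8 ms.
print(tearing_stats(125, 125))   # (1.0, 8.0)

# 125fps on a 250Hz CRT: roughly half the refreshes tear, and a torn
# image is overwritten after only 4 ms.
print(tearing_stats(125, 250))   # (0.5, 4.0)
```

So doubling the refresh rate while the fps stays capped halves both how often you see a tear and how long each one lingers, which matches the "substantially lower" tearing described above.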
When you say that the frame tearing is substantially lower, what does this look like? I.e., what is the practical appearance to the end user?
I'm a huge fan of high refresh rates, and it frustrates me to no end that refresh rates haven't continued to increase over the years; instead they have even gone down in the move from CRT to LCD. I know they are different technologies and all, but yeah, crank up the refresh rate, industry!!
Well, I think I am gonna make a post to explain the test method. By the way, the faster a game is, the higher the refresh rate you need. There is a direct correlation between the speed at which an image moves, the refresh rate, and screen tearing; people working in television and film know what I mean, and they have motion tricks to avoid tearing.
Now, for a fast FPS the effect is clearly more prominent than for an RTS or a racing game, because the in-game image moves faster. Here again we are back to the same correlation: "size of the movement per unit of time" = speed.
I think my English is horrid right there, but I hope it was understandable.
Screen artifacts are the result of Hz and fps being too far apart, not the result of higher Hz. Tearing (as in seeing a line where the screen updates) usually only happens when fps = Hz.
I want to see a 1000Hz setup: a monitor at 1000Hz, a game at 1000fps, and a mouse polling at 1000Hz. The bandwidth required between the card and the monitor would be massive, but it would be crazy impressive to see.
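For a rough sense of how massive: raw pixel bandwidth scales linearly with refresh rate. The sketch below ignores blanking intervals and protocol overhead, and the resolution and bit depth are just illustrative picks:

```python
def link_bandwidth_gbps(width: int, height: int, hz: int,
                        bits_per_pixel: int = 24) -> float:
    """Raw pixel data rate in Gbit/s, ignoring blanking and overhead."""
    return width * height * hz * bits_per_pixel / 1e9

# 1920x1080 at 1000Hz with 24-bit colour: roughly 50 Gbit/s of raw
# pixel data, far beyond the ~8 Gbit/s of dual-link DVI from the
# CRT/early-LCD era.
print(round(link_bandwidth_gbps(1920, 1080, 1000), 1))   # 49.8
```

Even at 800x600 the same maths gives about 11.5 Gbit/s at 1000Hz, so the "massive bandwidth" point above holds at any resolution you'd actually play at.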
Honestly, I don't get why QL is the only serious competitive FPS left with such a low fps cap when it's one of the most dynamic ones (CS had a 100fps cap, but CS:GO has none; most other titles have adjustable caps). The physics are supposedly already independent, so raising the cap to something like 250 shouldn't cause any issues, and it would bring a much better experience for every QL player.
There are 144Hz LCD monitors already, and the people buying those are capped by the game itself :/
I don't know how much work it would take to make the physics independent at any com_maxfps setting. From what I remember of the old QL forums, it was stated that the physics were only "fixed" up to 125fps. Beyond that it was/is still bugged (e.g. at 333fps).
I'm trying to find the thread where I believe Sync himself said it. I can't remember if it was on the old beta forums or not, but the question was brought up there: why can't we have uncapped fps if the physics are independent in QL? The response was that they're only independent up to a certain point, that point being 125.
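The reason com_maxfps affects physics at all is that Quake 3 runs player movement on whole-millisecond frame times and snaps the velocity vector to integers every frame (trap_SnapVector). The toy simulation below is my own simplified reconstruction of that effect, not the actual bg_pmove.c code; the gravity (800) and jump velocity (270) are the stock Q3 defaults.

```python
def max_jump_height(fps: int, jump_vel: float = 270.0,
                    gravity: float = 800.0) -> float:
    """Peak height of a Quake-3-style jump at a given com_maxfps.

    Per frame: gravity is applied trapezoidally to the position (as in
    PM_StepSlideMove), then vertical velocity is rounded to an integer
    (mimicking trap_SnapVector).  The frame time is a whole number of
    milliseconds, which is why specific fps values behave differently.
    """
    dt = round(1000 / fps) / 1000.0      # engine ticks in whole ms
    z, vz, peak = 0.0, jump_vel, 0.0
    while vz > 0 or z > 0:
        end_vz = vz - gravity * dt
        z += (vz + end_vz) * 0.5 * dt    # trapezoidal position update
        vz = float(round(end_vz))        # integer velocity snapping
        peak = max(peak, z)
        if z < 0:
            break
    return peak

# At 100fps gravity works out exactly; at 125fps the rounding leaks a
# little upward velocity every 8ms frame; at 333fps (3ms frames) the
# leak is proportionally bigger still, so jumps get even higher.
print(max_jump_height(100), max_jump_height(125), max_jump_height(333))
```

In this model the rounding makes 333fps jumps noticeably higher than 125fps ones, which lines up with the well-known com_maxfps 333 trick jumps and with the "bugged beyond 125" statement above.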
My Samsung SyncMaster 959NF does 800x600@160Hz, but I play at 1024x768@120Hz because the difference is really small. The smoothest setup I know is com_maxfps 320 with 160Hz, which you can only do on localhost (at least in Q3). The difference with that setup is like going from 60Hz to 120Hz.
I read on a couple of sites that the human eye can't really see a difference above 90Hz. I'm not sure if that's true. All I know is that going from 60Hz to 120Hz was quite a big change for me, and I hate even spectating on my laptop, which is only 60Hz.
(Not to mention the input lag on the old LCD was huge; this new 120Hz BenQ is awesome, less than 4ms.)
The eye is just the lens; it's the brain that counts. Maybe a human can't consciously distinguish 90Hz from anything above it, but it might well make a difference subconsciously.