your two eyes can only see 30 frames per second. therefore, if you play a game at 20000 fps, it will only SEE 30. fps isn't everything, yeah? yeah.
That's probably the most irrelevant fact I've ever heard.
Well, technically you can see a difference. Your eyes can see up to 60, I believe, and at 100+ fps you can tell a difference (smoother); past 120 or so you can't really tell unless you are an uber-hardware freak.
Well, I can tell you with full confidence that your eyes can see up to 100 fps, my friend.
http://amo.net/NT/02-21-01FPS.html
Look there, your eyes can see well past 200 fps even!
That's why it's a waste buying super video cards (besides the graphical improvements), because you don't even notice the increase in fps.
I think what he either meant or misunderstood is that you can start to see each INDIVIDUAL frame when it hits 30 fps.
leetkyle wrote:
your two eyes can only see 30 frames per second. therefore, if you play a game at 20000 fps, it will only SEE 30. fps isn't everything, yeah? yeah.

If you can't see a difference between 30 and 80 fps, then you should visit a neurologist you trust.
Fadediesel wrote:
That's why it's a waste buying super video cards (besides graphical improvements), because you don't even notice the increase in fps.

"Super video cards," as you put it, not only affect fps but total picture quality as well. For example, I have an ATI 9800 Pro, and on medium settings I get about 20-30 fps... manageable, but with no blurring effect it looks choppy at times (usually busy moments with a lot of movement on screen... which is a bad time to be choppy). At low settings I get about 50-60 fps and have no problems with chop. If I had a better... or even a "super" video card, I'd be able to put my settings higher and still get better fps than 20-30.
I think something that's not being taken into account here is that... hypothetically, let's say the human eye CAN only see about 30 fps. Now let's also say your game is running at 30 fps... sounds perfect, right? Should be no problem. But my guess is that the refresh timing of your game and the refresh timing of your eyes are not perfectly in sync. That would make for a pretty noticeable lack of fluidity, wouldn't it?
Regardless, high fps = nothing bad. Low fps = anywhere from borderline acceptable to major lag and chop problems, probably leading to more deaths.
HIGH FPS FTW!
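The sync-mismatch point above is easy to demonstrate with a hypothetical sketch (the numbers here are mine, not from the thread): when the content frame rate doesn't divide evenly into the display's refresh rate, frames end up on screen for uneven lengths of time. The classic case is 24 fps film on a 60 Hz display, which falls into a 3:2 cadence:

```python
from collections import Counter

REFRESH_HZ = 60    # hypothetical display refresh rate
CONTENT_FPS = 24   # hypothetical content frame rate (film-style)

# For each display refresh in one second, which content frame is the
# newest one produced so far?
shown = [n * CONTENT_FPS // REFRESH_HZ for n in range(REFRESH_HZ)]

# Count how many consecutive refreshes each frame stays on screen.
counts = Counter(shown)
cadence = [counts[f] for f in sorted(counts)]
print(cadence[:6])  # [3, 2, 3, 2, 3, 2] -> uneven frame pacing, i.e. judder
```

Every frame is "delivered," but some sit on screen 50% longer than others, which reads as stutter even at a nominally sufficient frame rate.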
FPS in games is ever-changing. You might have a card that runs at 150 fps most of the time, but when the moment is crucial and you're in a fight with arty exploding around you, the frames can dip way low, and those with the good cards won't lag out at that time (the worst time to lag). It leaves some gamers dead, waiting to spawn, pondering the difference between in-game fps and real-life fps.
Last edited by Kmarion (2006-10-24 09:43:36)
Xbone Stormsurgezz
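One way to see why those dips hurt so much more at the low end: convert fps to frame time. A quick sketch (the 150 fps figure is from the post above; the dip targets are my own illustrative numbers) shows that losing 100 fps from a high baseline adds less delay per frame than losing just 30 fps from a low one:

```python
# Frame time (milliseconds per frame) is what you actually feel; fps is
# its reciprocal, so an fps dip costs far more frame time at the low end.
def frame_time_ms(fps):
    return 1000.0 / fps

for hi, lo in [(150, 50), (60, 30)]:
    delta = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{hi} -> {lo} fps: each frame takes {delta:.1f} ms longer")
```

Dropping 150 → 50 fps adds about 13.3 ms per frame; dropping 60 → 30 fps adds about 16.7 ms, despite being a much smaller fps loss. That nonlinearity is why headroom matters.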
lawl, you guys don't understand how FPS works in a game.
It's not like an animation. If you make an animation run at a higher FPS, it goes faster...right?
The FPS in a game is how fast the computer can RENDER those animations. If it is only allowed to render 30 frames per second, you will lag because it has to wait. If it can render 100 frames per second, then you won't lag because it will have all those frames ready.
The actual animations are at a set FPS and cannot be changed.
Kmarion wrote:
FPS in games are ever changing. If you have a card that runs 150 fps most of the time but when the time is crucial and you are in a fight with arty exploding around you the frames could dip way low and those with the good cards will not lag out at that time (the worst time to lag). It will leave some gamers dead waiting to spawn pondering the difference in game fps and real life fps.

Actually, that "dip" can happen even on really good cards.
Say you can run the game at 100% maxed settings. Say you get 200 fps while standing still with nothing happening.
Say you then get into a jet and head to a highly populated area with a lot of graphics and a lot happening (arty, explosions, lots of people, sounds, etc.), and you drop to 120 fps. That 80 fps drop is going to cause a sudden stutter; it won't last, but you will notice it.
Therefore, if you keep your game locked at around 90-100 fps, it will almost never drop, and you won't get lag.
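Locking the frame rate like that is usually done with a frame-budget loop: render, then sleep out whatever is left of the frame's time slice. A minimal sketch, assuming no particular engine (the 100 fps target is just the number from the post above):

```python
import time

TARGET_FPS = 100                 # cap from the post above
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted to each frame

def run_capped(frames):
    """Render `frames` frames, sleeping out the unused part of each budget."""
    durations = []
    deadline = time.perf_counter()
    for _ in range(frames):
        start = time.perf_counter()
        # ... game update + render work would go here ...
        deadline += FRAME_BUDGET
        spare = deadline - time.perf_counter()
        if spare > 0:            # finished early: wait so we don't exceed the cap
            time.sleep(spare)
        durations.append(time.perf_counter() - start)
    return durations

if __name__ == "__main__":
    times = run_capped(10)
    print(f"average frame time: {sum(times) / len(times) * 1000:.1f} ms")
```

Because the loop only ever sleeps (it never speeds work up), the cap flattens out the fast frames so a dip from "fast" to "merely fast" is far less visible.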
I don't care, I have 130 fps with everything on high, 6x AA and 1280×1024
elmo1337 wrote:
I dont care I have 130 fps at everything high, 6x aa and 1280×1024

O'realy.... lol
Guys, movies are on film, and that creates natural motion blur. That said, it's fast enough that the brain can compensate and fill in anything that's missing. Also, movies are shown at 24-30 fps, and it's constant and never changes.
With a video game you have 100% independently rendered digital images, so there is no motion blur to compensate. As noted previously, having higher fps than is really necessary for smooth gameplay is just so that when things get ugly, you don't lag out when the fps drops to what would be considered unplayable levels.
Lastly, it's about the way the game feels. I can certainly "feel" the difference in how things look and how my mouse tracks when my fps is in the 30s, 40s, 50s, and 60s. If I can maintain a good 60+ fps, I'm quite happy. I still find many games playable down into the mid-40s, but anything lower than that becomes unacceptable in look and feel.