I don't have the best computer in the world (3 GHz processor, 1 GB RAM, Radeon X600 256 MB graphics). I can get FPS from about 5 to 60 or so, depending on my settings. How low can they go before it starts becoming a problem? I average about 30 at the moment, but sometimes that doesn't quite seem enough.
I like to keep it above 50 or so; even so, I can play on all high with a 7600GT.
You want the average to be above 30 (minimum) or 40 at least. I THINK anything above 70 or 80 FPS is indiscernible to the human eye... or something like that.
You would need at least 30 (that's the maximum FPS your eyes would recognize). However, there is a difference between recognizing and feeling. So 30 FPS is recognizable, but it would not look smooth. Try to keep it above 45-50. However, if you can, try to keep it above 60.
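A quick way to see why 30 is "recognizable but not smooth" is to turn FPS into frame time, i.e. how long each frame actually sits on the screen. A minimal sketch in Python (the FPS values below are just examples):

```python
# A minimal sketch: convert FPS into frame time (how long each frame stays on screen).
# At 30 FPS every frame sits there for ~33 ms; at 60 FPS only ~17 ms,
# which is a big part of why the same scene feels smoother.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (24, 30, 45, 60, 70, 100):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
```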
I aimed for about 40 on my laptop's 7600. Just make sure you have the view distance at full when you set them, or you will have a major disadvantage.
Try here for more info: http://www.100fps.com/how_many_frames_c … ns_see.htm
Don't post stupid numbers without info to back it up, guys.
And remember that the Hz you're running at is what is actually being displayed, so if your monitor only supports 60 Hz, anything above that is really meaningless.
For 50-60 I'm gonna need all low... great.
Does the resolution make a difference to FPS?
The higher the resolution, the harder your card has to work. So you should see an improvement in performance with a lower resolution.
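Roughly speaking, the card has to shade every pixel of every frame, so the pixel count is a decent back-of-the-envelope proxy for the extra work. A quick sketch in Python (example resolutions only; real scaling isn't perfectly linear in pixel count):

```python
# Rough sketch: pixels the card has to shade per second at a target frame rate.
# Real performance doesn't scale exactly with pixel count, but it shows why
# dropping the resolution takes load off the GPU.
def pixels_per_second(width, height, fps):
    return width * height * fps

target_fps = 60  # example target
for w, h in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200)]:
    mpix = pixels_per_second(w, h, target_fps) / 1e6
    print(f"{w}x{h} @ {target_fps} FPS -> {mpix:.0f} million pixels/s")
```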
To me, 30s are to be avoided, 40s are slow but tolerable, 50+ is best. Just remember that the guys shooting for high frame rates don't actually need what they get at the max when they're pushing 80+; it's just that with those higher frame rates, your lows when the shit hits the fan usually stay in the more desirable range.
The most you will need is 70 FPS. Anything after 70 cannot be noticeable because the human eye can only see up to 70 FPS.
DeCon_1 wrote:
The most you will need is 70 FPS. Anything after 70 cannot be noticeable because the human eye can only see up to 70 FPS.
Hell yes it's noticeable... if your highest FPS is 70 then your lowest can still drop to 20 or less at times. If your highest FPS is 150 then your lowest may only get down to 70 or 80. The difference between lows of 20 FPS and lows of 70 FPS is extremely noticeable.
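A rough way to see why the lows matter more than the average: log some frame times and the average can look perfectly fine while the worst few frames (the arty-strike moments) are what you actually feel. A small sketch in Python with made-up numbers:

```python
# Made-up frame-time log (ms): mostly smooth, with a few spikes during heavy action.
frame_times_ms = [14.0] * 95 + [60.0] * 5

# Average FPS looks healthy...
avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# ...but the "1% low" (the slowest 1% of frames) is what the firefights feel like.
worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
low_fps = 1000.0 / (sum(worst) / len(worst))

print(f"average: {avg_fps:.0f} FPS, 1% low: {low_fps:.0f} FPS")
```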
The human eye cannot see above 70 FPS, so it is not necessary to have the fastest frame rate, just the best picture quality. If the FPS can be sustained between 60 and 70, that is all you really need.
I like to keep it at least 40-50. It's good if it's around 60.
"people in ny have a general idea of how to drive. one of the pedals goes forward the other one prevents you from dying"
ShotYourSix wrote:
Hell yes it's noticeable... if your highest FPS is 70 then your lowest can still drop to 20 or less at times. If your highest FPS is 150 then your lowest may only get down to 70 or 80. The difference between lows of 20 FPS and lows of 70 FPS is extremely noticeable.
I don't know what kind of card you're running, but my laptop's 7600 dips maybe 15 FPS maximum during gameplay. In fact, setting it to 45 on the ground gets me near 100 when I fly. The only time it really dips down to 30 is during an arty strike.
Also, BF2 has a frame limiter at 100.
Simply set your settings to stay above 30 FPS and you should be good. If you notice it drops below that, lower your settings or accept it; if you're in the middle of an arty strike you're screwed anyhow.
And yeah, that X600 is going to have to work pretty damned hard to run the game at playable speeds on any setting.
So it is the card letting me down. Great. When I bought this comp I didn't know much about graphics cards, and thought all 256MB cards were the same.
Ok, next question - is it better to use 1100x800 (roughly) resolution or 1200x800 with 2x anti-aliasing?
Well, the human eye sees at 24 FPS, so you really only need like 30 in a video game.
Superior Mind wrote:
Well, the human eye sees at 24 FPS, so you really only need like 30 in a video game.
There is a notable difference between a firefight at 24 FPS and a firefight at 70.
30 and up you will not notice a difference... soooooooooooooooooo at least 30.
Hurricane wrote:
There is a notable difference between a firefight at 24 FPS and a firefight at 70.
It shouldn't be any different in a firefight, unless the frame rate changes during the firefight. If there is no difference when you're riding your Hummer down a highway, there is no difference when you're running around Karkand dodging bullets.
The human eye doesn't see at 24; TVs and such are usually displayed at 24 FPS, and it's the lowest number of frames per second the human eye recognizes. 60 FPS is the most a trained eye notices at a steady rate. If you stared at a constant 60 FPS for 3 minutes or whatnot, and then it slowly transitioned into 70+ FPS for 3 minutes, you would not notice a difference.
Talon wrote:
So it is the card letting me down. Great. When I bought this comp I didn't know much about graphics cards, and thought all 256MB cards were the same. Ok, next question - is it better to use 1100x800 (roughly) resolution or 1200x800 with 2x anti-aliasing?
Yes, although it has 256MB of memory, its GPU (graphics processing unit) is too slow to use it effectively. Cards used to be measured by the amount of memory, but now it's much more dependent on the core and memory speeds.
I'm going to guess that the card is either on a PCIe x16 bus or in a laptop?
As for the second question:
The smaller the resolution, the better the performance, regardless of the AA. What you really should find out is your monitor's native resolution.
Then what you can do is set BF2 to run at that resolution, or a similar one with the same aspect ratio, so it won't stretch.
This guide should help you optimize your BF2 experience as well as give you the steps involved in forcing a resolution.
http://www.tweakguides.com/BF2_1.html
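If it helps, checking whether a given BF2 resolution keeps your monitor's native aspect ratio (so nothing gets stretched) is just a ratio comparison. A rough sketch in Python; the resolutions below are only examples, so plug in your own panel's native one:

```python
from math import gcd

def aspect(width, height):
    """Reduce a resolution to its simplest ratio, e.g. 1280x1024 -> (5, 4)."""
    g = gcd(width, height)
    return width // g, height // g

native = (1280, 1024)  # example native resolution -- use your own panel's
candidates = [(800, 600), (1024, 768), (1280, 960), (1280, 1024)]

for w, h in candidates:
    same = aspect(w, h) == aspect(*native)
    note = "same ratio, no stretching" if same else "will stretch"
    a, b = aspect(w, h)
    print(f"{w}x{h} ({a}:{b}) -> {note}")
```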
Superior Mind wrote:
Well, the human eye sees at 24 FPS, so you really only need like 30 in a video game.
So many people seem to think they know the FPS the human eye "sees at" (and it always seems to be a number they heard someone mention once in passing, yet they can never agree on it). In reality, it's so complex that even modern science has no clear-cut answer to the question. In spite of this, there are plenty of people on gaming forums who "seem" to have it all figured out.
http://www.100fps.com/how_many_frames_c … ns_see.htm
Maintaining ultra-high frame rates won't be quite as important once motion blur establishes itself, but 60 FPS is the sweet spot for modern gaming. If you can maintain an average of 60, you're pretty much guaranteed not to run into any jerky movement. That said, some of the newer games run at 30 on my machine and I'm pretty happy with it.
If I'm not mistaken, all LCDs run at 60 Hz, right? And since Hz is cycles per second (I'm assuming FPS in this case), can't an LCD monitor only display 60 FPS? If you have a CRT it can probably display more. But I played down at like 30-60ish on all high at 1600x1200 with my 6800 GT and it was fine.
CommieChipmunk wrote:
If I'm not mistaken, all LCDs run at 60 Hz, right? And since Hz is cycles per second (I'm assuming FPS in this case), can't an LCD monitor only display 60 FPS? If you have a CRT it can probably display more. But I played down at like 30-60ish on all high at 1600x1200 with my 6800 GT and it was fine.
Exactly.
Hz = Cycles/second = FPS
Most games only want to put out about 60 Hz as a default, so anything over that just gets thrown out. So say you're running 100 FPS; 40 of those frames you just rendered wouldn't get drawn to the screen. 60 is pretty much the top end anyone could conceivably need, although games, IMO, are playable down to 30 FPS.
So if your hardware only draws at "X" Hz, then you really can't utilize more than "X" FPS.
Just go with what seems to work for you; if you find it too slow, just readjust the video settings.
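To put the Hz point from a few posts up into numbers: with vsync, the monitor only ever shows its refresh rate, so frames rendered above that never reach the screen. A tiny sketch in Python (example figures, and assuming vsync is on; without vsync the extra frames mostly just show up as tearing):

```python
# A tiny sketch, assuming vsync is on: the panel only shows its refresh rate,
# so anything rendered above that never reaches the screen.
def displayed_fps(rendered_fps, refresh_hz):
    return min(rendered_fps, refresh_hz)

refresh_hz = 60  # typical LCD refresh, as discussed above
for rendered in (30, 60, 100, 150):
    shown = displayed_fps(rendered, refresh_hz)
    wasted = max(0, rendered - shown)
    print(f"rendering {rendered} FPS on a {refresh_hz} Hz panel -> "
          f"{shown} shown, ~{wasted} frames/s never displayed")
```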