Re-use the rest of your computer.
Trust me, he does know my computer inside and out
Freezer7Pro wrote:
With the DDR prices of today, not really. Especially not one 2GB. Perhaps that.
TheAussieReaper wrote:
Wouldn't a single stick of 2gb ram also be a good idea freezer?
I somewhat doubt that, and http://www.newegg.com/Product/Product.a … 6814131097
War Man wrote:
I like that, I gotta discuss this with my brother and see his opinion. He knows my computer more than you guys after all.
Mekstizzle wrote:
No
War Man wrote:
Edit: is this what you're referring to? http://www.bestbuy.com/site/olspage.jsp … 3815723741
This
Have you heard him snickering about it behind closed doors?
War Man wrote:
Trust me, he does know my computer inside and out
Screw the CPU, not worth the effort. And the RAM, I'd say it's not worth it either. But you can't go wrong with some good RAM; then again, you already have 1.5GB, which is why I'm hesitant to say go for it. That's up to you.
DeathUnlimited wrote:
For the RAM: go for this: http://www.newegg.com/Product/Product.a … 6820145579
You will not find a better deal for new ram. Then I'd say go hunting for a second-hand AGP 7600GT or 7600GS and possibly a Socket A 3200+ processor.
needs ram too
Logic would dictate that you ask him then
War Man wrote:
Trust me, he does know my computer inside and out
Half of his ram is 333MHz. 2.5GB of DDR400 will make a good difference over 1.5GB of DDR333. The CPU will not make a lot of difference, that is true.
Mekstizzle wrote:
Screw the CPU, not worth the effort. And the RAM, I'd say it's not worth it either. But you can't go wrong with some good RAM; then again, you already have 1.5GB, which is why I'm hesitant to say go for it. That's up to you.
DeathUnlimited wrote:
For the RAM: go for this: http://www.newegg.com/Product/Product.a … 6820145579
You will not find a better deal for new ram. Then I'd say go hunting for a second-hand AGP 7600GT or 7600GS and possibly a Socket A 3200+ processor.
As for the GPU: if you're not going to go full whack and get a HD 3850, go for that HD 3650 at least.
Last edited by DeathUnlimited (2008-11-13 15:25:08)
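To put rough numbers on the DDR-333 vs DDR-400 point, here's a back-of-envelope Python sketch. It only computes theoretical peak bandwidth (real-world gains are smaller); the 166MHz/200MHz bus clocks are the standard figures for those grades:

```python
# Back-of-envelope comparison of DDR-333 vs DDR-400: theoretical peak
# bandwidth of one 64-bit DDR channel. Real-world gains are smaller, and
# mixing speeds typically drops every stick to the slowest one, which is
# the whole argument for replacing the DDR-333 sticks.

def peak_bandwidth_mb_s(bus_clock_mhz):
    """Peak MB/s: bus clock * 2 transfers per cycle (DDR) * 8 bytes per transfer."""
    return bus_clock_mhz * 2 * 8

ddr333 = peak_bandwidth_mb_s(166)  # ~2656 MB/s, marketed as PC2700
ddr400 = peak_bandwidth_mb_s(200)  # 3200 MB/s, marketed as PC3200

print(f"DDR-333: {ddr333} MB/s, DDR-400: {ddr400} MB/s")
print(f"DDR-400 headroom over DDR-333: {ddr400 / ddr333 - 1:.0%}")  # ~20%
```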
I know the 36 computers in school inside out, my own 4 active computers inside out, I know over 20 random people's computers inside out, and I even know your computer inside out.
War Man wrote:
Trust me, he does know my computer inside and out
Freezer7Pro wrote:
With the DDR prices of today, not really. Especially not one 2GB. Perhaps that.
TheAussieReaper wrote:
Wouldn't a single stick of 2gb ram also be a good idea freezer?
I somewhat doubt that, and http://www.newegg.com/Product/Product.a … 6814131097
War Man wrote:
I like that, I gotta discuss this with my brother and see his opinion. He knows my computer more than you guys after all.
His mobo only has 3 RAM slots.
DeathUnlimited wrote:
Half of his ram is 333MHz. 2.5GB of DDR400 will make a good difference over 1.5GB of DDR333. The CPU will not make a lot of difference, that is true.
Mekstizzle wrote:
Screw the CPU, not worth the effort. And the RAM, I'd say it's not worth it either. But you can't go wrong with some good RAM; then again, you already have 1.5GB, which is why I'm hesitant to say go for it. That's up to you.
DeathUnlimited wrote:
For the RAM: go for this: http://www.newegg.com/Product/Product.a … 6820145579
You will not find a better deal for new ram. Then I'd say go hunting for a second-hand AGP 7600GT or 7600GS and possibly a Socket A 3200+ processor.
As for the GPU: if you're not going to go full whack and get a HD 3850, go for that HD 3650 at least.
Last edited by Freezer7Pro (2008-11-13 15:28:19)
Last edited by Stimey (2008-11-13 15:28:18)
Those 2x1GB + the 512MB corsair in his comp equal 2.5GB to me.
Freezer7Pro wrote:
His mobo only has 3 RAM slots.
Ahh fuck I fail. Thought you were talking about buying another 1GB.
DeathUnlimited wrote:
Those 2x1GB + the 512MB corsair in his comp equal 2.5GB to me.
Freezer7Pro wrote:
His mobo only has 3 RAM slots.
In the defence of some of the earlier posters (myself included) - well, I guess I can only really speak for myself here - but it was just a bit of humor to lighten the mood whilst we waited for more info from War Man.
Parker wrote:
btw, this whole elitist attitude going on in the tech section really sucks.
Last edited by Scorpion0x17 (2008-11-13 16:07:08)
That mobo has 3 slots, like NF2 mobos tend to have. The problem in this case is mostly the fact that 2/3 of the ram is DDR-333. I was suggesting to buy a cheap 2x1GB kit of DDR-400 and use it with the 512MB DDR-400 stick that's already inside. 2.5GB of DDR-400 would totally be sufficient and far better than 1.5GB of DDR-333.
Scorpion0x17 wrote:
Now, 1.5GB RAM, whilst, technically, enough for a lot of things, is on the low side - I cba to check, but I'm pretty sure your mobo will have four RAM slots available - so, the cheapest option there would be to purchase another 512MB, to bring you up to 2GB - that's more than enough to run BF2 with no problems and Windows will love you for it.
The best AGP nvidia card is the 7800GS; that and the 7600-series would probably be a good spot to look for on ebay. For ATI, the HD3850 is the best AGP card, but I wouldn't suggest getting it. While it is good, it would be bottlenecked hard by the CPU, and thus would not perform much, if any, better than a card half the price. Thus a pointless card to get. For the ATI range I'd say either a HD3650 or an X1950 card. The HD3650 would give DX10, but that rig is not going to run anything in DX10, so that's not a real reason to favor either. I'd get whichever is cheaper on ebay. Even an X1650 would be a legitimate choice.
Scorpion0x17 wrote:
OK, now probably the biggest difference you're going to be able to make is to buy a new AGP video card - just make sure it's AGP and you can't really go wrong - if you want to stick with ATI (which actually I think you may have to - I've not seen an AGP nVidia card for ages, even though I believe AGP is still an option on some of their models), get anything with a HD in front of the model number, and, most importantly - pay attention to the third digit from the right in the model number - this indicates, in general, the raw grunt of the card - so, with your current card - an X700 - it's the 7 that is the important digit - so, for example (and I'm not suggesting you get this particular model, just using it as an example) a HD 4850 provides more raw grunt than your X700, alongside having a whole host of new and shiny features that your card doesn't have.
Might say it that way.
Scorpion0x17 wrote:
So, the biggest decision you need to make is how much money you want to spend.
Last edited by DeathUnlimited (2008-11-13 15:53:23)
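Scorpion0x17's third-digit rule of thumb is easy enough to sketch out. A quick illustration (this is only the forum heuristic, not an official naming spec, and performance_tier is a made-up helper name):

```python
# Sketch of the "third digit from the right" rule of thumb for ATI-style
# model numbers: the leading digit(s) mark the generation/feature set,
# the third digit from the right marks the raw-performance tier within it.
# This is only the heuristic discussed above, not an official scheme.

def performance_tier(model: str) -> int:
    digits = "".join(ch for ch in model if ch.isdigit())
    return int(digits[-3])  # third digit from the right

for card in ("X700", "HD 3650", "HD 3850", "X1950", "HD 4850"):
    print(f"{card}: tier {performance_tier(card)}")
```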
Yeah, just noticed that. My NF2 mobo has four slots. But, then, I'm hardcore.
DeathUnlimited wrote:
That mobo has 3 slots, like NF2 mobos tend to have.
Scorpion0x17 wrote:
Now, 1.5GB RAM, whilst, technically, enough for a lot of things, is on the low side - I cba to check, but I'm pretty sure your mobo will have four RAM slots available - so, the cheapest option there would be to purchase another 512MB, to bring you up to 2GB - that's more than enough to run BF2 with no problems and Windows will love you for it.
Yep, and thus it's most worthy to go for some of the cheaper cards.
Scorpion0x17 wrote:
Yeah, just noticed that. My NF2 mobo has four slots. But, then, I'm hardcore.
DeathUnlimited wrote:
That mobo has 3 slots, like NF2 mobos tend to have.
Scorpion0x17 wrote:
Now, 1.5GB RAM, whilst, technically, enough for a lot of things, is on the low side - I cba to check, but I'm pretty sure your mobo will have four RAM slots available - so, the cheapest option there would be to purchase another 512MB, to bring you up to 2GB - that's more than enough to run BF2 with no problems and Windows will love you for it.
Anyhoo, in that case, yeah, whip out the two slower of the 512s and put in 2x1GBs.
On the subject of bottlenecking - every AGP card available that's worth upgrading to is going to be bottlenecked by his CPU - even if he gets a spanky XP3200+ it'll bottleneck.
In that there's no point spending more money than necessary, yes.
DeathUnlimited wrote:
Yep, and thus it's most worthy to go for some of the cheaper cards.
Scorpion0x17 wrote:
Yeah, just noticed that. My NF2 mobo has four slots. But, then, I'm hardcore.
DeathUnlimited wrote:
That mobo has 3 slots, like NF2 mobos tend to have.
Anyhoo, in that case, yeah, whip out the two slower of the 512s and put in 2x1GBs.
On the subject of bottlenecking - every AGP card available that's worth upgrading to is going to be bottlenecked by his CPU - even if he gets a spanky XP3200+ it'll bottleneck.
Last edited by Scorpion0x17 (2008-11-14 08:57:45)
Vista on a 2500+? Are you mad?
Scorpion0x17 wrote:
In that there's no point spending more money than necessary, yes.
DeathUnlimited wrote:
Yep, and thus it's most worthy to go for some of the cheaper cards.
Scorpion0x17 wrote:
Yeah, just noticed that. My NF2 mobo has four slots. But, then, I'm hardcore.
Anyhoo, in that case, yeah, whip out the two slower of the 512s and put in 2x1GBs.
On the subject of bottlenecking - every AGP card available that's worth upgrading to is going to be bottlenecked by his CPU - even if he gets a spanky XP3200+ it'll bottleneck.
But, the whole bloody reason both nVidia and ATI bring out new cards all the bleeding time is that those new cards have features that the old ones simply didn't have - for example DX10.
Now, granted his system ain't gonna handle the DX10 mega games - but his system will run Vista, and therefore it will run DX10, and more and more not-so-mega games will be coming out supporting DX10, so it is worth considering.
It's not just about how fast the card goes. In fact it's hardly at all about how fast the card goes - hence the reason it's so important to take note of the third-digit-from-the-right in the model number - if, like me, you own an X800 XT PE, and you then go and buy a HD 3400, you're gonna wish you hadn't - not because the HD3400 is not a vastly superior card - 'cos it is, but because it's just slower - it just doesn't have the same raw grunt as an X800XT PE, even though it can do more things. But a 3800 would blow your mind, in comparison.
(edit: just read that back and it sounds like I contradict myself, but I don't... oh how I love GPUs...)
I can't wait for them to start using the greek alphabet instead of numbers...
Freezer7Pro wrote:
Vista on a 2500+? Are you mad?
Scorpion0x17 wrote:
In that there's no point spending more money than necessary.
DeathUnlimited wrote:
Yep, and thus it's most worthy to go for some of the cheaper cards.
But, the whole bloody reason both nVidia and ATI bring out new cards all the bleeding time is that those new cards have features that the old ones simply didn't have - for example DX10.
Now, granted his system ain't gonna handle the DX10 mega games - but his system will run Vista, and therefore it will run DX10, and more and more not-so-mega games will be coming out supporting DX10, so it is worth considering.
It's not just about how fast the card goes. In fact it's hardly at all about how fast the card goes - hence the reason it's so important to take note of the third-digit-from-the-right in the model number - if, like me, you own an X800 XT PE, and you then go and buy a HD 3400, you're gonna wish you hadn't - not because the HD3400 is not a vastly superior card - 'cos it is, but because it's just slower - it just doesn't have the same raw grunt as an X800XT PE, even though it can do more things. But a 3800 would blow your mind, in comparison.
(edit: just read that back and it sounds like I contradict myself, but I don't... oh how I love GPUs...)
And the general rule for current-gen cards is to take the numbers that come after the first and aren't 0s into consideration. I find that wording easier to get.
shaders, dx10.1 and 55nm are things I don't consider to be 'raw grunt' - they're just 'fancy features' - the raw grunt of a card comes from the clock speed of the GPU/RAM.
Freezer7Pro wrote:
Also, I find that ATI's late releases haven't been much but more raw grunt. 4870 = 3870 with 480 more shaders = 2900XT with DX10.1 and 55nm tech.
No, shaders are raw grunt. Take for example the 8800GT and 8800GTS G92. If you clock an 8800GT to exactly the same speeds as an 8800GTS G92, it'll be about 13% slower, due to the lack of those extra 16 shader units. Shaders matter more than clock speeds nowadays. If I clock my 2400Pro up to the same clock speeds as my 3870 (and believe me, I have), it is still less than 1/10th of the speed. Why? Because my 3870 has 320 shaders, and my 2400Pro has 20. If you wish, you could basically rate a card's performance on the amount of shaders * clock speed.
Scorpion0x17 wrote:
I can't wait for them to start using the greek alphabet instead of numbers...
Freezer7Pro wrote:
Vista on a 2500+? Are you mad?
Scorpion0x17 wrote:
In that there's no point spending more money than necessary.
But, the whole bloody reason both nVidia and ATI bring out new cards all the bleeding time is that those new cards have features that the old ones simply didn't have - for example DX10.
Now, granted his system ain't gonna handle the DX10 mega games - but his system will run Vista, and therefore it will run DX10, and more and more not-so-mega games will be coming out supporting DX10, so it is worth considering.
It's not just about how fast the card goes. In fact it's hardly at all about how fast the card goes - hence the reason it's so important to take note of the third-digit-from-the-right in the model number - if, like me, you own an X800 XT PE, and you then go and buy a HD 3400, you're gonna wish you hadn't - not because the HD3400 is not a vastly superior card - 'cos it is, but because it's just slower - it just doesn't have the same raw grunt as an X800XT PE, even though it can do more things. But a 3800 would blow your mind, in comparison.
(edit: just read that back and it sounds like I contradict myself, but I don't... oh how I love GPUs...)
And the general rule for current-gen cards is to take the numbers that come after the first and aren't 0s into consideration. I find that wording easier to get.
shaders, dx10.1 and 55nm are things I don't consider to be 'raw grunt' - they're just 'fancy features' - the raw grunt of a card comes from the clock speed of the GPU/RAM.
Freezer7Pro wrote:
Also, I find that ATI's late releases haven't been much but more raw grunt. 4870 = 3870 with 480 more shaders = 2900XT with DX10.1 and 55nm tech.
Last edited by Freezer7Pro (2008-11-14 09:21:48)
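On paper, Freezer7Pro's 13% figure does fall out of his shaders * clock heuristic: the 8800GT has 112 stream processors to the 8800GTS G92's 128, both well-known specs. A quick check (the heuristic itself is his rule of thumb, not a law):

```python
# Freezer7Pro's shaders * clock rule of thumb applied to the 8800GT vs
# 8800GTS G92 example. Shader counts (112 vs 128) are the cards' known
# specs; treating throughput as shaders * clock is only a heuristic.

def rough_throughput(shaders, core_clock_mhz):
    return shaders * core_clock_mhz

gt_at_gts_clock = rough_throughput(112, 650)  # 8800GT overclocked to GTS G92 core clock
gts_g92_stock   = rough_throughput(128, 650)  # 8800GTS G92 at stock

deficit = 1 - gt_at_gts_clock / gts_g92_stock
print(f"8800GT at GTS clocks trails by ~{deficit:.1%}")  # 12.5%, i.e. "about 13%"
```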
No. Shaders are not raw grunt. They're features.
Freezer7Pro wrote:
No, shaders are raw grunt. Take for example the 8800GT and 8800GTS G92. If you clock an 8800GT to exactly the same speeds as an 8800GTS G92, it'll be about 13% slower, due to the lack of those extra 16 shader units. Shaders matter more than clock speeds nowadays. If I clock my 2400Pro up to the same clock speeds as my 3870 (and believe me, I have), it is still less than 1/10th of the speed. Why? Because my 3870 has 320 shaders, and my 2400Pro has 20. If you wish, you could basically rate a card's performance on the amount of shaders * clock speed.
Scorpion0x17 wrote:
I can't wait for them to start using the greek alphabet instead of numbers...
Freezer7Pro wrote:
Vista on a 2500+? Are you mad?
And the general rule for current-gen cards is to take the numbers that come after the first and aren't 0s into consideration. I find that wording easier to get.
shaders, dx10.1 and 55nm are things I don't consider to be 'raw grunt' - they're just 'fancy features' - the raw grunt of a card comes from the clock speed of the GPU/RAM.
Freezer7Pro wrote:
Also, I find that ATI's late releases haven't been much but more raw grunt. 4870 = 3870 with 480 more shaders = 2900XT with DX10.1 and 55nm tech.
In theory, within the same series/core/etc. Don't kill me, fellow tech whores.
GPU processing units scale much, much, much better than CPU processing units. If they didn't scale that well, we'd only have graphics cards with one stream processor, not over 100. And I'm not talking 3DMark here. I'm talking everything. Look it up. An 8800GTS G92 on stock performs better than an 8800GT at GTS speeds.
Scorpion0x17 wrote:
No. Shaders are not raw grunt. They're features.
Freezer7Pro wrote:
No, shaders are raw grunt. Take for example the 8800GT and 8800GTS G92. If you clock an 8800GT to exactly the same speeds as an 8800GTS G92, it'll be about 13% slower, due to the lack of those extra 16 shader units. Shaders matter more than clock speeds nowadays. If I clock my 2400Pro up to the same clock speeds as my 3870 (and believe me, I have), it is still less than 1/10th of the speed. Why? Because my 3870 has 320 shaders, and my 2400Pro has 20. If you wish, you could basically rate a card's performance on the amount of shaders * clock speed.
Scorpion0x17 wrote:
I can't wait for them to start using the greek alphabet instead of numbers...
shaders, dx10.1 and 55nm are things I don't consider to be 'raw grunt' - they're just 'fancy features' - the raw grunt of a card comes from the clock speed of the GPU/RAM.
In theory, within the same series/core/etc. Don't kill me, fellow tech whores.
Just because a card gets a 13% lower 3DMark score, or even 13% lower framerates, it doesn't make it 13% slower.
You said it yourself - if you clock an 8800GT to exactly the same speeds as an 8800GTS, it is running at exactly the same speed.
Raw grunt = clock speed.
A card with more shader pipelines may seem faster, but it's not, it's just doing more.
It's like with multi-core CPUs - is a 3GHz QuadCore faster than a 3GHz DualCore?
No it is not.
But it can do more.
Last edited by Freezer7Pro (2008-11-14 09:38:22)
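Scorpion0x17's dual-vs-quad analogy in toy form: a hypothetical single-threaded task gains nothing from extra cores at the same clock, while a perfectly parallel batch scales with them. Idealized on purpose; real workloads land somewhere in between:

```python
# Toy model of "faster" vs "does more": at the same 3GHz per core, one
# serial task finishes no sooner on a quad than on a dual, but a perfectly
# parallel batch of work finishes in half the time.

WORK_UNITS = 9000
PER_CORE_RATE = 3000  # work units per second per core, stand-in for "3GHz"

def serial_task_seconds(cores):
    return WORK_UNITS / PER_CORE_RATE  # a single thread can't use extra cores

def parallel_batch_seconds(cores):
    return WORK_UNITS / (cores * PER_CORE_RATE)  # embarrassingly parallel batch

for cores in (2, 4):
    print(f"{cores} cores: serial task {serial_task_seconds(cores):.2f}s, "
          f"parallel batch {parallel_batch_seconds(cores):.2f}s")
```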
You're talking about a totally artificial measure of performance.
Freezer7Pro wrote:
GPU processing units scale much, much, much better than CPU processing units. If they didn't scale that well, we'd only have graphics cards with one stream processor, not over 100. And I'm not talking 3DMark here. I'm talking everything. Look it up. An 8800GTS G92 on stock performs better than an 8800GT at GTS speeds.
Scorpion0x17 wrote:
No. Shaders are not raw grunt. They're features.
Freezer7Pro wrote:
No, shaders are raw grunt. Take for example the 8800GT and 8800GTS G92. If you clock an 8800GT to exactly the same speeds as an 8800GTS G92, it'll be about 13% slower, due to the lack of those extra 16 shader units. Shaders matter more than clock speeds nowadays. If I clock my 2400Pro up to the same clock speeds as my 3870 (and believe me, I have), it is still less than 1/10th of the speed. Why? Because my 3870 has 320 shaders, and my 2400Pro has 20. If you wish, you could basically rate a card's performance on the amount of shaders * clock speed.
In theory, within the same series/core/etc. Don't kill me, fellow tech whores.
Just because a card gets a 13% lower 3DMark score, or even 13% lower framerates, it doesn't make it 13% slower.
You said it yourself - if you clock an 8800GT to exactly the same speeds as an 8800GTS, it is running at exactly the same speed.
Raw grunt = clock speed.
A card with more shader pipelines may seem faster, but it's not, it's just doing more.
It's like with multi-core CPUs - is a 3GHz QuadCore faster than a 3GHz DualCore?
No it is not.
But it can do more.
And I don't talk about MHz. I talk about performance.
The thing is, no-one uses "that form" of raw grunt; it doesn't matter. You're being too theoretical. It's what people use the cards for that matters, and that's what I'm talking about, namely rendering full 3D. Be it artificial or not, that's what matters.
Scorpion0x17 wrote:
You're talking about a totally artificial measure of performance.
Freezer7Pro wrote:
GPU processing units scale much, much, much better than CPU processing units. If they didn't scale that well, we'd only have graphics cards with one stream processor, not over 100. And I'm not talking 3DMark here. I'm talking everything. Look it up. An 8800GTS G92 on stock performs better than an 8800GT at GTS speeds.
Scorpion0x17 wrote:
No. Shaders are not raw grunt. They're features.
Just because a card gets a 13% lower 3DMark score, or even 13% lower framerates, it doesn't make it 13% slower.
You said it yourself - if you clock an 8800GT to exactly the same speeds as an 8800GTS, it is running at exactly the same speed.
Raw grunt = clock speed.
A card with more shader pipelines may seem faster, but it's not, it's just doing more.
It's like with multi-core CPUs - is a 3GHz QuadCore faster than a 3GHz DualCore?
No it is not.
But it can do more.
And I don't talk about MHz. I talk about performance.
Ok, say you wrote a program that just drew, say, 100,000 wireframe triangles on the screen.
That would measure raw grunt.
As soon as you start filling, texturing, lighting, or applying shaders to those triangles, then your test is becoming more and more artificial and less and less a measure of true grunt.
Last edited by Freezer7Pro (2008-11-14 09:52:34)
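The wireframe-triangle test Scorpion0x17 describes is simple enough to sketch. A rough version using PyOpenGL/GLUT (assuming PyOpenGL and numpy are installed; vertex arrays keep Python call overhead out of the draw loop, and vsync and drivers will still skew the numbers, so treat it as a relative measure between cards at best):

```python
# Rough sketch of the raw-grunt test described above: draw ~100,000
# untextured, unlit wireframe triangles per frame and report throughput.
# Assumes PyOpenGL and numpy are installed; numbers are crude at best.
import time
import numpy as np
from OpenGL.GL import *
from OpenGL.GLUT import *

N = 100_000
verts = np.random.uniform(-1.0, 1.0, (3 * N, 2)).astype(np.float32)
frames, t0 = 0, time.time()

def display():
    global frames, t0
    glClear(GL_COLOR_BUFFER_BIT)
    glDrawArrays(GL_TRIANGLES, 0, 3 * N)  # one batched draw call per frame
    glutSwapBuffers()
    frames += 1
    now = time.time()
    if now - t0 >= 1.0:  # report throughput roughly once a second
        print(f"{frames * N / (now - t0):,.0f} wireframe triangles/s")
        frames, t0 = 0, now
    glutPostRedisplay()

glutInit()
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB)
glutCreateWindow(b"raw grunt test")
glPolygonMode(GL_FRONT_AND_BACK, GL_LINE)   # wireframe only: no fill, no shading
glEnableClientState(GL_VERTEX_ARRAY)
glVertexPointer(2, GL_FLOAT, 0, verts)
glutDisplayFunc(display)
glutMainLoop()
```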