No, that's how it was in the dawn of 3D accelerators, before D3D became the standard for rendering everything.

Scorpion0x17 wrote:
Well, that's always happened since the dawn of 3D Accelerators. And it always will.

GC_PaNzerFIN wrote:
It does indeed have awesome looking graphics, but the fact that someone just forgets a very large group of people (now angry people) in favor of another group is unacceptable in the long term.

Bertster7 wrote:
Yes I have. It runs very badly. The fact that it's optimised for nVidia cards doesn't mean it isn't optimised. They do recommend playing it on an nVidia card after all. Your real complaint seems to be that it's not optimised for ATi cards, which is a fair point - just a different one.
For the graphics you get with Crysis it runs surprisingly well. There is nothing else that comes even remotely close to the graphics seen in Crysis, which is why it doesn't perform as well in general - not due to optimisation.
Hey, you need OUR card to run THIS game well and THEIR card to run THAT game well. Sounds ugly to me...
The idea of any hi-fi system is to reproduce the source material as faithfully as possible, and to deliberately add distortion to everything you hear (due to amplifier deficiencies) because it sounds 'nice' is simply not high fidelity. If that is what you want to hear then there is no problem with that, but by adding so much additional material (by way of harmonics and intermodulation) you have a tailored sound system, not a hi-fi. - Rod Elliot, ESP
Exactly 100% right.

Scorpion0x17 wrote:
Well, that's always happened since the dawn of 3D Accelerators. And it always will.
As for Crysis itself, well, I don't know how the full game compares, but I ran the Crysis demo on my AthlonXP 3200+ with an AGP X800XT PE.
Not on all highs, of course, but it looked amazing and ran smooth as a baby's bottom.
Crysis, like Far Cry before it, was designed not only to run on ageing hardware, but also on hardware that hadn't been released yet.
The problem is, people go out and buy a single GTX or 48xx and a pretty average CPU, and expect to be able to run Crysis on Ultra High and get a constant >100fps.
That just ain't gonna happen.
Does anyone remember playing Far Cry when it came out? Back when nVidia's FX series and ATi's R300 series were the best GPUs around. It didn't run well on them - in fact, with HDR turned on (though that wasn't included at launch), you needed a 7800GTX to get it running smoothly.
I like it when people design games that push the limits of hardware. I wish people would stop bitching about it.
Bollocks it is. The situation is slightly different today, but not much. There are loads of games that perform massively differently on GPUs from different vendors, Oblivion being another moderately recent example of something that performed way better on ATi cards.

Freezer7Pro wrote:
No, that's how it was in the dawn of 3D accelerators, before D3D became the standard for rendering everything.
Is there a way that you can switch between cards? Just buy the best Nvidia and the best ATi and then have a hardware switch or something. Change game, change card.
Or if you're rich, just switch computers.
I'm not complaining about the fact that Crysis is pushing the limits of current tech.
I'm complaining about the fact that it is pretty much an NVIDIA-only game. Every other game runs equally well on both manufacturers' GPUs.
The 8800GTX ran Crysis better than the much better performing HD 3870 tri-fire, for example. Of course, the 8800 got its ass kicked in every other game.
Crysis is the only game where I have run into this kind of problem. Every other "the way it's meant to be played" game has run just great on ATi as well, but the original Crysis was designed using only the 8800 series.
Assassin's Creed ran poorly on NVIDIA cards, so they had to remove DX10.1 from it with a patch, btw.
3930K | H100i | RIVF | 16GB DDR3 | GTX 480 | AX750 | 800D | 512GB SSD | 3TB HDD | Xonar DX | W8
So you're not complaining that Crysis isn't optimised. You're complaining that it's too optimised.

GC_PaNzerFIN wrote:
I'm not complaining about the fact that Crysis is pushing the limits of current tech.
I'm complaining about the fact that it is pretty much an NVIDIA-only game. Every other game runs equally well on both manufacturers' GPUs.
Huh? Now I don't get your point at all.

Bertster7 wrote:
So you're not complaining that Crysis isn't optimised. You're complaining that it's too optimised.
Warhead runs equally well on both camps because it is optimized for both. The original is not.
Warhead does not run equally well on both camps.

GC_PaNzerFIN wrote:
Warhead runs equally well on both camps because it is optimized for both. The original is not.
It runs far better on a 260GTX than on a 4870. We're talking 40-60% difference in frame rates here.
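A quick aside on the arithmetic: "X% faster" here typically just means the ratio of the two cards' average frame rates. A minimal sketch, with FPS numbers invented purely for illustration (not taken from any real benchmark):

Code:

def percent_faster(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

gtx260_fps = 36.0  # hypothetical GTX 260 average FPS
hd4870_fps = 24.0  # hypothetical HD 4870 average FPS

print(f"{percent_faster(gtx260_fps, hd4870_fps):.0f}% faster")  # prints: 50% faster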
No, the difference is 0-10%: http://www.techspot.com/article/118-cry … page3.html

Bertster7 wrote:
Warhead does not run equally well on both camps.
It runs far better on a 260GTX than on a 4870. We're talking 40-60% difference in frame rates here.
http://enthusiast.hardocp.com/article.h … VzaWFzdA==

GC_PaNzerFIN wrote:
No, the difference is 0-10%: http://www.techspot.com/article/118-cry … page3.html
As you can very clearly see, the GeForce GTX 260 has a distinct performance lead here. In fact, it averages 62% faster than the Radeon HD 4870. Also clearly visible is the fact that these settings are very playable on the GTX 260. On the other hand, the Radeon HD 4870 is not playable at all with these settings.
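For reference, one common way to arrive at an "averages 62% faster" figure is to take the mean of the per-test leads. A sketch of that arithmetic, with invented frame rates standing in for [H]'s actual data:

Code:

# Each entry: resolution -> (GTX 260 FPS, HD 4870 FPS); all values hypothetical.
tests = {
    "1280x1024": (48.0, 30.0),
    "1680x1050": (38.0, 23.0),
    "1920x1200": (29.0, 18.0),
}

# Per-test lead of the GTX 260 over the HD 4870, in percent.
leads = [(a / b - 1.0) * 100.0 for a, b in tests.values()]

print(f"average lead: {sum(leads) / len(leads):.0f}%")  # prints: average lead: 62%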
There must have been something horribly wrong with the test setup. Every other benchmark shows the 260 and 4870 being very close in performance. What drivers did [H] use? Some ATi Catalysts were badly bugged in Warhead, and a hotfix was available soon after.
http://www.guru3d.com/article/top-10-ga … 216-test/7
Using the most recent drivers as of mid-October.

GC_PaNzerFIN wrote:
There must have been something horribly wrong with the test setup. What drivers did [H] use?
They are using the older version of the 260GTX and the 512MB version of the 4870.
Yeah, and the ATi is the 512MB model. You understand the GTX 260 and the GTX 260 '216' don't have a 60% performance difference.

Bertster7 wrote:
Using the most recent drivers as of mid-October.
It is not the 216-core version of the 260 they are using.
Nor do the 4870 512MB and the 4870 1024MB.

GC_PaNzerFIN wrote:
Yeah, and the ATi is the 512MB model. You understand the GTX 260 and the GTX 260 '216' don't have a 60% performance difference.
It is not the 216-core version of the 260 they are using.
From the initial benchmarks you posted, you'll see that despite there being slightly less of a gap between the 260 and the 4870, there is still a big gap.
nVidia released new drivers between when the two articles were written. This could account for it.
As you'll notice, ATi's older hardware still performs badly on it.
Yeah, that must be it. It took a while longer to release the hotfix that closed the gap back to <10%.

Bertster7 wrote:
nVidia released new drivers between when the two articles were written. This could account for it.
Edit: I had 8800GTX SLI and an HD 3870 X2 when the original Crysis had been out for a while.
And look at the performance figures for the 3870 compared to the 8800GTX in Crysis and Warhead. The older ATi hardware does not fare well. Also, prior to the 1.1 patch, Crossfire performance was awful.

GC_PaNzerFIN wrote:
Yeah, that must be it. It took a while longer to release the hotfix that closed the gap back to <10%.
Edit: I had 8800GTX SLI and an HD 3870 X2 when the original Crysis had been out for a while.
I remember the Intellivision, and spent some of my college years tweaking out a Z80 on a breadboard.

Scorpion0x17 wrote:
When you remember the release of the 386, it's not.

unnamednewbie13 wrote:
I've been looking forward to Nehalem. The CPU I'd like is cheap enough, but the motherboard is not... /orly owl

Scorpion0x17 wrote:
What???
Core2 came out, what, 2-3 years ago.
That is NOT a long time ago.
It is. Computer electronics do not have a Twinkie's shelf-life.