I've been analysing the data on ATI and NVIDIA VGAs, and it seems that ATI's cards generally have higher (engine) clock frequencies. Doesn't that mean ATI cards are better?
Last edited by edmont; November 16, 2006 at 04:27 AM.
Make WAR not LOVE. In the GRIM DARK MEDIEVAL there's only WAR!!!
Originally Posted by edmont
No, clock frequency is just one factor in the total speed.
Other factors include the number of vertex/pixel/unified shaders, how much data each shader can process per cycle, the supported shader models, the bus width, and many other major or minor factors.
There is no single factor that can tell you how fast a certain card really is.
The only way to tell is by benchmarking it with real-world applications.
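To make the point concrete, here is a rough back-of-the-envelope sketch. The card specs and the throughput formula below are made-up illustrations, not real ATI or NVIDIA figures; actual performance depends on far more than peak shader throughput, which is exactly why benchmarks matter:

```python
# Rough theoretical shader throughput model: clock alone doesn't decide it.
# All numbers below are hypothetical, not real card specs.

def theoretical_gflops(clock_mhz, shader_units, ops_per_unit_per_cycle):
    """Peak throughput in GFLOPS: clock x shader units x ops per cycle."""
    return clock_mhz * 1e6 * shader_units * ops_per_unit_per_cycle / 1e9

# Card A: higher core clock, fewer shader units
card_a = theoretical_gflops(650, 16, 2)   # 20.8 GFLOPS
# Card B: lower core clock, but three times the shader units
card_b = theoretical_gflops(500, 48, 2)   # 48.0 GFLOPS

print(card_a, card_b)  # Card B wins on paper despite the lower clock
```

Even this toy model shows a lower-clocked card coming out well ahead on paper, and real cards add memory bandwidth, drivers, and architecture differences on top of that.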
Currently they both beat each other at certain games, but overall a high-end ATI card will still run games that favour nVidia at max settings, and vice versa. If you buy either brand's top-end card you will be satisfied; at the moment ATI seems a bit cheaper, though.
Under the patronage of Rhah and brother of eventhorizen.
Ah, the old clock speed myth. Core clock frequency does not necessarily translate into higher performance; as mentioned above, it is only one factor. The best example of this is the Pentium 4 CPU. An Athlon 64 or Core 2 chip at a much lower clock speed will easily wipe the floor with a 3+ GHz P4, because the P4 was not designed with IPC (instructions per clock) efficiency in mind. The whole myth that a higher clock speed equals higher performance is born from the days when there was little difference between architectures, and from letting the marketing department control what the technical department churns out in terms of products.
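The P4 comparison boils down to performance being roughly IPC times clock. A quick sketch with illustrative IPC figures (the numbers are guesses for the sake of the example, not measured values):

```python
# Performance ~ IPC x clock: why a lower-clocked chip can win.
# The IPC figures here are illustrative guesses, not measured values.

def relative_perf(ipc, clock_ghz):
    """Crude relative performance estimate: instructions/clock x clock rate."""
    return ipc * clock_ghz

p4     = relative_perf(1.0, 3.4)   # long pipeline, low IPC, high clock
athlon = relative_perf(1.8, 2.2)   # shorter pipeline, higher IPC, lower clock

print(p4, athlon)  # 3.4 vs ~3.96: the 2.2 GHz chip comes out ahead
```

With those assumed figures the 2.2 GHz chip edges out the 3.4 GHz one, which matches what the benchmarks of the day showed.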
-There is no Overkill
-Apathy causes the least amount of conflict with stupid people
Check out my Guide to creating webpages
Under the patronage of Incinerate_IV