With the news that G80 will likely ship in November, with R600 probably not too far behind, who here is thinking about buying one? My comp is positively ancient, so I'll get one if I can afford it.
Yes, bring it on.
No, not worth it now.
Not for me, but that's only because I just ordered a 7900GTO.
-There is no Overkill
-Apathy causes the least amount of conflict with stupid people
Check out my Guide to creating webpages
Under the patronage of Incinerate_IV
No I won't be, I've just bought an X1900XT and it runs all games fast, and unlike the people who chose Nvidia I can run HDR and AA at the same time.
I might look at upgrading to the 2nd gen DX10 cards. The first run of DX10 cards is going to be buggy and power hungry, so hopefully new technology will have come out by then to reduce the power requirements (the GeForce 8800 needs 2x PCI-Ex connectors!).
Originally Posted by Freddie
I might look at upgrading to the 2nd gen DX10 cards.

Same with me.
Plus I heard that DX10 isn't backwards compatible with DX9, and I ain't shelling out £180 for Vista...
Under the patronage of Rhah and brother of eventhorizen.
Originally Posted by Freddie
(the GeForce 8800 needs 2x PCI-Ex connectors!)

What does this mean?
Originally Posted by shenmueguru
What does this mean?
Most new cards need an extra auxiliary power connector hooked up to them to supply power. This used to be supplied via the 4-pin white Molex connector, but when PCI Express was introduced a new power connector (one that could supply even more power) was invented for the job.
G80 will be a DirectX 9 beast with 128 "pipelines" which applications can divide into the optimal pixel shader / vertex shader / geometry shader ratio, with a likely default of 96 / 32 / 0 or 112 / 16 / 0, as only DX10 games will use the new geometry shader in Shader Model 4.0.
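The unified-pool arithmetic above can be sketched in a few lines of Python. This is purely illustrative, not any real driver API: the function name is made up, the 128-unit total and 96/32/0 split come from the post, and the second split is written as 112/16/0 so it actually sums to 128.

```python
# Hypothetical sketch of splitting a unified shader pool into
# pixel / vertex / geometry units. Not a real driver interface.

def split_pipelines(total, pixel, vertex, geometry=0):
    """Return a shader split as a dict, checking it uses the whole pool."""
    assert pixel + vertex + geometry == total, "split must sum to the pool size"
    return {"pixel": pixel, "vertex": vertex, "geometry": geometry}

# Example splits for a 128-unit G80-style pool; under DX9 no units
# go to the geometry shader, since only DX10 games can use it.
dx9_default = split_pipelines(128, 96, 32)
dx9_alt = split_pipelines(128, 112, 16)
```

The point of the check is that the three shader types share one pool of identical units, so any split is valid as long as it adds up to the total.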
I'm leapfrogging Shader Model 3.0, as I still own a Radeon 9700 Pro (getting old; Shader Model 2.0a), and I'll probably get a good mainstream model this time (powerful enough).
R600 will not require you to set this ratio for each application separately (a typical nVidia solution; look at SLI, for example), and will probably provide a better implementation of the geometry shader, plus better filtering and anti-aliasing with HDR, which is a required feature for ANY Shader Model 4.0 / DX10 graphics card. Probably better dynamic branching within the pixel / vertex / geometry shaders as well. Remember, though, that DX10 graphics cards emulate DirectX 9 at a lower granularity: they approximate DX9 functions with simpler functions, but those simpler functions are more precise (thanks to integer values and FP32 all the way through!).
In other words, your DX9 games will fly by like Quake 3 does on today's systems: massive FPS, but some detail will be lost. The DX10 team have said they couldn't see the difference; nitpickers obviously will. Dynamic branching will still be useful if developers abuse the geometry shader to create massively beautiful geometric effects.
Required features (Shader Model 4.0):
- FP32 all the way, alpha & color blending included
- 16/32-bit integer values possible within certain parts of the pipeline
- HDR format support
- raster operation support (early Z culling, perspective divide, etc.)
- integer operations
- fully dynamic shader management (nVidia is partly ignoring this and Microsoft has approved that; it consists of two parts: within a shader, and across the shaders as a whole)

Not required:
- anti-aliasing
- filtering
Furthermore, DX10 cards are HDMI-ready, useful for future content. You can still use them with Windows XP while keeping the option to upgrade whenever you want; you just cannot play (not-yet-existent) DX10 games until you do.
"in montem soli non loquitur" basically means that you should not argue against what is obvious.
(> <) (\_/) Haha, die little bunny, die!
(_)(_)(x.X) No soup for you!
becoming is for people who do not will to be
If you look at the pictures, it seems as if the 8800 has two connector cables to the motherboard (or was it to the power supply?). Whatever the case, 2 connectors means it's extremely power hungry.
Clients: Caius Britannicus, Waitcu, Spurius, BrandonM, and Tsar Stephan.
http://www.totalwardai.com
I'll wait until DX10 actually comes out; I'd have to upgrade to Vista first anyway.
THE PC Hardware Buyers Guide
Desktop PC: Core 2 Duo E6600 @ 2.8 Ghz | Swiftech Apogee GT waterblock + MCP655 + 2 x 120mm rad | Biostar Tforce 965PT | G.Skill 4gb (2 x 2gb) DDR2-800 | Radeon HD 4870 512mb | 250GB + 160GB hard drive | Antec 900 | 22" Widescreen
I will probably get a 2nd generation DX10 card. How reliable do you think the first generation will be, anyway? What about driver issues? And performance issues? And the fact that Vista ain't even out yet. Plus it costs a damn load of money.
I agree, not worth it yet. I can only hope that my X850 XT gets me through another year or so, so that I can look at Vista and DX10 before going off to college.
Well, I would be shocked if DX10 games came out without some sort of support for DX9 cards in the next two years. Devs are not idiots. If no one buys their games because they don't have a DX10 card, they lose serious amounts of cash. Two years later, it is upgrading time, so no biggie.
As far as HDMI goes, its main benefit is HDCP, which is the spawn of the devil, so no harm there.
"No I won't be, I've just bought an X1900XT and it runs all games fast and unlike the people who chose Nvidia I can run HDR and AA at the same time."

And unlike the people who chose ATI, we can actually run Linux with decent drivers.