Graphics Cards Comparison


Gerard


It's about time there was a simple chart to see how today's graphics cards stack up against each other:

[Image: 5-08-08-graphics.jpg]

For idiots: higher is better, left is cheaper (lower is worse, right is more expensive)

Which explains why I can only get 30-40 fps when I play GTA:SA at 800x600 true colour, plus mods. Rawther decent, but it sucks when I try the HDR lighting mod: I only get a frame or two per second...


lol nVidia

That chart is misleading... what game is being benchmarked, and what are the rest of the specs (the CPU, the RAM, the hard drive, the driver versions, etc.), and under what settings?

For me, nVidia cards have been the epitome of failure and overheating; it's pig disgusting.

Anyway, the chart also gets the 9600 GT vs 3870 wrong: the 3870 is cheaper, but the chart shows the opposite. The cheaper 3870 also has faster memory (GDDR4), more stream processors, and a higher core clock.

Also, how in the dick is the 9600 GT that much higher on the chart than the 3870? In the benchmark comparison I linked, the competition is about even: the 3870 wins in some situations and the 9600 GT in others, but the final result shows the 3870 was slightly faster on average.
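If you want to see how "about even, but slightly faster on average" can fall out of mixed results, here's a quick back-of-the-envelope calc. The per-game fps numbers are invented for illustration, not taken from any review:

    # Hypothetical per-game average fps, purely for illustration.
    results = {
        "Game A": (52.0, 49.0),  # (HD 3870 fps, 9600 GT fps)
        "Game B": (61.0, 64.0),
        "Game C": (45.0, 41.0),
    }

    # Geometric mean of the fps ratios: above 1.0 means the 3870
    # ends up faster on average even though it loses some games.
    ratio = 1.0
    for fps_3870, fps_9600 in results.values():
        ratio *= fps_3870 / fps_9600
    ratio **= 1.0 / len(results)
    print(f"3870 vs 9600 GT, average ratio: {ratio:.3f}")  # ~1.035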

Research and sauce, children.


There is no way the 8800 GT's performance is that close to the 9600 GT's.

The 8800 GT is on average 15% faster than the 9600 GT, yet the difference on the chart is like 2 fps. In recent games like Crysis and CoD4, there's a 10+ fps difference.
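Quick sanity check on that; the 60 fps baseline is just an assumed round number:

    # If the 9600 GT manages ~60 fps in a typical game (assumed),
    # a 15% faster 8800 GT should land around 69 fps: a ~9 fps gap,
    # not the ~2 fps the chart suggests.
    fps_9600gt = 60.0
    fps_8800gt = fps_9600gt * 1.15
    print(f"expected gap: {fps_8800gt - fps_9600gt:.0f} fps")  # 9 fps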


Chart sucks.

Tri-SLI makes games run slower, not faster. The 8800 Ultra is a bit slower than shown. Other than that, the prices of the products are way off: cards to the left are more expensive on average, cards to the right much less (and then I'm talking about 900 instead of 1500).

The cheaper 3870 also has faster memory (GDDR4), more stream processors, and a higher core clock.

Memory: correct.

Stream processors: not really. ATi's stream processors are weaker: only one in every five handles the full instruction set. On top of that, they're clocked much lower.

Higher core clock: you can't compare clock speeds across different chips. There's no way a Pentium D at 2.8 GHz beats a Core 2 Duo at 1.8 GHz with one core disabled. Same here.
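To put toy numbers on that (the per-clock figures below are invented, not measured IPC values):

    # Rough effective speed = work per clock x clock (GHz).
    # Per-clock numbers are made up for illustration only.
    pentium_d_perclock, pentium_d_ghz = 1.0, 2.8
    core2_perclock, core2_ghz = 1.8, 1.8  # one core, but higher IPC

    print("Pentium D 2.8 GHz:", pentium_d_perclock * pentium_d_ghz)  # 2.8
    print("Core 2 at 1.8 GHz:", core2_perclock * core2_ghz)          # 3.24

The lower-clocked chip wins as soon as it does enough more work per clock, which is the whole point.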

ATi cards just suck in some games, Lost Planet being one of them. There a 9600 GT beats 3870 X2s in CrossFire at higher resolutions, so that's like 1 GPU vs 4 GPUs.

I find the 3870 X2 a rather 'sad' card. ATi had to make a dual-GPU card just to rival the 8800 Ultra.

Looking forward to R700 vs GT200. I'm quite sure nVidia will prevail this round, again.

I do hate their flagship prices, though. You won't easily get one in Europe for under 750 USD. But if the GT200 meets my expectations, I'm sure to get one.

@Chris82: First benchmarks I've seen with the 3870 being faster.


ATi's stream processors are weaker: only one in every five handles the full instruction set. On top of that, they're clocked much lower.

Sauce on this?

Higher core clock: you can't compare clock speeds across different chips. There's no way a Pentium D at 2.8 GHz beats a Core 2 Duo at 1.8 GHz with one core disabled. Same here.

True, I was just pointing it out.

ATi cards just suck in some games, Lost Planet being one of them. There a 9600 GT beats 3870 X2s in CrossFire at higher resolutions, so that's like 1 GPU vs 4 GPUs.

Link to benchmarks for this statement? You can't say "ATi cards just suck in some games" without backing it up.

@Chris82: First benchmarks I've seen with the 3870 being faster.

Really? Well, here's another one from a better-known source. The cards are very close, but the 3870 edges out the 9600 ever so slightly on some occasions, and vice versa in certain situations, notably with anti-aliasing turned on. That one is AMD's fault, whenever they feel like fixing it...


Forgot to remove the "@Chris82: First benchmarks I've seen with the 3870 being faster." bit, since I've actually come across other benchmarks showing the 3870 is usually faster (by a small margin). Somehow I had the idea it was slower.

ATI's R600 features 64 Shader 4-way SIMD units. This is a very different and complex approach compared to Nvidia's relatively simple scalar Shader units.

Since an R600 SIMD Shader can calculate the result of four scalar units, it yields the scalar performance of 256 units, while Nvidia comes with 128 "real" scalar units. We are heading for very interesting results in DX10 performance, since game developers expect that NV stuff will be faster in simple instructions and R600 will excel in the complex shader arena. In a way, you could compare R600 and G80 as Athlon XP versus Pentium 4: one was doing more work in a single clock, while the other was using higher clock speed to achieve equal performance.

Well, I guess this plan hasn't turned out the way it should have, since it's pretty obvious that nVidia's cards perform better. But anyway, 64 seems to be the real number; times 5, that gives the 320 the R600 claimed to have. That's also why ATi calls them units.

[Image: r600architecture550of8.png (R600 shader architecture diagram)]

Just count: 8 x 8 units, each with 4 "side shaders" (I've got no idea what the official name is).

The quote is from the Inquirer, and I know they aren't very trustworthy, but I read this somewhere else too; it's just the best article I could google up.

RV670 works about the same, no major difference.
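To put rough numbers on the VLIW-vs-scalar thing, here's a peak-FLOPS back-of-the-envelope using the stock HD 2900 XT core clock (742 MHz) and the 8800 GTX shader clock (1350 MHz), counting a MADD as 2 FLOPs per ALU per clock and ignoring G80's extra MUL:

    # Peak theoretical shader throughput, MADD = 2 FLOPs per ALU.
    r600_gflops = 64 * 5 * 2 * 0.742  # 64 units x 5 ALUs, 742 MHz
    g80_gflops = 128 * 2 * 1.35       # 128 scalar units, 1350 MHz

    print(f"R600 (HD 2900 XT): {r600_gflops:.0f} GFLOPS peak")  # ~475
    print(f"G80 (8800 GTX): {g80_gflops:.0f} GFLOPS peak")      # ~346

On paper R600 wins, which is exactly why paper specs mislead: real shader code rarely keeps all five slots of a VLIW unit busy, while every one of G80's scalar units stays useful.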

About this:

Link to benchmarks for this statement? You can't say "ATi cards just suck in some games" without backing it up.

That isn't actually ATi's fault; scaling for the X2 is just pretty bad in some games. nVidia cards usually do better here because games are better optimized for them (TWIMTBP).

CoD4 scales very, very well: http://www.anandtech.com/video/showdoc.aspx?i=3209&p=4

But in other games, like Crysis, it gets demolished by a card like the 8800 GT: http://www.techpowerup.com/reviews/HIS/HD_3870_X2/7.html

Fanboyism got to me though; I should have said that in some games ATi cards don't work all that great. On average SLI scales better than CrossFire; that's just how it is and you can't deny it (Tri-SLI is made of fail though, no idea why nVidia ever came up with it).
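For reference, scaling is usually reckoned as multi-GPU fps over n times single-GPU fps. The fps values here are made up just to show a game that scales next to one that doesn't:

    # Scaling efficiency = multi-GPU fps / (n_gpus x single-GPU fps).
    # fps values are invented to illustrate good vs bad scaling.
    def scaling(single_fps, multi_fps, n_gpus):
        return multi_fps / (n_gpus * single_fps)

    print(f"{scaling(40.0, 72.0, 2):.0%}")  # 90%, scales well (CoD4-like)
    print(f"{scaling(25.0, 29.0, 2):.0%}")  # 58%, scales badly (Crysis-like)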

