What graphics card is the Xbox360 equivalent to?


cpburns

The Xbox 360's GPU is very close to the Radeon X1800XT; that's sort of close to a 7800GT, I think. Also, in terms of what could replicate an Xbox 360 game? I'm going to say something like an 8-core CPU and a future tri-SLI GeForce 12 series might have a good shot. Why so extreme? Because look at how long it took to get a reasonably fast PS2 emulator. Yes, you have to emulate that code, because it does not run natively on x86; you'd have a better shot at it on an old Power Mac.
 

3Ball



No... it doesn't, actually, which is what we have been explaining. Unless you consider a computer with an X1900XT in it, a limited amount of memory, and a CPU that has made a good leap forward in multiple cores and threads but just as big a leap backwards in single-thread performance a supercomputer, then I guess it is.



I am, and I can guarantee that a 7600GT and/or X1600 cannot produce the picture quality and performance that the 360 can in games that are on both platforms, such as Gears of War, Call of Duty 4, MoH Airborne, and Rainbow Six Vegas. I can run all of those games on my X1600, but only at low resolution and detail, at mediocre framerates.

I am not trying to dispute you or make you angry. I just don't want anyone who (for whatever reason) looks at this thread for information to be misguided. I am not saying that I am 100% correct on this, because it is very hard to compare the two platforms. Alas, I believe very strongly that the X1900 series is a much better competitor to the GPU in the 360.

If a game is ported well, as in the case of MoH Airborne (maybe not online play, but as far as graphical performance goes), my system in the sig with a change of video card (i.e. my old X1900XTX) produced the same graphical quality and very similar frames. So from personal experience I can tell you that this is the most comparable video card series to the one in the 360, as many others have also agreed and backed up with technical understanding over the years that this question has been asked.

Best,

3Ball
 

nottheking

If I were to take a stab, I'd say that the Xbox 360 has a GPU equivalent to the Radeon X1650XT, though with twice as many texture units (16 instead of 8). Of course, that's an ATi card... but it still matches closely spec-wise (128-bit memory interface, and pretty close numbers for shader throughput). And likewise, the results I've seen in games on the PC set to match the Xbox 360's settings do seem to reflect this. So on the GeForce side, I've placed the Xbox 360, along with the PlayStation 3, between the GeForce 7600GT and GeForce 7800GS.

I think something CRITICALLY important to note here is that the "shader units" in the Xbox 360 are *NOT* the same as the pixel shader units found in a Radeon graphics card. They are MUCH more like the "stream processors" found in a Radeon HD 2000-series or GeForce 8 card; like those, they are mostly just bare logic ALUs.

And once you compare them in that light, it's pretty clear that the Xbox 360 is a far cry, shader-wise, from those: it has 48 SPs @ 500MHz (24 G-ops/s), compared to, say, 128 @ 1500MHz in the GeForce 8800GT (192 G-ops/s), or 320 @ 775MHz in the Radeon HD 3870 (248 G-ops/s).
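If you want to sanity-check that arithmetic, here's a quick Python sketch (purely illustrative; the per-card figures are just the ones quoted above, and "ops" counting varies by architecture, so treat the outputs as ballpark numbers):

```python
# Back-of-the-envelope shader throughput: units x clock.
# Figures are the ones quoted in the post above; "ops" counting
# varies by architecture (MADD dual-issue, vector ALUs, etc.).
def gops(units, clock_mhz):
    return units * clock_mhz / 1000.0  # billions of ops per second

cards = [
    ("Xbox 360 (Xenos)", 48, 500),
    ("GeForce 8800GT", 128, 1500),
    ("Radeon HD 3870", 320, 775),
]
for name, units, mhz in cards:
    print(f"{name}: {gops(units, mhz):.0f} G-ops/s")  # 24, 192, 248
```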

Really, it seems ludicrous to see people keep claiming that the current console will be as powerful as (insert current best PC hardware here). I saw this with the original Xbox, when people claimed that it was more powerful than a 2.8GHz P4 with a GeForce 5 (in spite, of course, of it CLEARLY having a 733MHz Celeron and a modified GeForce 3).

I'll bet that when we've got a whole new generation of graphics hardware, we'll be seeing the console pushers (primarily Xbox/Xbox 360 fans, I've noticed) claiming that it equals/bests the power of a Radeon HD 4900 or GeForce 9800GTX. :lol:

And let's not even THINK of getting into the CPU. John Carmack, I noted, had quite a bit to say about the Xbox 360's CPU: namely, that it was about 50% as powerful as a "modern CPU" of that time... (link) That *could've* referred to the newly-released dual-core CPUs, but likely referred to the single-cores, as I'd have my doubts that even he would've worked extensively with a Manchester/Toledo/Smithfield core yet.

Your friend has eaten loads of BS perpetuated by the hype. It's not even close: a GeForce 8800 series would be more akin to what the NEXT Xbox will have.


Not so at all; in the case of the PlayStation, it uses a hardly-modified PC GPU. And on the Xbox 360, DirectX means that PC GPUs are "optimized" by default, drivers willing.


I'd actually call it a 7800GS, to be honest; both have the ROPs cut down to 8. Though it also has the memory interface cut down to 128-bit...

To be honest, performance-wise, the PS3 almost seems closer to the 7600GT, though it's definitely above it.


I'd say they're pretty darn close, but for some reason the Xbox 360 does seem to have a SLIGHT edge. Though I'm going to hazard a guess that it's NOT so much because of the CPU or GPU at all, but rather because the Xbox 360 has up to 512MB of memory directly available to the GPU. Hence if you have a game that needs, say, 300MB of video RAM, it can all fit in there, leaving the remaining 212MB for the CPU. Whereas on the PS3, you either have to cut that down to 256MB, or you're going to have 44MB hanging off of the MUCH higher-latency, lower-bandwidth main XDR RAM attached to the CPU.
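A minimal sketch of that allocation argument, for anyone who wants to play with the numbers (purely illustrative; the 300MB workload is just the hypothetical above, and OS memory reservations are ignored):

```python
# Unified vs. split memory pools, using the post's hypothetical
# 300MB video-RAM workload (OS reservations ignored for simplicity).
VRAM_NEEDED = 300  # MB, hypothetical game requirement

# Xbox 360: one 512MB pool shared between GPU and CPU.
unified_total = 512
print(f"Xbox 360: {unified_total - VRAM_NEEDED}MB left for the CPU")  # 212MB

# PS3: 256MB GDDR3 for the GPU plus 256MB XDR for the CPU; video
# data beyond the GDDR3 pool spills into the slower XDR side.
gddr_pool = 256
spill = max(0, VRAM_NEEDED - gddr_pool)
print(f"PS3: {spill}MB spilling over to higher-latency XDR")  # 44MB
```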

 

pauldh

I've avoided this topic, as it's been overdiscussed and there is no equivalent. But if I had to pick "which GeForce" (as was asked in the OP) is the closest, I would say the 8600GT/GTS.

And BTW, the X1800XT is a lot better than the 7800GT (someone thought they were equal). I owned both and they are not even close. It takes a 7900GT to challenge the X1800XT, and even then, IMO, the X1800XT was a bit better.
 

tipoo




Here's a Cell vs. a 1.6GHz G5... keep in mind the SPUs aren't being used, though.


http://www.geekpatrol.ca/2006/11/playstation-3-performance/




lol, i pissed myself.
 

spoonboy

I see what you are all saying, and it's all good stuff. But I think you're all forgetting that the Xbox hasn't got the millstone of Windows hanging around its neck. And I'm sure its equivalent of DX9 or DX10 or whatever is more efficient than what the PC has. That equals a lot more visual bang for less hardware buck.

Whoever said the Xbox is equal to a 7600GT or 7800GT, you're having an absolute laugh, mate. I had a 7600GT and Oblivion p#ssed all over it. Upgrading to a 7900GT still didn't do much for my minimum and average frames outdoors at the same settings. I'm pretty sure the Xbox port, although I guess somebody will jump up and say it's got less detail, looks a lot better than what the 7600GT could ever possibly manage. And on the system RAM point, you're forgetting consoles need a lot less RAM to get by than PCs, seeing as they're not tied to XP or Vista and the 20+ background processes that entails.

Maybe think about what cards are needed to run Xbox ports well (very roughly the same res/visuals and frames), or, maybe more tellingly, what cards can give an equal experience in PC games that were ported to the Xbox. Name some PC-to-Xbox ports and what cards give the same experience as on the Xbox. Then you'll be getting somewhere.

The 7600GT is a lot better than the X1600. End of.
 

3Ball



I did!

Best,

3Ball
 

nottheking


How did I miss this? It's brilliant. Best post of the thread.


No, it doesn't have anything resembling DX10 support at all... And technically doesn't even meet the STANDARDS of DirectX 9.0c fully.

As for performance, I'd like to point out that Windows doesn't bog down a graphics card AT ALL. No matter which way you try to play it, there's no way that happens.


As someone who's actually extensively tested Oblivion, both on the PC and Xbox 360, I'll stand by my claim. (I'll admit to not having given it a spin on the PS3, though I'd be confident in saying that I know more about its performance than any other person around)

I'd also note that minimum framerates in Oblivion almost NEVER have anything to do with the graphics card: hiccups occur solely because of CPU-related issues. As for appearance, to do a fair comparison, you should either plug your Xbox 360 into a VGA monitor or output your PC to an HDTV using component cables.

As for RAM, I'd note that I actually run the game in Windows ME... or what was once Windows ME. I've cut off so many modules that it actually uses less RAM than the Xbox 360 does just for the dashboard, which is *always* running (i.e., about 24-30MB of RAM). And if you bothered to go talk to the Oblivion enthusiast community, you'd find that even though most use XP, they do *not* run those background processes while playing the game (many use the task manager to kill every process including explorer.exe, then run the game from there); it's entirely luddites that do so.

So that gives you a fair comparison of the hardware, and then the results become more readily clear.


I get notably higher settings on my PC in Oblivion using a Radeon X800XT than I do on the Xbox 360.

A break-down of the settings I used for comparison; both got the same framerate of around 30 (most X360 games run at 30fps; Halo 3 and most sports titles are the exceptions):
■Resolution of 1280x720 (on the Xbox 360, 1920x1080 is available, but the game only renders at 720p and then stretches to fit).
■4x multisample AA.
■4x AF on the PC, no AF on the Xbox 360 (the Xbox 360 lacks the ability to use AF).
■"Large" textures.
■All sliders set to 100% on the PC version. On the Xbox 360, they vary between 50-90% from what I've seen (save for view distance, which is always maxed).
■Water: default high detail with basic reflections on the Xbox 360; on the PC, .ini edited to reflect trees, rocks, architecture, NPCs, creatures, arrows, spells, etc. (by default, only the ground, city walls, the sky, and the Imperial Palace reflect).
■HDR enabled on both versions (using the add-on SM 2.0 version on the PC, which runs slower than the base SM 3.0 version because it's not integrated right into the game's engine).

Net result: the game runs at a comparable framerate on both versions, though I get some noticeable quality improvements on the PC. The AF makes a rather blatant difference, as one might imagine. Similarly, the water looks much, much more realistic when it reflects literally *everything*.

There were a few other tweaks made POSSIBLE by the PC (such as setting the blood timer to fade out after a half-hour rather than 10 seconds), and while those were used during the comparison, I do not consider them performance-related.


I never denied that; the thing's maybe twice as potent. However, the card I mentioned was the X1650XT, which is comparable to the 7600GT, not the plain X1600. And actually, as I noted, I did say it was *more* potent than the 7600GT. ;)

As for the X1600... again, the Xbox 360 is clearly worlds above it. The X1600 would actually be my estimate for the equivalent card to match the Wii, albeit cut to a 64-bit memory interface and possibly clocked down further. It seems in line with producing the sort of visuals I see in, say, Need for Speed: ProStreet, as well as the fact that the RV530 die and the Wii's "Vegas" GPU die are both 90nm SOI ATi parts with very close to the same silicon size.
 

spoonboy

Mr. Blonde's Revenge, lol... well, nottheking's, I mean. :) On a side note, I always chuckled that the enemies didn't react any differently to you as Mr. Blonde than as Joanna Dark. They still said things like "get the b#tch!" I thought they were talking about someone else, lol.

Well done, that was probably the clearest and most thorough post so far. #Doffs hat, bows slightly#

Sorry about the 7600GT + X1600 point; I missed that you were referring to the newer X1650. The original X1600 was total arse though, lol.

So your rough equivalent to the Xbox is the X800XT? Would you say that for any other PC-to-Xbox ports?

BTW: "As for performance, I'd like to point out that Windows doesn't bog down a graphics card AT ALL. No matter which way you try to play it, there's no way that happens." DX9 is slower on Vista than XP, is it not?
 

nottheking


Why thank you. :)

Though the avy is actually from the Crash Site level, not Mr. Blonde's Revenge... A bit hard to tell, though, because of the contrast in the background. (the forums don't have HDR capabilities, sadly)


Well, it wasn't good at its original price point. Once it dropped to being a mid-low level card, it was worth getting, as a card with an edge over the vanishing 6600GT: AA+HDR support, as well as typically packing 256MB of RAM instead of 128MB.


I'd say that it'd be fairly close. Of course, I'd note that the X800XT has very, very solid DirectX 9.0b support, while the Xbox 360's GPU is technically DirectX 9.0c, though its version of SM 3.0 isn't "complete" (i.e., it has fewer registers, etc.). So you wouldn't get full equivalence in features there, just primarily performance.


I'd personally blame that on the drivers. :p
 

dos1986

lol.....

The Xbox 360/PS3 are a good bit more powerful than an X800XT/standard Pentium D chip...

Forget about the numbers and play the games....

Put Tomb Raider: Legend with next-gen effects on in your PC and you will not get past 10fps on an X800XT... while the Xbox 360 can play it at 720p with all the next-gen effects on, smoothly...

The same goes for other games like Call of Duty 4, BioShock, The Orange Box, DiRT, Medal of Honor: Airborne, Unreal 3, etc.

A PS3/Xbox 360 is a highly optimized gaming system, designed to do one thing: play games...

They are both similar in gameplay/settings performance to a PC with...

AMD X2 4600
Radeon X1950XT

Put Unreal 3 on the above PC and it will at best be able to look and play at the same speed as the PS3/Xbox 360 version...

That said, it would probably play smoother on the PS3/X360...

I have connected my PS3, playing DiRT, to my Samsung 226BW monitor, and it looked and played better than on my 2.33GHz, X1950 Pro, 2GB RAM, XP Pro system... make of that what you will...
 

spoonboy

Yeah, you've got a point on DiRT. I had that for a while on my PC; my old 7900GT clocked to GTX speeds was just about 30fps average with everything on high, some on ultra (but not all of the ones that could be on ultra actually on ultra, if you see what I mean), at 2xAA @ 1152x864. It might have got the better of the X800 on that one.
 

nottheking


Read up. I *did*. And that comment on the CPU I'd take at face value; it's not from a gamer, but from John Carmack, possibly gaming's best-known programmer. He's the one responsible for the programming seen in Xbox 360 games like Quake 4 and Prey. So I'd hazard that he knows what he's talking about; and he stated that the Xbox 360's CPU was half as powerful as a modern PC CPU of the time.


That is an incorrect comparison, since if you enable "next gen effects" on the PC version, it doesn't match what the Xbox 360 version is doing.


Will some of you STOP using this line?! It's complete garbage, and irrelevant.

My own PC is a "highly optimized gaming system." A pre-packaged, crappy OEM box that a luddite picks out of Best Buy, Wal-Mart, or some other retailer should not be the basis of comparison here.


Hardly; with that hardware, you could get it to look BETTER. Here's a little hint: the console versions do NOT always equal the PC version at 100% settings; in fact, they rarely do, unless the game was ported to the PC AFTER the console, and even then it's not always the case. Typically, they're only on medium to high, and NEVER "ultra."


Logically, if you take fairly more powerful PC hardware but then crank it to settings vastly above what the console uses, yes, it stands to reason that it would run slower than the console using MUCH lower settings.


Likely an improper selection of settings; one advantage consoles have is that the developers lock in one set of graphics settings, which they can tweak until they find the one they feel provides the best visuals for the framerate. Finding such a setting on the PC can be hard.
 

pauldh

I remember Anand's pulled article, which stated that game developers they talked to said MS and Sony would have had significantly more processing power had they gone with a single-core A64. It went on to praise Xenos, though. (staying on topic ;))

Even though the article was taken down, its text still circulated for a long time and has been posted on these forums many times.




 

dos1986



lol

Do you own a PS3/X360?

I do, and the fact is I cannot play games as well on my 2.33GHz Core 2 Duo, X1950 Pro, 2GB RAM, XP Pro system as I can on the PS3...

Oh yeah, Tomb Raider: Legend includes next-gen effects on the Xbox 360 version. At the time, barely any card could play it smoothly @720p, yet the Xbox could... That game crippled my system, as does DiRT, with equal visual quality to the PS3...

http://www.bit-tech.net/gaming/2006/04/11/tomb_raider_legend_review/3

The 7800GT at the time couldn't even play the game with next-gen on...

Interestingly, when testing the generally faster Radeon X850XT, the next-gen content was completely disabled.



 

nottheking


It does have that: coupled with the XNA development stuff, the Xbox 360 is an amazingly easy system to program for. Which, in the end, would matter more than power, I'd think. And it shows: the Xbox 360 appears to have been getting FAR more development resources poured into it than any other platform, including the PC.


Which makes me question your work in attempting to match things up evenly between them: most people actually over-shoot the console version's settings by a large bit, vastly over-estimating them. This leads them to conclude that the console is somehow more powerful than a PC that's newer than it, and it then becomes a self-repeating cycle. You've failed to provide anything beyond a completely anecdotal claim here.


Keep in mind that "720p" refers to a TV-only resolution.

That, and the Xbox 360 does not use the entire "Next Generation Content" package: it uses some of it. It's worth noting that on the PC, unfortunately, it is a VERY sweeping set of settings changes. So it's a very poor metric, because it presents the illusion of being one feature, when in fact it's a package, one which is not mirrored on the Xbox 360. One part, when I looked at it, was that on the PC, "next gen content" actually included higher-resolution textures that the Xbox 360 likely wouldn't have had the RAM to even FIT; depending upon what the game is using for its non-graphics parts, the Xbox 360 usually has the equivalent of around what a 256-320MB graphics card would.

Likewise, the Shader Model 3.0 shaders are written differently, given that the Xbox 360 doesn't have full SM 3.0 support. So technically, it's IMPOSSIBLE to get a straightforward comparison in this game, because the PC version lacks the option to match the Xbox 360's settings. In short, you get "well below" and "higher." And likewise, I'd point out that I've yet to see an Xbox 360 game that didn't have even occasional spots where it "chugged" in framerate for a moment, just like on the PC; games on consoles don't automatically run at a constant full framerate.

This is entirely irrelevant: the NGC package includes the use of SM 3.0 HDR, which, pretty obviously, a Radeon X850XT, being an SM 2.0b device, couldn't use.

As for the 7800GT: if you're familiar with Bit-Tech's benchmark reviews like I am, you'll know that what they list are the settings they RECOMMEND: they shoot for what you can get while maintaining a framerate that is good, solid, and playable. The card could use those effects, though they apparently decided to leave them off; as a result, it got vastly higher framerates than the other cards that were using them.

Also, I'd point out that the resolutions used were 1600x1200 and 1920x1200; those are, respectively, just barely below 1080p and a notch above it. They're both OVER DOUBLE the 720p resolution that the Xbox 360 runs its games at... and runs them at a maximum of 30fps. So by that comparison, one could estimate that the Radeon X1900XTX presented there would be about double the power and performance of the Xbox 360 in that game, because the results shown were for a resolution two and a half times as large. Likewise, I'd also note from the screenshots that it was using some degree of anisotropic filtering, which is a feature the Xbox 360 does not support. (This is mainly because RAM on the X360 is at a premium, I believe, and AF rapidly eats up video RAM.)
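The pixel math there is easy to verify; here's a quick sketch (purely illustrative, just dividing pixel counts; real performance rarely scales perfectly linearly with resolution):

```python
# Pixel counts relative to 720p. Performance doesn't scale perfectly
# linearly with resolution, so treat this as a rough scaling argument.
base = 1280 * 720  # 921,600 pixels (720p)

for w, h in [(1600, 1200), (1920, 1200), (1920, 1080)]:
    print(f"{w}x{h}: {(w * h) / base:.2f}x the pixels of 720p")
# 1600x1200 -> 2.08x, 1920x1200 -> 2.50x, 1920x1080 (1080p) -> 2.25x
```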

I think one problem on the PC is that it always OFFERS pretty much arbitrary settings: thanks to things like resolution, ever-increasing AA, anisotropic filtering, etc., people are PUSHED to try to get the best and shiniest-looking screens... but at a cost to their framerate, which plummets. On a console, you can't change them, and they're locked well below the top of what's available, at "reasonable" settings, so that you get decent performance. I'd argue that framerates tend to appear higher on the console not because of the hardware power present (which is markedly weaker than on PCs of its day) but because the settings are picked by people who have a better idea of what to select than the player. :p
 

spoonboy

@nottheking

"Will some of you STOP using this line?! It's complete garbage, and irrelevant.

My own PC is a "highly optimized gaming system." A pre-packaged, crappy OEM box that a luddite picks out of Best Buy, Wal-Mart, or some other retailer should not be the basis of comparison here."

Although unoptimized luddites might buy the 360, that doesn't mean the 360 isn't highly optimized.

The 360 also has a tessellator. I don't know if it's widely used in games, but where it is used, it would really blow away an X800 and make the 360, in those cases at least, not comparable to any PC card (if only because we don't have any tessellator games on the PC yet).
 

free2game

Using Oblivion as an example, it puts out about the same as an X1800 GTO. It's really much weaker than that, but when developers work on a standard hardware set, they can properly optimize their code for it and it alone; that's why you generally see the roughly 2x performance difference versus a GPU of about the same raw power.
 

free2game

lol wut? For one thing, the 360's GPU is SM 3.0. Not to mention a card like that can easily run something like COD4 at 1680x1050 with 4xAA at framerates similar to what the 360 got running at 1024x600 with 2xAA and fewer graphical features.
 

free2game

Equal visual quality on DiRT? DiRT ran like **** on the 360 and on most PCs when it came out (I still can't max it out myself; I have to turn the lighting down to high), and DiRT on the PS3 looked even worse, mostly because of its very demanding but impressive lighting engine that made use of a lot of dynamic soft shadows. It's kind of similar to the lighting engine used in S.T.A.L.K.E.R., which was another very demanding game when rendered with fully dynamic lighting.
 

spoonboy

Having full grass density is the biggest killer in S.T.A.L.K.E.R., I would say. Reduce it by 50% and a lot of cards can suddenly play the game with full dynamic lighting. I think both S.T.A.L.K.E.R. and DiRT are quality games, BTW, but I got fed up with my profile getting corrupted in DiRT, so I stopped playing. (Yes, I was connected to the internet at the time, before anyone says anything about that.)
 

free2game

Not sure; the foliage system wasn't that advanced in it from what I remember. It was about the same level as the foliage I see in most UE3 games, which brings me to another point about how horribly overrated that engine is, but that's something else I won't get into. The biggest killer for Stalker's performance was definitely the lighting engine; it's probably the most advanced one out today. Crysis has some really nice dynamic soft lighting too, though I never understood why the flashlight didn't use object dynamic lighting while the rest of the light sources were fully dynamic.
 

nottheking


And an "unoptimized" luddite could readily use a PC, too. And if they get a TAD of help from someone who knows what they're doing, they'll have a highly optimized system as well.


Um, tessellation is mostly a buzzword; it's something that's been supported on ATi's video cards since the Radeon 9500/9700 series. And as I recall, very few titles actually use it, on either platform. And when it is used, its effectiveness is usually pretty underwhelming. (As examples, I don't believe that either Gears of War or BioShock makes use of it, as I don't believe it's used for anything in Unreal Engine 3.)
 