Previous Generation Radeon HD Powers the Wii U

IDK, what do you think? Comparing a $500 GTX 580 to a ~$170 Radeon 6870.


Of course you believe GPUs have progressively been getting hotter and more power hungry; you're looking at the absolute highest end of the spectrum.
 
[citation][nom]whysobluepandabear[/nom]At least you have DX11. You know you can use it for future-proofing and development. The fact remains that the Wii U is going to be extremely outdated by the time it hits the end of its 5-6 year life. At least MS & Sony are willing to provide you with hardware that they take an initial loss on. Nintendo wants to milk you for the console, games and everything else.[/citation]

Future proof? When it can't take advantage of everything DX11 has to offer? Even optimized, a lower-end 5xxx card (5770) can't handle DX11 well enough to justify using 11 over 10 or 9.

And it's not milking; Nintendo is only a game company. They don't have other divisions to keep them afloat like Microsoft and Sony do.
 
[citation][nom]kashifme21[/nom]There are quite a few reasons why console makers won't be going for the highest-end GPUs, even Sony and MS: 1. Higher-end GPUs today require a lot more power than what was needed a few years ago. 2. With power comes heat. Both of these are severe issues for console makers. A console is generally very restrictive when it comes to cooling and doesn't have much space for a proper PSU (a good PSU also costs a lot of money, and console makers generally want to fit the cheapest stuff). No console maker will want to deal with issues like RROD or YLOD. Not only will a high-end GPU cost a lot, it will also require a huge casing, a good PSU and high airflow.[/citation]


AMEN to this. I get sick of people running down console graphics around here without any forethought. Seriously, do you expect that in two years (about the time a new Xbox and PS4 will come out) they will somehow magically figure out how to reduce the size, heat and cost of, say, a GTX 580 or Radeon 6970, let alone be able to cram the current video chips we have now into that same package? I'll be super amazed if the next Xbox packs anything over a 5770 or a 6770 in two years, and Sony has already said that the PS4 itself will be an incremental step in power, nothing mind-blowing. In short, the next-gen consoles (excluding the Wii U) will be aiming for modest DX11 support (meaning only slight use of tessellation) with 1080p support. I'd love for them to aim higher, but the technology to fit a monster 12-inch video card that uses a 2-3 slot cooler into a console's profile JUST DOESN'T EXIST, and most likely won't exist in two years either.

People can cry that consoles are holding games back all they want when the time comes, but face facts: technological limits will be what is holding consoles back, and that in turn will hold graphics back. I suppose they could build a monster console that uses uber hardware and is the same size as a full-tower PC to accommodate the chip sizes and cooling, but right off the bat you can throw out two of the MAIN selling points for consoles vs. PCs: small form factor for living-room use, and cost. No one would buy the damn things, so going with older video chips only makes sense for this market. Technology has allowed them to shrink older chip dies, which improves cooling drastically and also reduces the cost of making the chips.
 
[citation][nom]cyb34[/nom]THANK GOD WE HAVE PCs[/citation]
+1000
Like flies arguing over who gets to eat the turd, console gamers dissing the opposition.
I think I'll keep playing on a PC, thanks. For less than the cost of any of these consoles I could do an incremental upgrade now that would keep me sweet against the Wii 3, PS4 and Xbox 720.

And the games are cheaper.
And I don't have to pay for online play.
And I can use it as a "computer" as well as a games machine.

Keep playing with your toys; real gamers use a PC.
 
I find all this hilarious. My 4850 1GB easily plays Bad Company 2 at 50-60 fps at 1920x1080 with most settings on max (just shadows on medium and no AA), while the 360 and PS3 render most games at a maximum of 1200x800 and then upscale, with a few exceptions when a game is less graphically demanding. Even then, a really demanding game by console standards usually tops out at 25 fps, and still suffers horrendous frame-rate drops.
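To put rough numbers on that comparison, here is a quick back-of-envelope pixel-throughput calculation using the figures from this post (the console numbers are the estimates above, not official specs for any particular game):

```python
# Rough pixel-throughput comparison using the numbers quoted above.
# The console figures (1200x800 at ~25 fps) are estimates, not official specs.

pc_pixels_per_sec = 1920 * 1080 * 55        # 4850 averaging ~55 fps at 1080p
console_pixels_per_sec = 1200 * 800 * 25    # console render before upscaling

print(f"PC:      {pc_pixels_per_sec / 1e6:.0f} Mpixels/s")       # ~114
print(f"Console: {console_pixels_per_sec / 1e6:.0f} Mpixels/s")  # ~24
print(f"Ratio:   {pc_pixels_per_sec / console_pixels_per_sec:.1f}x")  # ~4.8x
```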

Even my old X1800 XT easily beat the 360 and PS3 in graphics performance at the time.

I think there are too many butthurt console owners on here and not enough people who think.
 
[citation][nom]whysobluepandabear[/nom]You're an idiot. You seriously are. You're using the highest of high end GPUs. I wasn't even TALKING about $300+ GPU's, so WTF are you talking about? I've been preaching about the 6850/6870, or somewhere close around there, and you come here throwing me cards that whoop the shit out of the 6850/6870. Way to skew the debate.[/citation]

Dude, no point calling people names. Even a 6870 under load will easily break the 75C mark in a large PC case with proper cooling.

http://www.anandtech.com/show/4137/amds-gtx-560-ti-counteroffensive-radeon-hd-6950-1gb-xfxs-radeon-hd-6870-black-edition/6

Now imagine what sort of temps this chip would reach in a closed console environment. Like I mentioned at the start, Sony and MS will be using low-end to entry-level GPUs, which would mean something like a 5770 or a GTS 450. Sure, there will be die shrinks etc., but the problem with temps will still exist.
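As a back-of-envelope illustration of why the enclosure matters (all wattage and thermal-resistance numbers below are illustrative assumptions, not specs for any real console or cooler):

```python
# Crude steady-state model: junction temp = ambient temp + power * thermal
# resistance of the cooler. Every number here is an assumption for illustration.

def junction_temp(t_ambient_c: float, power_w: float, theta_c_per_w: float) -> float:
    return t_ambient_c + power_w * theta_c_per_w

# Same ~150 W chip on the same assumed 0.35 C/W cooler; the only change is
# the air temperature inside the enclosure.
print(junction_temp(30, 150, 0.35))  # roomy PC case interior (~30 C): 82.5 C
print(junction_temp(48, 150, 0.35))  # cramped console shell (~48 C): 100.5 C
```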
 
[citation][nom]alidan[/nom]I'm not spending $1000+ on a console, and Nintendo isn't eating a $500-600 loss per system to make it reasonably priced.[/citation]

Certainly not, but if you know anything about consoles, it's that their games are pricier due to licensing, and that the regular user usually ends up paying more for a system plus 10ish games than for a working PC gaming rig and the same number of games.

What's also sad is that when a new version of the console comes out, the "gamer" often can't even use the new console to run previously purchased games. With a PC the story is different: instead of purchasing a whole new system, the graphics card, for instance, can be replaced with ease, taking the system to a whole new level, AND the old purchased games will still work (with greater detail/resolution).

This is not a hard choice for me: get robbed for mediocre gaming with complete system AND game replacement cycles, or pay for good gaming with only a component swap each cycle!
 
To those dissing Nintendo's choice:
picture yourselves as Sony executives, then choose one of these:
a. With its high-tech GPU, console A will last for ten years, so each consumer only needs to buy one of these (assuming no breakage).
b. With a standard GPU, console B will last five years; by that time we will release a new console that those same consumers have to buy again. That would be twice the revenue 😀 (oh, and the profit margin on these consoles will be much higher).

Me? I'll stick with good ol' PC, thank you very much....
 
You guys are forgetting that it's not practical to put a $500 GPU in an entire computer (the console) meant to sell for under $500.

You DO realize that the more they spend on graphics, the more it drives the console price up, right?
 
Somebody needs to come up with a modular console that will replace the PC for gaming: a console where you can upgrade components, but can't have access to files. This would prevent pirating and hopefully let developers push graphics to the limits. Full keyboard and mouse support for gaming is a MUST!

We'd use the PC for work purposes like video/audio editing, internet, word processing, etc., where hardware makers would concentrate on the CPU and GPGPU.

ATI and Nvidia could sell graphics modules for the modular consoles...
 
Guys, consoles directly access hardware, and as such offer better performance because there are no high-level APIs slowing everything down. A 4xxx-series GPU in a console will be roughly equivalent in speed to whatever you can get on a PC today for that very reason.
 
[citation][nom]gamerk316[/nom]Guys, consoles directly access hardware, and as such offer better performance because there are no high-level APIs slowing everything down. A 4xxx-series GPU in a console will be roughly equivalent in speed to whatever you can get on a PC today for that very reason.[/citation]

Not necessarily. Current consoles like the PS3 and Xbox 360 always lose out to GPUs that launched around the same time. The 8800 GTX launched alongside the PS3 and plays most of its games in 1080p, while the PS3 only plays about 5% of its games in 1080p. So even though the PS3 has direct access to its hardware, it loses out to a GPU from its own launch window. Hence I would estimate a GPU like the 5870 should easily beat whatever the Wii U comes equipped with.
 
[citation][nom]dimar[/nom]Somebody needs to come up with a modular console that will replace the PC for gaming: a console where you can upgrade components, but can't have access to files. This would prevent pirating and hopefully let developers push graphics to the limits. Full keyboard and mouse support for gaming is a MUST! We'd use the PC for work purposes like video/audio editing, internet, word processing, etc., where hardware makers would concentrate on the CPU and GPGPU. ATI and Nvidia could sell graphics modules for the modular consoles...[/citation]

That would not be possible, because console makers generally like to use proprietary parts. Meaning if MS created RAM for its console, it wouldn't be the same RAM we have available in the PC market today.

Consider that the RAM we use in our gaming rigs is the same RAM available to the millions of PCs out there.

It would not be the same for consoles. Sony would make its own RAM design, MS would have its own design (and the same would go for GPUs, CPUs, etc.).

Upgrades are feasible on PC because those parts are made by the millions and can be used in any PC in the world, not just a gaming PC, which helps lower the cost of developing new parts (for example, when Intel releases a new CPU, they don't do it just for gaming PCs; they do it for the general progress of tech).

A console maker doing the same thing would be limited to its own specific market. Hence the cost would be outrageous, with no guarantees of how many people would really upgrade. Also, the parts themselves would be several times more expensive than the equivalent parts on PC (just like how MS charges several times more for the hard drives it releases for its console; now imagine them doing the same thing with RAM, CPUs, GPUs, etc.).

In short, it wouldn't be possible unless they made the console out of PC parts itself, which would defeat the idea of making a console.
 
@ whysobluepandabear,
Do you seriously think AMD or Nvidia would even consider for a second letting a console manufacturer use their latest tech? For all we know, Nintendo asked for a 6-series core and got slapped down.
If I ran AMD I certainly wouldn't want my newest and best, designed and aimed at the PC market, going into consoles; that would be a great way to kill off my main business line. Epic fail for even thinking it could happen.
 
It's weird how people aren't really able to distinguish between high-end and current-gen. There's a difference. There are more HD 6000-series cards than just the 69xx and 68xx; those are the crazy high-end parts, and importantly, the lower 6000-series cards still support the same features the high-end cards do, like DX11. Additionally, they use newer manufacturing processes to be more power-efficient (the 4000 series is almost all on 55 nm, the 6000 series is on 40 nm).

The latter will not be an issue, I'm sure, because console manufacturers get periodic die shrinks to decrease power use and allow for smaller, cooler, quieter, cheaper systems. If Nintendo does use an HD 4000-series chip, we can assume it will likewise get die shrinks and end up on par, power-wise, with what a 6000-series part would have offered. However, the former will still likely be an issue: no DX11.

Personally, I'll be very pissed if Nintendo's choice of hardware prevents widespread adoption of DX11 features like tessellation. I can only hope that either A) Nintendo goes with a 5000-series or newer part despite this announcement, B) they do tessellation in software somehow, C) the Wii U is a flop, or D) the combination of PCs, Xboxes and PlayStations provides such a large installed base with DX11 support that developers make games with tessellation in mind but with fallbacks for the Wii U and old computers.
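As a minimal sketch of what option D might look like in practice, assuming a hypothetical engine that checks the GPU's feature level at load time (GpuCaps, Mesh and the asset names are made up for illustration, not any real engine's API):

```python
# Hypothetical fallback logic: ship a coarse hull for DX11-class hardware
# that can tessellate it, and a pre-subdivided static mesh for everything else.
from dataclasses import dataclass

@dataclass
class GpuCaps:
    feature_level: float  # e.g. 10.1 for an HD 4000-class part, 11.0 for HD 5000+

@dataclass
class Mesh:
    name: str
    triangles: int

def pick_terrain_mesh(caps: GpuCaps) -> Mesh:
    if caps.feature_level >= 11.0:
        # DX11 path: coarse control mesh; hardware tessellation adds the detail.
        return Mesh("terrain_coarse_hull", triangles=20_000)
    # DX10/DX9 fallback: the detail is baked into a denser static mesh.
    return Mesh("terrain_pre_subdivided", triangles=150_000)

print(pick_terrain_mesh(GpuCaps(feature_level=10.1)))  # Wii U-class fallback
print(pick_terrain_mesh(GpuCaps(feature_level=11.0)))  # DX11 tessellation path
```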
 
[citation][nom]whysobluepandabear[/nom]Um, no and no. Power management and heat have actually gotten a bit better. You might really want to look into this. Assume they're using a 4850, and then really look over your statement and see if you want to run with that.[/citation]

whysobluepandabear,

If you are comparing an actual 4850 graphics card to today's tech, yes, you are right. But when you think about it from a console perspective, the 4xxx technology will end up drawing less power. Bear with me here:

Even though the older technology was more power hungry, it was also built on a larger fabrication process (55 nm) than what is possible today. Older hardware becomes cheaper to produce over time thanks to higher yields and cheaper manufacturing, all while the fabrication process shrinks (and let's not forget that the R&D costs have already been recouped). By taking older tech and moving it to a smaller, more efficient process (say 40 nm, 32 nm, or even 28 nm), the power draw will be quite a bit less than today's generation of more efficient hardware, and the chip can still be produced at a fraction of the cost, all without shelling out as much money on R&D.
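A rough sketch of that argument in numbers, assuming first-order CMOS scaling where dynamic power goes roughly as capacitance times voltage squared (every figure below is an illustrative assumption, not a measured spec for any actual part):

```python
# First-order dynamic power scaling: P ~ C * V^2 * f. A die shrink reduces
# capacitance roughly with feature size and usually allows a lower voltage;
# clock speed is held constant here, as a console part's would be.

def scaled_power(p_watts: float, old_nm: float, new_nm: float,
                 v_old: float, v_new: float) -> float:
    return p_watts * (new_nm / old_nm) * (v_new / v_old) ** 2

# Assumed ~110 W 55 nm chip at 1.1 V, reshrunk to 40 nm at 1.0 V:
print(round(scaled_power(110, 55, 40, 1.1, 1.0)))  # ~66 W
```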

Now to the others out there,

Also remember that even though the Wii U might be based on the Radeon 4xxx series, it may not be identical to its graphics-card relative. There could be some extra optimization thrown in, some specialized features, etc. Both the Xbox 360 and the PS3 are based on particular graphics chips, but with minor changes or optimizations that make them unique. The Xbox 360's GPU was largely based on the X1900/X1950 with some X2900 features added in. The PS3's is based on the 7800 (?) but also had some extra features and optimizations thrown in that are not available in the retail graphics card.

And while it would be nice to have a console based on current or upcoming technology, it rarely happens. At best it is about a year behind what is available in the home PC, and that is due to cost and how much time is spent on the console's R&D. If you keep waiting for the latest technology, the console will never be released, or it will cost as much as a PC. A line has to be drawn as to when to lock in the technology and work with what you have. The 360 and PS3 did come close, at the time, to what their home-PC siblings had, but the cost of those consoles at release was more than I was willing to spend. $250-$300 is about the sweet spot for the max I would pay for a console; otherwise, I might as well invest in my PC.

Also, technology today improves much faster than it did even a few years ago, and several times faster than 10-20 years ago. New graphics cards are released in six-month increments, where in the past it could be a few years before a major upgrade came out. And with the way consoles are designed, how long they are expected to stay on the market, and the plan for when a console becomes profitable, it is near impossible for a console to keep pace with the current graphics market. Nintendo is one of the only console companies that has been able to break even or turn a profit on its consoles from day one, all while offering a price that doesn't gouge the customer, and it has been able to do that since the early console days. Not trying to prove they are better with this comment; it's just a little factoid.

Winding down: I do think the graphics package Nintendo will be offering in its system will be more than capable. The graphics it can generate, based on this technology, will be better than the PS3/Xbox 360's, and it shouldn't have any issues handling 1080p (if the RAM specs are correct). Will the new Xbox and PlayStation have more capable hardware? Probably, but they will most likely also be released later (probably a year or so after Nintendo's) and at a higher price point. Nintendo, in the meantime, will have a one-year head start, develop a strong user base and rake in the money. When the competing consoles come out, Nintendo knows people will buy those too and will then have both consoles in the home. In my opinion, Nintendo has stopped competing to be the sole console in the home. They know people will buy the competition as well, so why fight for that? Instead, offer a fun, decent console at a good price point that is better than last-generation technology, and release it a year before the rest. That way, everyone (the companies and the customers) gets to have their cake and eat it too. The cake is not a lie.
 
Granted, it's not the newest hardware, but it's still three generations more advanced and probably 2x faster than the 360 and PS3. Let's not forget the joking about the Wii's motion controls; fast forward four years and we now have Kinect and PS Move!

Nintendo makes some odd choices, yes, but quite frankly they are the only company driving innovation and pushing boundaries in the console market. There are risks involved; sometimes you miss...

BUT at least Nintendo is willing to take a risk once in a while.
 
I think some people are being unrealistic about console graphics. Consoles will never match PC graphics and are unlikely to surpass the Wii U's native output of 1080p any time soon; they rely on current-gen TV displays, not monitors. Further, the current dev kits are underclocked and have no problem keeping up with "current gen" consoles, so the final configuration might well surprise some people. As for DirectX, does anyone really think the majority of gamers will notice much difference between DirectX 10.1 and DirectX 11? Most of them don't even know that the majority of the games they play run at 720p, so just how graphically discerning are they?

Unreal Engine 3 and CryEngine 3 are already running on the Wii U; how much more of a leap in power would it take to create a 'Wow!' factor above that level? The majority of "gamers" might whinge and whine about the Wii's obvious lack of graphics power, but really, how big a difference will be needed for them to spot it beyond the Wii U, where the resolution is "topped out"?
 
[citation][nom]kashifme21[/nom]Power requirements have actually gone up. If people are thinking console makers will be using GPUs like the GTX 580 or some high-end 6000-series GPU, it's not going to happen. At load, a 4000-series GPU would be very comfortable at about 150 watts. A 6000-series GPU or GTX 580 will require anywhere between 300-400 watts at full load. Try fitting one of those GPUs in a small HTPC with console-like airflow and see where the temps get to; I can assure you it won't be pretty. The last thing console makers will be doing is investing in large PC-like casings and then giving the same machine expensive PSUs, just to accommodate today's high-end GPU requirements. Sony and MS at best will be going for entry-level to mid-range parts so that they can start making profits day one. Let's not even count Nintendo in this, as they are not going for entry-level or mid-range; they are going for outdated stuff.[/citation]

You are missing the point. The other posters' idea isn't to get the latest and greatest in there; the idea is to have modern tech in the box. When you know your max resolution and are locked at 30 fps, you don't need a 580 to push out good graphics with DX11. A 550 will do fine, and they will be cheap in a year (they're cheap now). Then, when you consider the bulk discount, and the fact that they can cut it down to a 530-class part to save costs, they can still have a cheap, cool, affordable DX11 card that can push 1080p at 30 fps and will keep people happy for much longer than an already-old card that is only going to get older.
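Some quick frame-budget arithmetic behind the "known resolution, locked 30 fps" point (the target numbers are the ones assumed in this thread, not any console's confirmed specs):

```python
# With a fixed output resolution and a locked frame rate, the GPU's required
# throughput is known in advance, so the part can be sized exactly to fit.
target_fps = 30
frame_budget_ms = 1000 / target_fps            # 33.3 ms to render each frame
pixels_per_second = 1920 * 1080 * target_fps   # fixed 1080p output
print(f"{frame_budget_ms:.1f} ms/frame, {pixels_per_second / 1e6:.0f} Mpixels/s")
```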
But then again, I keep hearing rumors of quad HD, or at least a higher-resolution HD standard, coming late this year or early next, and that could just mess everything up on the console side.
 
I need to quote The Little Prince here:
http://www.angelfire.com/hi/littleprince/framechapter4.html

If I have told you these details about the asteroid, and made a note of its number for you, it is on account of the grown-ups and their ways. When you tell them that you have bought a new game, they never ask you any questions about essential matters. They never say to you, "Is the game fun? Does the story pull you in? Do you enjoy playing it?" Instead, they demand: "How much does it cost? How long is the game? Which version of DirectX does it use? What is the resolution before upscaling?" Only from these figures do they think they have learned anything about it.

If you were to say to the grown-ups: "I saw a beautiful new console with an original new controller and a lot of very fun games to play on it," they would not be able to get any idea of that console at all. You would have to say to them: "I saw a console with $800 worth of parts in it that costs $400." Then they would exclaim: "Oh, what an interesting system that is!"

Just so, you might say to them: "The proof that the little prince existed is that he was charming, that he laughed, and that he was looking for a sheep. If anybody wants a sheep, that is a proof that he exists." And what good would it do to tell them that? They would shrug their shoulders, and treat you like a child. But if you said to them: "The planet he came from is Asteroid B-612," then they would be convinced, and leave you in peace from their questions.

They are like that.

Also, please refer to Extra Credits and their video on "Graphics vs Aesthetics":
http://www.escapistmagazine.com/videos/view/extra-credits/3201-Graphics-vs-Aesthetics
 
[citation][nom]whysobluepandabear[/nom]I'm sure if Nintendo was approaching Nvidia or AMD, they could strike a handsome deal on something similar.[/citation]The GPU _is_ from AMD. The CPU is from IBM, though. Also, there's more than just the GPU making up the cost of the console.
 
[citation][nom]kashifme21[/nom]That would not be possible, because console makers generally like to use proprietary parts. Meaning if MS created RAM for its console, it wouldn't be the same RAM we have available in the PC market today. Consider that the RAM we use in our gaming rigs is the same RAM available to the millions of PCs out there. It would not be the same for consoles. Sony would make its own RAM design, MS would have its own design (and the same would go for GPUs, CPUs, etc.). Upgrades are feasible on PC because those parts are made by the millions and can be used in any PC in the world, not just a gaming PC, which helps lower the cost of developing new parts (for example, when Intel releases a new CPU, they don't do it just for gaming PCs; they do it for the general progress of tech). A console maker doing the same thing would be limited to its own specific market. Hence the cost would be outrageous, with no guarantees of how many people would really upgrade. Also, the parts themselves would be several times more expensive than the equivalent parts on PC (just like how MS charges several times more for the hard drives it releases for its console; now imagine them doing the same thing with RAM, CPUs, GPUs, etc.). In short, it wouldn't be possible unless they made the console out of PC parts itself, which would defeat the idea of making a console.[/citation]

Not true! Remember the N64: you could get a RAM expansion add-on that drastically helped overall system speed, didn't mess with game compatibility, and allowed people to feel smug when they bought it. And let's not forget that it was wildly expensive for the amount of RAM it added, so there was plenty of profit margin in it. Make your own graphics port with a screw-secured slide-out module, like a JetDirect network port on a printer, and hardcore gamers will flock to the console even if the console is crap. The promise of future upgrades, and the demand for such capability, will keep them coming even if the upgrade cards never match the power of PC cards.
 
Well, I WAS going to get a Wii U since my PS3 was stolen, but after this? Definitely having second thoughts. Leaning heavily toward getting back into PC gaming instead. It's time to upgrade the desktop anyway...
 
BLAH BLAH DX9 DX10 DX11

Guess what? Nintendo DOES NOT USE DirectX anyway. Oh, and the PS3? Nope. Even the Xbox does not use the same dev kits as the PC.

So it's time people shut up about DX versions and realize that a die-shrunk 4xxx chip will still be a massive upgrade in the console market. The PS4 and next Xbox will be better, but look how underpowered the Wii was, and yet it sold off the shelves non-stop.
 