IBM Confirms that Nintendo's Wii U Has a Power-based CPU

Status
Not open for further replies.

ikaruga

Distinguished
Jan 17, 2011
44
0
18,530


You don't even seem interested in reading a little and learning. :(

As I already said, the response time is just fine. IPS is a different technology, and the GtG (grey-to-grey) time (what manufacturers quote as the response time) has a different meaning there. The ghosting is different too: it's more like what you get when you turn on motion blur in Crysis on a CRT monitor, for example. It's hard to describe, but it's just a little clean blur, with none of the black or white overshoot that makes any >4ms TN panel look horrible with anything moving.
W-LED has a lower color gamut and other setbacks, so it's not just tech snobbery, it's common sense. :)

This is an example of how you measure input lag: http://www.tftcentral.co.uk/articles/input_lag.htm

This is one of the many reasons why (Edge-)LED falls short of CCFL: http://www.tftcentral.co.uk/articles/pulse_width_modulation.htm

and this is some more reading if you want to be better prepared next time: http://www.tftcentral.co.uk/articles/panel_technologies.htm

I also recommend prad.de; imho they are one of the best on the web.
I hope it helps.
 

ikaruga





The Samsung SM2233RZ (one of the most successful 120Hz gaming monitors) has an average input lag of 15ms. So at 120Hz you lose at least one full frame for sure, and sometimes two (because 1000/120 = 8.33ms per frame, which is less than 15ms, and that 15ms is only the average, not the worst case).
The Dell 2209WA, which has the same native resolution and size and is from the same period, has only 9ms average input lag (and you can run it at 77Hz to get an even lower delay while keeping full RGB coverage) .... but the new U2312HM has less than 1ms input lag (though that's a new LED model and I don't really like its picture quality):

http://www.tftcentral.co.uk/images/dell_u2312hm/input_lag_1.jpg
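The frame math above can be sketched in a few lines of Python (a hypothetical helper of my own, using the lag figures quoted above; it just divides the input lag by the frame time and rounds up):

```python
import math

def frames_of_lag(input_lag_ms: float, refresh_hz: float) -> int:
    """Number of whole refresh frames a given input lag spans."""
    frame_time_ms = 1000.0 / refresh_hz  # 8.33ms at 120Hz, 16.67ms at 60Hz
    return math.ceil(input_lag_ms / frame_time_ms)

print(frames_of_lag(15, 120))  # SM2233RZ at 120Hz -> 2 frames lost
print(frames_of_lag(9, 60))    # 2209WA at 60Hz -> 1 frame lost
```

That's why the same millisecond figure hurts more at a higher refresh rate: the frames are shorter, so a fixed lag spans more of them.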

ps: sorry for the double post, had to switch computer.
 

mazty

Distinguished
May 22, 2011
176
0
18,690

It's scenario-dependent: you don't need more than 60fps, so allowing 120fps is pointless and is really only used in 3D monitors. 2ms is the gamer response time, as no one wants a visual delay, and 120Hz is less beneficial than 2ms.

Vapour chambers would also knock the price up a lot. Considering PCs don't have cooling issues, it's far more to do with sloppy engineering design than with the cooling methods used being inadequate.

Its CPU performance has been stated to be weaker than the PS3's and 360's, and it's rumored that its GPU is an E6760, which puts it in line with an 8800 GT, which is 6 years old. I'm saying this is bad because they are then charging you $300 for it when you can get consoles that do practically the same for less. Also, you are bringing in tech that has no effect on the gaming experience, which makes your comparison completely absurd.
 

mazty


Doesn't change the fact that they are far more expensive than standard TFT + LED-backlit screens. It's like arguing everyone must be in a Ferrari if they want to drive somewhere... Good, but not always practical. However, I'm saying that if a company releases a new console, it should at least have relatively new hardware (read: gaming hardware) and not 6-year-old hardware.
 

ikaruga



You are mixing up input lag with panel response; this is becoming really entertaining, tbh. The 2ms G2G time is pretty far from what you actually get in real life, but let's assume (only for you) that it's the real deal.
If you run your monitor at 60Hz, you only get a new picture every 16.67ms. For that time, the same picture (frame) is displayed on your screen no matter what. After that, when the new picture has been processed and arrives, your panel needs time to respond and give the crystals a new energy state; that's the response time.

Input lag is the time needed to process the frame coming from the computer. So if we assume that the game code itself is realtime (just for this example), it's the time between when you turn your wheel right in a car racing game and when you actually see the car turning on your screen (minus response time, ofc).
The input lag and the refresh rate (Hz) are what actually matter for gamers, and the 2ms is what doesn't. The 2ms is only needed because TN panels look utterly horrible with anything more: they ghost, they overshoot, they break the picture with anything moving on the screen.
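The latency chain just described can be modeled very roughly like this (a toy sketch with illustrative numbers, assuming the game itself renders in real time; nothing here is a measurement):

```python
# Toy model of the display latency chain: wait for the next refresh,
# then the monitor's frame processing (input lag), then pixel response.

def perceived_delay_ms(refresh_hz: float, input_lag_ms: float,
                       response_ms: float) -> float:
    """Worst-case delay from a new frame arriving to a visible change."""
    frame_time_ms = 1000.0 / refresh_hz  # 16.67ms at 60Hz
    return frame_time_ms + input_lag_ms + response_ms

# 60Hz panel, 9ms input lag, 2ms G2G response:
print(round(perceived_delay_ms(60, 9, 2), 2))  # -> 27.67
```

Note how the 2ms response term is the smallest part of the total: the frame wait and the input lag dominate, which is the whole point.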



TFT monitors (both LED and CCFL) based on e-IPS panels are equal to or cheaper than gaming monitors, and that's "relatively new" gaming hardware, because you get a custom tablet as a controller with its own high-res touchscreen display.

I highly recommend you stop embarrassing yourself, for your own sake.
 

tipoo

Distinguished
May 4, 2006
1,183
0
19,280
[citation][nom]apone[/nom]90 nm ? Wow Nintendo, are you still living in 2006?[/citation]
The chip in the Wii U will be 45nm; we know that much for sure. The article just means the architecture MAY be based on Broadway, which was 90nm, not that the Wii U will be 90nm.
 

luciferano

Honorable
Sep 24, 2012
1,513
0
11,810


Your first paragraph is very flawed. Need is absolutely irrelevant here. The point is what is better, and 120FPS with 120Hz is noticeably smoother than 60FPS with 60Hz, at least for me, even in 2D gaming. I don't game in 3D, and I can quite literally tell the difference between 60Hz and 120Hz gaming. It's like the difference between 45FPS and 60FPS, at least to me.

Higher end hardware would knock up the Wii U's price too. Your point is moot as a result of that.

Its CPU performance has been stated to be weaker than the Cell and Xenon by people who coded for the Cell and Xenon. That doesn't mean that it's actually weaker. For all we know, they simply don't know how to code for it properly yet. Heck, we still can't fully utilize the Cell years after it came out for gaming, so this is not only entirely plausible, but likely. You can't honestly expect developers to do the greatest job of developing for a platform that is probably extremely different from what they've been working with for almost a decade, can you?

What's absurd is your double-sided logic. I didn't bring any tech that was not relevant to gaming into this. IPS panels offer a better picture than most TN panels at a similar price, and vapor chamber cooling could reduce failure rates among current consoles. That makes both of them not only relevant, but excellent examples of my argument. My entire argument thus far has shown that your view of the Wii U is beyond absurd in effectively every way that I can see. Not only do you accuse Nintendo of using outdated hardware and demonize them for it, despite the fact that pretty much all products, even products related to consoles, do the same, but you don't actually know the performance characteristics of the console! We don't actually know much of anything about it. We have rumors, but that's it.

The most that we really have to go by is that enough sources say that it's at least some sort of Power-based CPU that is probably manufactured with a 45nm process.
 

mazty



Okay, sorry, I mixed up response time; I thought it was the same as input lag. However, a) both are important, and b) I've no idea where you are getting your info from, but IPS still costs considerably more than LCD + LED, making your point redundant.
 

mazty

Distinguished
May 22, 2011
176
0
18,690

Need is 100% relevant. If you are playing a racing game or an FPS, you will notice frames more than if you are playing a platformer.

No, my point is not moot, because the Wii U's profit margin is clearly too high. What the hell is double-sided logic? Chip cooling is not going to affect my gaming experience and is therefore irrelevant to the conversation. Using an 8800 GT instead of an HD 5770 will. The monitor is down to the consumer, not Nintendo, so stick to the point. Your argument is slapdash and scattered, without a semblance of thought that holds up to basic scrutiny. Mentioning vapour chambers? Well, the Wii U isn't made out of diamond either, but I don't think anyone cares, as it won't affect the gaming ability of the console.
We also have sources saying it uses the E6760, which is laughably weak to be considered a modern gaming chip. Weak CPU, weak GPU, but that hasn't stopped kids pre-ordering it by the dozens...
 

luciferano



Need is not relevant, because you are misunderstanding the meaning of the word. You want higher FPS/Hz in games where it matters, but you don't need very high FPS/Hz in such games. I may want 120FPS on a 120Hz display when I play an FPS game, but that doesn't mean that I need it to play. I most certainly don't even need 60FPS on a 60Hz display to play, although I don't enjoy gaming as much if it's not running very smoothly.

Your point is moot because until we get some concrete specifications of the Wii U, you don't know what it actually has. You only have rumors and rumors are worse than having nothing.

Better cooling means fewer cooling-related failures and lower noise for a given level of cooling. To say that this doesn't matter for a game console is insane. That's like saying that it doesn't matter if my graphics card and/or CPU and/or motherboard fails in my PC because of faulty coolers (something that is actually known to happen with a lot of poorly-designed coolers for motherboard chipsets and graphics cards). Vapor chambers are not nearly so expensive that they couldn't be used, and they could help with these issues. They are literally just an improved form of heat pipe.

Define modern gaming chip. It doesn't even need to be nearly as fast as a PC gaming card to get comparable performance if the devs optimize the game properly. It's entirely possible that if the Wii U does have graphics hardware comparable to the 8800 GT, it could still give quality more like the 5770/6770 if used properly.

Beyond that, the Wii U is not supposed to be an incredibly detailed, high-quality graphics system. Its gaming experience is based around decent graphics with a more unique way of playing games.

Again, you don't even have any information to prove that the CPU is weak. We have the word of developers who have worked on a very different platform for almost a decade and probably don't have much of a clue yet as to how to optimally code for the Wii U's CPU, something that could be rectified by launch time for all we know. In fact, looking at their own words in the supposed leaks, the devs (assuming that it's even them and not some scam) specifically say that they haven't figured out how to optimize for the CPU properly.
 

ikaruga



I bought my 22" U2209WA for $199, and I paid $169 for my 23" U2311H (both were new from retail, ofc). I also paid $370 for my Korean 100Hz 27" Catleap IPS. Tell me about any "gamer" monitor at that kind of price, please.
 

luciferano



A good non-IPS is more expensive than that one too.
 

mazty


It depends on whether we are talking about casual gamers or actual gamers. I won't play an FPS if it's not giving me a steady 60fps, just as some people wouldn't ever race in a Pinto. Cooling does not directly relate to the game; drop that point, it's getting old.
A modern gaming chip is one which gives you gaming performance circa 2010 with DX11, so something like the HD 5770: 1080p at high-ish settings with DX11 running (think AvP DX11 at 1080p).
Considering the form factor of the Wii U, I doubt it will have performance close to the 5770, simply because chip miniaturization hasn't come along that much yet.

Regardless of what its aim is, it is still giving you old tech for a high price. The info about the CPU is from whichever dev criticised it, and again, considering the form factor, I wouldn't be all that surprised if it's true.

Listen, I like gaming a lot. I'm not trying to hate on Nintendo, but I'm sick of seeing gaming turned into a giant gimmick, whether it's through a stupid controller, by re-releasing the same game year after year (read: Mario, CoD, etc.), or by thinking that watering down games is the way forward. There's a lot of potential for gaming, which was seen in the last gen with games like Ghost Recon (full squad planning and controls), Fable (great personal customisation and an interactive world), and Morrowind (do whatever, whenever), and this gen it has all been pissed away. One thing that hasn't helped is that the hardware hasn't been good enough to give devs the option to fully expand on the last gen, and the last thing I want to see is more crappy, limiting hardware released, as ultimately it hurts gaming as a whole.
 

mazty


A good non-IPS is about £120 here, and a good IPS is about 50% more expensive.

As you seem knowledgeable on this, how does the response time affect IPS displays? Does it have to be just as low as LCD?
 

ikaruga



This is hopeless. I want to help you gain more knowledge about your interests (assuming you are here because you like computers and technology), but you do nothing but troll back.

I give up. You win! Hf and Bye \O
 
G

Guest

Guest
I heard a rumor that Sony's turning the PS4 into a competitor to the Apple TV: installing the hardware they developed for their 4K-graphics console into the 4K TVs they're trying to sell, making an OLED "Smart TV" that has twice the resolution of high-def, all the media features of PSN, and the ability to play Blu-ray games and movies at a resolution that actually is better than the PS360 consoles.
 

mazty


Lolwut, explain. A basic IPS is more expensive than that basic LCD, and a good IPS is more expensive than a good LCD. How is stating what a store has trolling? That's a piss-poor bail-out, sorry.
 