Why Nintendo's NX Console Needs To Be x86-Based (Op-Ed)

Nintendo, bring back ROM cartridges! Solid-state technology has advanced enough that this is now inexpensive. No more installing console games or loading screens! Go back to the roots of what console gaming was all about. Add to that an inexpensive x86 solution and a GPU that can do 1080p, and it's a winner.

No matter how cheap ROM chips have become, 50GB of ROM space would cost many times more than a 50GB Blu-ray disc, and they even have 128GB quad-layer discs now. At the massive quantities involved, a dual-layer Blu-ray pressing has to cost less than a dollar per disc, while a cartridge PCB with 50GB of ROM and all the other circuitry needed has to run at least 7 or 8 dollars each in mass quantities. Profit margins are too slim nowadays to give up that much on manufacturing costs.
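To make the margin math concrete, here is a quick back-of-envelope sketch in Python. The per-unit figures are the rough estimates from the post above ($1 disc, $7-8 cartridge), not official manufacturing data:

```python
# Back-of-envelope media cost comparison. The per-unit figures are the
# rough estimates from the post above, not official manufacturing data.
DISC_COST = 1.00    # pressed dual-layer Blu-ray at high volume, in dollars
CART_COST = 7.50    # 50GB ROM cartridge with PCB and support circuitry

UNITS = 1_000_000   # a modest production run for a major title

extra = (CART_COST - DISC_COST) * UNITS
print(f"Extra manufacturing cost: ${extra:,.0f} per {UNITS:,} units")
# -> Extra manufacturing cost: $6,500,000 per 1,000,000 units
```

At a $60 retail price, that $6.50 gap is roughly 11% of every copy's sticker price given up before the game even ships.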

I will agree with you, though: 50GB cartridges would be awesome, and maybe super-large RPGs could come on 100 or 150GB cartridges. If people wouldn't mind paying around 75 dollars per new game, cartridges could be doable.
 
Games also need much higher bandwidth between the cartridge and the CPU and GPU than in the old days, so the cartridge interface would need far more bandwidth than it had in the N64 era.
 
"Nintendo, bring back ROM cartridges! Solid-state technology has advanced enough that this is now inexpensive. No more installing console games or loading screens! Go back to the roots of what console gaming was all about. Add to that an inexpensive x86 solution and a GPU that can do 1080p, and it's a winner."

Don't forget SanDisk just developed a 200GB microSDXC card that will officially go on sale quite soon. I know the 128GB card in my Galaxy Note 4 is getting swapped for a 200GB ASAP.

There is simply NO reason to do this. A far more sensible and versatile approach is what Sony uses on the Vita: generic flash memory that can be mass-produced and isn't committed to a particular game until very late in the process.

Mask ROM was for a long time the only option that met the needs of the console business, but it came with all kinds of hassles, especially the commitment required for production runs, because changing masks was expensive and slow. One of the reasons optical media was so attractive wasn't just the huge capacity gain but also that a CD production line could stop on a dime and switch what content was being put on the disc. Flash memory cards have the same advantage.

But even those are a dying breed. My experience with the Vita has shown it is far more convenient to have one high-capacity memory card storing a whole library. The Vita's memory cards are severely overpriced because of the problem of convincing retailers to sell something that will cut them out of future software sales: you have to give them a very good profit margin. More standard flash memory formats are far more affordable: 128 GB microSDXC cards are frequently down to $80 now and should get a lot cheaper toward the end of the year. Currently the only thing stopping the New 3DS from supporting such a card is Nintendo's refusal to license exFAT or implement an open-source file system and give the 3DS the ability to reformat the cards. Right now, using anything larger than a 32 GB card requires reformatting it as FAT32 and accepting some risk.
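For anyone wondering where the 32 GB line comes from, it is a spec boundary rather than a hard FAT32 capacity limit. A small sketch of the relevant numbers, from the SD and FAT32 specifications as I understand them:

```python
# Why >32 GB cards "require" exFAT: the SD spec defines SDHC (FAT32)
# only up to 32 GB; larger cards are SDXC and ship formatted as exFAT.
# FAT32 itself handles much bigger volumes. Its hard limit is on
# individual file size, not card capacity.
FAT32_MAX_FILE = 2**32 - 1    # bytes per file, just under 4 GiB
SDHC_CEILING = 32 * 10**9     # SDHC/FAT32 capacity boundary in the SD spec

print(f"FAT32 max single file: {FAT32_MAX_FILE / 2**30:.2f} GiB")
print(f"SDHC capacity ceiling: {SDHC_CEILING // 10**9} GB")
```

Reformatting a 128 GB SDXC card as FAT32 works at the filesystem level, but it steps outside the SD spec, which is the risk mentioned above.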

Increasingly, software will only be sold as downloads. Consoles are making that transition, with handhelds farther along, but both still trail phones and tablets, where downloads are the sole option.
 
The issue for me with the Wii U is that controller. It's just way too big! I'm just not interested in something that is as big as my tablet but has a fraction of the functionality 😀
 
While PowerPC is on the downslide and might be worth abandoning, x86 isn't necessarily the best option for Nintendo. Though dominant on desktop PCs, x86 carries a lot of baggage from the early days of the design, when many current features weren't anticipated and backwards compatibility had to be maintained, even if it meant turning a "bug" into a "feature". ARM processors are a cheaper, viable alternative that is WIDELY developed for, especially as new designs push the performance envelope more aggressively than the typical power-efficient ARM parts. The power saved on the CPU also leaves more of the TDP budget for graphics.

Depending on whether you believe Intel's propaganda (they're trying to sell those Atoms!), die shrinks and new power-saving features have supposedly made any efficiency gap between x86 and ARM very small nowadays. It's just that we tend to compare the power-hungry monsters in our desktops and notebooks with the super-efficient (but comparatively weak) ARM chips in our tablets and smartphones, when a more apt comparison would be with one of Intel's super-low-power Atoms.

It's certainly debatable, but Intel can't keep that up. The gap between manufacturing processes is shrinking, and without that advantage x86 is clearly inferior. And if they use an AMD design to more closely resemble the PS4's and Xbox One's CPU, they don't even get the small-die advantage.
 
"The issue for me with the Wii U is that controller. It's just way too big! I'm just not interested in something that is as big as my tablet but has a fraction of the functionality 😀"

Have you actually tried it? It's lighter than almost any tablet I've tried and quite comfortable to hold as a controller. Using the touch screen does tend to require resting it on my lap, though that's not a problem for party games where you mostly use one control scheme or the other, or when the game supports playing entirely on the controller, since I usually play those on the bed anyway.
 
I could be wrong, but I don't think the new console will be a handheld or an OUYA-like competitor. The New Nintendo 3DS line will likely be Nintendo's competitor in portable gaming for at least the next year, if not two. As for an OUYA-like device: OUYA itself was not a commercial success, and there are a large number of Android devices that already offer that type of experience. Nintendo has never been one to follow other companies and tends to ship fairly unique products, so it doesn't seem likely they would pursue the home Android gaming market.

I think you are wrong. I think they will have an Amazon Fire TV-like app ecosystem, where you can only use apps from their own store. But their winning strategy is something that is welcome in almost every gamer's home: real Nintendo games with Nintendo's characters. Not only will they be able to offer those games on a Google Play-style store, they will also be able to offer other games that are already available through Google Play.

This would give them massive leverage over what is currently available, and if they can really nail it at a $100 price point, it will undoubtedly be another Wii, and perhaps the "OUYA" that everyone was expecting. Imagine something like this having Wiimote compatibility with older Nintendo games. It would be FIRE.
 
Please, no x86 disasters yet again.
One of the reasons the new consoles cannot compete with PCs is the move to more ordinary x86 hardware.

If Sony had invested more into Cell- or PPC-based solutions, the PlayStation 4 would be a lot more powerful than it is right now.
The PS3 was genuinely powerful at release, so much so that organizations (like the US Air Force; just Google "PlayStation 3 cluster") started buying PlayStations in bulk to use as computational servers.

The only reason Sony moved to x86 is that AAA developers complained about the effort needed for development and porting on the software side (which really means they needed to invest more in programmer training, or find better programmers), and those complaints piled up at Sony's door.
I think they should have stuck with the Cell architecture and invested more time into parallel programming, as they had started to with the SPU tech.
 
Actually, 2017 would be quite ideal for a new x86-based Nintendo console. The 14/16nm FinFET processes come to market this year from Samsung and TSMC; they will reduce power consumption, and they will be much cheaper in 2017 than they are today. 2017 would also allow a newer generation of HBM memory in the console (the same type AMD's 390 flagship GPU will use this year and Nvidia in 2016): too expensive now, but maybe reasonable in 2017.
It would let Nintendo build a console a bit faster than Sony's or Microsoft's for less money. I am not expecting a super-fast Nintendo console; they will save money anywhere they can. Also, over this year and the next they can see whether Intel has anything to offer the console market, but I still believe a next-gen AMD APU with a next-gen graphics engine and new HBM memory would be the cheapest way to make a gaming console, one that would also support full DX12 and Vulkan if they want. In 2017 the Wii U would be 4.5 years old, so it would not be too harsh to scale back its support. But a new Nintendo console now would not be the best timing, IMHO.
 


And here you see that the games make the console, not the hardware.
Cellphones could be supercomputers for all I care, but the games made just for them suck and aren't even worth getting when offered for free, much less for the dollar they normally cost.
Yes, but it is MUCH harder to make a good game that works on underpowered hardware than a good game on hardware at least equivalent to what's in my E5-571-5552 laptop.
 
Since day one, Nintendo has never had the best hardware. What sells Nintendo is their fun, supremely playable games.

I never get bored of playing a Nintendo game, even on the oldest machine.

Having said that, Nintendo needs to add a new hero. It is about time!

We need something new, Nintendo. We want the feeling we had when Kirby first appeared, and Zelda, and so on...

We need a new hero in the Nintendo universe. Work on it!

And please, this time make a more powerful machine. Go for something like an Nvidia GTX 970; they draw a lot less power than AMD's cards.
 
What Nintendo SHOULD have done with the Wii-U:

1) Use a basic controller and put the savings towards better hardware in the console box (similar to PS4).

2) Link to their portable 3DS.

This would have given the machine better performance. I think they overestimated how much people wanted a controller with a large screen in it. It's cool in theory, and I'm sure some people like it, but not only do some people HATE it due to its weight and size, the sacrifice, again, was processing power.

Connecting to the 3DS would also have driven sales of that device. (Maybe they can do that anyway, but my point is there should not have been a tablet-style controller.)
 


nooooooooooooooooooooo pleaaaaaaaaaaaaaase

We don't want the rubbish 3DS any more. We want a NEW handheld with a true capacitive AMOLED touch screen,
a Tegra X1 chip inside, and 3GB of RAM; that's cheap today.

I don't know why Nintendo isn't making a new handheld 🙁 No more 3DS, PLEASE.
 
"its certainly debatable, but Intel can't keep that up. The gap between manufacturing processes are shrinking, and without that advantage, X86 is clearly inferior. And if they use the AMD architecture to make it resemble the ps4 and xbone's cpu more, then they don't even have the small die advantage"

Not really what I meant: with every die shrink and new generation of x86 CPUs, the x86-to-RISC decoder stage becomes an almost infinitesimal part of the CPU, in both die area and power draw.

If anything, Intel should be getting closer and closer to ARM's efficiency as CPUs continue to advance.
 


I actually agree with some of that: you could usually count on a new generation of consoles to be at least slightly more capable at gaming graphics than a top-end gaming PC of the time. But this generation, built on basically standard PC hardware, there was no leapfrogging of PC gaming. It used to be IBM, Motorola, MIPS, or Hitachi supplying CPUs specifically designed to be very powerful at certain computations that happened to be very effective for games, while the equivalent PC CPU was designed to be good all-around and thus, at least initially, could not touch the consoles' CPUs in the areas where they excelled.

But the PS4 and Xbox One use a general-purpose PC processor and video card, both of which were quite underpowered compared to an up-to-date gaming PC at launch.
 


You can always design a co-processor and add it to the board to speed up gaming, something like the PhysX and Tesla chips, etc.

But the main CPU should be x86, for cheaper game development. Or maybe move to ARM chips with 8 cores or so...

 
"you can always design a co-processor and add it to the board to speed up Pc gaming. something like PhysX and Tesla chips etc ..."

Sure, but remember, it's always a matter of cost.
 


Unfortunately, this isn't actually how game consoles work. Let's just focus on Nintendo, Sony, and Microsoft, not Atari and Sega.

Nintendo's NES came out in the mid 1980s, while the processor inside it, a MOS 6502 derivative, was first created in 1975, and it ran at less than 2MHz. Contemporary PC processors were Intel's 80286 running at 6MHz and the 80386 running at up to 40MHz. Not only did those processors have wider data paths, they had much higher clock speeds. The SNES used a 16-bit variant of the 6502 that could run about 20MHz, but Intel at the time had processors running over 100MHz. Long story short, the performance was far below PC levels.

The Nintendo 64 was advanced for its day, but still far slower than the AMD Athlon and Pentium III CPUs of the same period, and many desktop GPUs offered more graphics performance as well. The performance gap between the GameCube and PCs of its era was even larger than the Nintendo 64's, and that gap has only grown over time as Nintendo's consoles have advanced rather slowly compared to PCs.

Sony and Microsoft followed a similar path up until the Xbox 360 and PlayStation 3. Those were the only home consoles to launch with performance genuinely above common computers of the day, able to compete with, if not completely outperform, PCs of the same period.

The home console market does not aim to offer the best gaming experience, and shouldn't be expected to beat desktop PCs of the same era, let alone keep doing so over time. Game consoles primarily aim to deliver affordable, cost-effective gaming that performs on par with mid-range PC gaming setups. From that standpoint, Microsoft and Sony hit the target head on, and that's the kind of performance we see from them. They didn't overbuild the consoles like the previous generation, so hopefully they won't try to stretch the same consoles over nearly a decade again, but they did meet the common expectations of a new gaming console.

Nintendo's problem is simply that they undershot this mid-range gaming experience, just like with the Wii. If they could offer better price points for the console, this would be less of a problem, but so far they cannot. Either strategy can work for Nintendo, but they need to either bulk up performance and ease of development to compete with the other consoles, or lower costs significantly enough to sell a lot more units and attract developers that way.
 
You are falling into the MHz trap here: just because a certain CPU has a higher clock rate does not mean it's automatically a superior gaming CPU. The x86 line has always been about all-around performance plus backwards compatibility, whereas console designers in the past went after CPUs that were both cost-effective and powerful for gaming purposes.

Look at the 1980s NES: based on its low clock rate, games on this system should have been trounced graphically by 8088/286 games of the 80s. But thanks to the nightmare of coding efficiently for DOS, with its millions of hardware combinations and non-standard drivers, any overall computational advantage an 8088/286 system may have had couldn't be fully exploited. Plus, while the x86 chips of the time had the biggest software libraries, it was well known that if you wanted to build a multimedia system, you went with Motorola, Zilog, or MOS. In MOS's case, it was a very efficient (for games), low-cost CPU. Intel's chips were always designed for business first.

Let's move further on: the Commodore Amiga had a relatively slow 7MHz 68000 (a chip available in 1980!), and yet the IBM PC couldn't touch an Amiga in graphics until VGA games became mainstream and game programmers started using DOS/4G to escape the nightmarish segmented memory architecture of x86. That was around 1990, and it required a 386 (a true 32-bit chip).
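For readers who never suffered through it, the "segmented memory" complaint refers to real-mode x86 addressing, where a 20-bit physical address is built from two 16-bit values. A minimal sketch:

```python
# Real-mode x86 address translation: physical = segment * 16 + offset.
# Many different segment:offset pairs alias the same physical byte,
# which is part of what made pre-DOS/4G programming so awkward.
def real_mode_address(segment: int, offset: int) -> int:
    """Return the 20-bit physical address for a segment:offset pair."""
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at 1 MiB

# Two different pairs reaching the same byte (the CGA text buffer):
print(hex(real_mode_address(0xB800, 0x0000)))  # 0xb8000
print(hex(real_mode_address(0xB000, 0x8000)))  # 0xb8000 again
```

Because of that aliasing, and because any single segment only spans 64 KB, pointer arithmetic and large buffers were constant headaches until 32-bit flat-mode extenders arrived.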
 
The choice of CPU architecture is not the discussion to be having. See GDC 2014's "Efficient Usage of Compute Shaders" by Alexis Vaisse. Besides, developers have been working with PowerPC for the past 10 years.

The development environment (IDE/compiler/API) and the GPU are what's going to matter, and it's my belief*, along with lower sales, that that is why third parties don't develop for the Wii U and won't for the next one.

* Based on running the same code on NDS, DC, Windows, Linux, OSX, Android, JavaScript, and the RPi. It was always getting the code to compile and/or making the graphics work correctly that caused the issues.


 
"The SNES used a 16-bit variant of the 6502 that could run about 20Mhz"

Not in the SNES it didn't! In the SNES it was clocked very low, something like 3.58MHz, IIRC. Not that clock speed is everything, but let's face it: it was slow. If you know of any high-speed variants, that's swell, but I doubt anything near 20MHz was released before the Super Famicom hardware was finalized in Japan. The graphics chip was great, but for late-generation games to really shine they had to rely on cartridge-mounted processors to take load off the CPU and/or add new features and effects.

Considering its design was much newer than the Genesis's, it's no surprise it had better graphics, but I feel their CPU choice was a mistake no matter what. They could have just used a Motorola 68K like the Genesis and run it at a higher clock (9-10MHz, for example). I mean, they didn't even use the backwards compatibility of their chip, so what was the point? At least with the Genesis you could get a converter and run Master System games (I've done it; it works). To be fair, Sega didn't push this feature like they should have at launch; it would especially have made sense in the UK/EU regions, where the Master System sold well.
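As an aside, the 3.58 MHz figure isn't arbitrary: like most NTSC-era consoles, the SNES derives its clocks from the NTSC color subcarrier. A quick check of the commonly documented numbers (my derivation, worth double-checking against hardware docs):

```python
# The SNES master clock is 6x the NTSC color subcarrier, and the CPU's
# fastest memory cycle is master/6, which lands right on ~3.58 MHz.
# Slower "SlowROM" accesses run at master/8, about 2.68 MHz.
NTSC_SUBCARRIER_HZ = 315_000_000 / 88   # exact NTSC definition, ~3.579545 MHz

master_clock = 6 * NTSC_SUBCARRIER_HZ   # ~21.477 MHz
cpu_fast_clock = master_clock / 6       # the "3.58 MHz" cited above
cpu_slow_clock = master_clock / 8       # ~2.68 MHz

print(f"CPU fast clock: {cpu_fast_clock / 1e6:.6f} MHz")
print(f"CPU slow clock: {cpu_slow_clock / 1e6:.6f} MHz")
```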
 
"Nintendo, bring back ROM cartridges! We have advanced enough now in solid state technology that this is inexpensive. No more installing console games or loading screens! Go back to the roots of what console gaming was all about. Add in to that an inexpensive x86 solution and gpu that can do 1080p and its a winner."

You do know what ROM means, right? (Read-Only Memory.) It means that if a game is riddled with bugs (and plenty of games over the past year were), there would be no way to fix them, because there would be no way to push an update to them the way they can now.

Just check out packs of game ROMs sometime and you will find there were updated versions of the game sold later, and early buyers didn't even know an updated version existed.
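A concrete way to see this in a ROM collection: revisions of the "same" cartridge hash differently, which is essentially how ROM databases tell a "Rev A" reprint from the launch version. The byte strings below are stand-ins for real dumps:

```python
# Detecting ROM revisions by hashing: two dumps of the "same" game with
# different content produce different fingerprints. The data here is
# hypothetical; substitute the contents of your own dump files.
import hashlib

def rom_fingerprint(data: bytes) -> str:
    """Return a short SHA-256 fingerprint of a ROM image."""
    return hashlib.sha256(data).hexdigest()[:16]

rev0 = b"GAME DATA v1.0 ... known sprite glitch ..."
rev_a = b"GAME DATA v1.1 ... sprite glitch fixed ..."

print(rom_fingerprint(rev0) == rom_fingerprint(rev_a))  # False: different revisions
```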
 