News Intel reportedly lost PlayStation 6 chip design contract to AMD in 2022 — the $30 billion deal was up for grabs

I don't think this story is true at all. We learned our lesson on the original Xbox. Being in a game console is a crappy deal. You don't make any money at all.
You’re right that margins in game consoles can be very thin, but losing out on this contract could still be a significant blow to Intel. Given the high stakes of a $30 billion deal, the loss might impact Intel's Foundry division’s profitability and reputation more than we might initially think. Even with low margins, securing such a major contract could have provided crucial revenue and stability for Intel during its restructuring phase.

Intel has made its fabs more accessible through Intel Foundry Services (IFS), and while the specifics might vary, there's no reason to doubt the scenario. Similarly, Sony has shown a willingness to innovate with each new PlayStation generation, as seen with the PS3 and the rumored custom upscaling technology they are working on.
 
I don't think this story is true at all. We learned our lesson on the original Xbox. Being in a game console is a crappy deal. You don't make any money at all.
You are the native platform for a lot of AAA games being released for the next half a decade.
I do dare say that if it wasn't for the consoles, AMD would have long closed their graphics department, since no one would have bothered making games work on their tiny market share GPUs.

And it's not unlikely AMD is selling them chips at cost to keep that going.
 
You are the native platform for a lot of AAA games being released for the next half a decade.
I do dare say that if it wasn't for the consoles, AMD would have long closed their graphics department, since no one would have bothered making games work on their tiny market share GPUs.
Um, I thought Radeon supported DirectX, just like Nvidia's GPUs? So at the least they would still be able to run games, maybe not as well (e.g., making use of specific Radeon features), but games should still run. So this comment is kind of moot.
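To illustrate what I mean, here's a rough, untested sketch of my own (not from the article): a Direct3D 11 game asks Windows for a hardware device without ever naming a GPU vendor, so the same code path runs on a Radeon, a GeForce, or an Arc card; vendor-specific extras like FSR or DLSS sit on top of that.

```cpp
// Minimal sketch (untested): a D3D11 app requests "a" GPU, not a specific vendor's.
// The same call succeeds on Radeon, GeForce, or Arc as long as the driver
// exposes the requested feature level.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL wanted = D3D_FEATURE_LEVEL_11_0;
    D3D_FEATURE_LEVEL got;

    // nullptr adapter = "use the default GPU, whoever made it"
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        &wanted, 1, D3D11_SDK_VERSION,
        &device, &got, &context);

    if (SUCCEEDED(hr)) {
        printf("Got a Direct3D 11 device; the vendor doesn't matter to the API.\n");
        context->Release();
        device->Release();
    }
    return 0;
}
```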
 
You’re right that margins in game consoles can be very thin, but losing out on this contract could still be a significant blow to Intel. Given the high stakes of a $30 billion deal, the loss might impact Intel's Foundry division’s profitability and reputation more than we might initially think. Even with low margins, securing such a major contract could have provided crucial revenue and stability for Intel during its restructuring phase.

Intel has made its fabs more accessible through Intel Foundry Services (IFS), and while the specifics might vary, there's no reason to doubt the scenario. Similarly, Sony has shown a willingness to innovate with each new PlayStation generation, as seen with the PS3 and the rumored custom upscaling technology they are working on.
The "rumoured" upscaling tech used in the PS5 Pro was reportedly simply handed over to AMD for them to use. I assume this is because AMD engineers and Sony engineers work collaboratively together, so I assume AMD also (by definition) designs a custom design, but that it mostly AMD to begin with so AMD must by definition share technology and design details with Sony for Sony to be able to choose WHAT parts to customize and what to leave as "up to AMD".

The PS5 Pro is done and has been done for some time (months); they are building stock, so expect more and more leaks as we get closer to launch, including people literally stealing units from warehouses, etc.

As for Intel, yes, winning this would have been a massive boon for them, but the problem Intel had at the time is that their graphics drivers were poor and they were promising the kinds of things they had essentially never delivered before, at all, ever: modern, decently performing, stable graphics within a sensible power budget and at a sensible cost.

Sony also wanted to be able to customize the design like they do with AMD (and, let's be totally honest here, with TSMC as well), and Intel just couldn't prove that this is what they could and WOULD do. Remembering the chaos that was CELL (perfectly good chips, just not for Sony at that time), Sony would also need an actual REASON to change from AMD / TSMC, and Intel was simply not able to offer one, even including taking a LOSS on every chip to begin with. Intel's cost of production across everything depends on massive orders like this over several years, so a LOSS to begin with would have given them the revenue throughput for investment in expansion, which in turn would make Intel more competitive with TSMC. The loss of (or rather, the failure to gain) this contract really affected "Intel Foundry" and also Intel, who then laid off employees working on this (bettering their graphics!).
 
You are the native platform for a lot of AAA games being released for the next half a decade.
I do dare say that if it wasn't for the consoles, AMD would have long closed their graphics department, since no one would have bothered making games work on their tiny market share GPUs.

And it's not unlikely AMD is selling them chips at cost to keep that going.
I believe that AMD has actually made a profit on every one of the PS5 chips it (via TSMC) has made. I could be wrong; it is fairly common for consoles to make a loss on the hardware in their first year, but a node change (typically) reduces that cost and then turns the chip into a profit, and this node change was always baked into the costings over the years of production.

As far as I am aware, AMD and Sony have both made a profit on every PS5 chip / console sold, which is in itself a success over previous models / generations.
 
I believe that AMD has actually made a profit on every one of the PS5 chips it (via TSMC) has made. I could be wrong; it is fairly common for consoles to make a loss on the hardware in their first year, but a node change (typically) reduces that cost and then turns the chip into a profit, and this node change was always baked into the costings over the years of production.
Towards the end of 2022, they already had a die-shrink down to TSMC N6. This improved energy efficiency, which then allowed Sony to optimize the PSU & cooling solution, further reducing costs and shipping weight. Unfortunately, the first version to feature this SoC still had the original case, which is huge.

I waited until this version shipped, since I wanted a version that was cooler and quieter than the original (plus, availability was still poor until mid-2023). I actually bought my disc version when the price dropped to $450. They hadn't released the slimmed-down version, by that point, but I don't mind as I think it's a little uglier, given that I partly wanted the PS5 for its UHD disc player and that looks a bit like a tumor, bulging out of the slim case.

Speaking of its big & heavy design, I guess I'm most surprised how chunky and heavy the controllers are. Massive difference, compared to the PS3 controllers.
 
Um, I thought Radeon supported DirectX, just like Nvidia's GPUs? So at the least they would still be able to run games, maybe not as well (e.g., making use of specific Radeon features), but games should still run. So this comment is kind of moot.
I agree with Kondamin on the whole of his first paragraph; I just think that he has somewhat exaggerated what would have happened to Radeon if AMD had not won console contracts. Yes, it would have hurt AMD, but I doubt they would have gone bust.

As for your point, Ogotai, I also agree with you in part, but it is these "specific Radeon features" that are precisely the kind of things Sony wanted AMD to modify (or wholesale replace small segments of the chip with 100% Sony design), and as I understand it Intel could not make that happen anywhere near as well as AMD/TSMC, or perhaps simply could not use some of their chip part designs (IP blocks) without a redesign.

It has been noted that several companies that took designs to "Intel Foundry" ended up going with TSMC, and I understand this is because TSMC was simply easier to work with. So there is some evidence to suggest that if working with TSMC is easier for a company that already has a design, it will, I assume, be even more difficult if you need to actually design parts of a chip collaboratively, as Sony, AMD, and TSMC have now done with the PS5 and PS5 Pro. The PS6 is obviously ongoing, but this is a baked-in contract, so fingers crossed the end result will still be Sony/AMD/TSMC!
 
Towards the end of 2022, they already had a die-shrink down to TSMC N6. This improved energy efficiency, which then allowed Sony to optimize the PSU & cooling solution, further reducing costs and shipping weight. Unfortunately, the first version to feature this SoC still had the original case, which is huge.

I waited until this version shipped, since I wanted a version that was cooler and quieter than the original (plus, availability was still poor until mid-2023). I actually bought my disc version when the price dropped to $450. They hadn't released the slimmed-down version, by that point, but I don't mind as I think it's a little uglier, given that I partly wanted the PS5 for its UHD disc player and that looks a bit like a tumor, bulging out of the slim case.

Speaking of its big & heavy design, I guess I'm most surprised how chunky and heavy the controllers are. Massive difference, compared to the PS3 controllers.
I had forgotten about the die shrink and the original huge, loud version (and, I get the feeling, short-supplied one; I keep hearing it was hard to get because of supply/demand), because I have only ever seen one (a PS fan who ordered months in advance).

As for the design, I have liked / loathed various console designs over the years, but that is another subject altogether (which I won't start a thread on), and one I largely agree with you about.

I am considering getting a game controller and (provided it actually works just fine on Windows / Linux) I am open to options. If you have one, what is your opinion of the (current) Xbox vs PS controllers, and which would you choose?
 
I wouldn't be surprised if AMD makes almost no profit on the PS6 chips. An article the other day said that AMD is focusing on volume over margin. AMD graphics is chiefly focused on getting developer support, and this is one way to do that.

The contract was made in 2022 though? The PS5 came out in 2020 and console generations are usually around 6 years. They seriously chose the chip 4 years in advance of release?
Look at how far back the work on the PS5 started. These things take a LONG time. Ryzen wasn't yet a shipping product, never mind Ryzen 2.

It isn't like decades ago when you might use an off the shelf chip set that was several years old. Even if you were doing a custom graphics co-processor, the transistor counts were in the mere thousands, as opposed to the tens of billions for current products. A chip might be the work of a single engineer, like POKEY in the Atari 8-bit computers and many arcade machines. The same guy, Doug Neubauer, also single-handed coded the classic Star Raiders. A Renaissance man.
 
I agree with Kondamin on the whole of his first paragraph; I just think that he has somewhat exaggerated what would have happened to Radeon if AMD had not won console contracts. Yes, it would have hurt AMD, but I doubt they would have gone bust.
The graphics division was working on a shoestring budget during the time Zen 1 & 2 were being developed; I find it not unlikely they would have just cut it and sold the graphics division off if they didn't have a product that needed it.
 
I suspect that Sony took some meetings to allow Intel to make their case but made sure AMD knew about it. Unless they offered an unbeatable price, Intel lacked the IP on the GPU side to make a competitive product for Sony's purposes. There would need to be another hardware partner, such as Nvidia or AMD, to fulfill that part of the equation.
 
I don't think this story is true at all. We learned our lesson on the original Xbox. Being in a game console is a crappy deal. You don't make any money at all.
The problem for Microsoft was that Intel and Nvidia WERE making money on the Xbox. Microsoft thought they would take a different approach to sourcing parts. This was part of the early idea that the Xbox would be a standard that many vendors like Dell and HP could offer, perhaps adding some additional feature to their SKUs. This would have been similar to the MSX computer standard, sold mostly in Japan. At its heart was the same chip set as the ColecoVision game console. Microsoft was involved in that multi-company initiative and memory of that was likely invoked in the early days when the Xbox was still just a concept.

Microsoft owned none of the chip sets or IP in the original Xbox hardware. It couldn't shop around for a better foundry deal or get a die shrink if the chip owners didn't see it as worth their while. It was a hard lesson. Microsoft has had ownership of the chip sets for all Xbox models since. The specific chips rather than the IP from hardware partners like IBM and ATI. (Microsoft does have an emulation license with Nvidia for the original Xbox.)
 
I am considering getting a game controller and (provided it actually works just fine on Windows / Linux) I am open to options. If you have one, what is your opinion of the (current) Xbox vs PS controllers, and which would you choose?
Sorry, I don't have any XBox, nor am I familiar with their controllers.

The PS5's controllers just seem so over-the-top (big, heavy, and laden with buttons & features) that I'd find it hard to recommend one, unless you really needed some of its features not found elsewhere. I liked the PS3 controllers, other than the fact that they were a bit small for my hands.
 
It isn't like decades ago when you might use an off the shelf chip set that was several years old.
This is exactly what Nintendo Switch did!

Even if you were doing a custom graphics co-processor, the transistor counts were in the mere thousands, as opposed to the tens of billions for current products.
The original Playstation's SoC came in at 1M transistors.

The original Intel 8086 CPU had 29k transistors.

A chip might be the work of a single engineer, like POKEY in the Atari 8-bit computers and many arcade machines. The same guy, Doug Neubauer, also single-handed coded the classic Star Raiders. A Renaissance man.
The Nintendo 64 devoted an entire building, on the SGI campus, to design its Reality Coprocessor (RCP), from what I recall. That was already just over 30 years ago.
 
If Sony changes the CPU to Arm, I think it will be easier to jailbreak.
The Xbox One and Xbox Series, with AMD designs, never got hit.
The Nintendo, using Nvidia, was defeated with a paper clip.

Maybe part of the reason to keep AMD is the fear of mods or jailbreaking.
It's not like there haven't been plenty of hardware security vulnerabilities found in Intel and AMD chips. I guess none of them have been relevant to the jailbreaking community so far. If we get a jailbreak, that's the day I buy a console to use as a nice little PC, preferably Series S since those are what, $200? Or more like $90 on ebay.
This whole article is a months-old MLiD leak...
This leak has been CLAIMED by Reuters Gang.
The "rumoured" upscaling tech used in the PS5 Pro was reportedly simply handed over to AMD for them to use.
I think I saw speculation like that but we don't know yet. If FSR4 is identical to PSSR, it would be pretty funny. It might even mean PSSR comes to Xbox depending on what RDNA architectures it supports.
 
I am considering getting a game controller and (provided it actually works just fine on Windows / Linux) I am open to options. If you have one, what is your opinion of the (current) Xbox vs PS controllers, and which would you choose?
Go Xbox. I have both for use with PC, and while the PS5 controller is definitely nice (and supports gyroscopic functionality, if you need that), it is expensive, heavy, and I actually consider the built-in lithium battery a con because it cannot be easily replaced. Rechargeable batteries in the Xbox controller are great for me, and I prefer its A/B/X/Y button feel compared to the PS5's buttons.
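If you want to sanity-check whether whichever pad you pick is actually recognized on Windows or Linux, here's a rough SDL2 sketch of my own (untested, and just one way to do it): both Xbox and DualSense controllers normally show up through the same game controller API.

```cpp
// Rough sketch (untested): list the game controllers the OS recognizes, via SDL2.
// Build on Linux: g++ pads.cpp $(sdl2-config --cflags --libs)
#include <SDL.h>
#include <cstdio>

int main(int argc, char* argv[]) {
    (void)argc; (void)argv;
    if (SDL_Init(SDL_INIT_GAMECONTROLLER) != 0) {
        printf("SDL init failed: %s\n", SDL_GetError());
        return 1;
    }
    int count = SDL_NumJoysticks();
    printf("%d joystick(s) detected\n", count);
    for (int i = 0; i < count; ++i) {
        if (SDL_IsGameController(i)) {  // has a standard gamepad mapping?
            SDL_GameController* pad = SDL_GameControllerOpen(i);
            if (pad) {
                printf("  #%d: %s\n", i, SDL_GameControllerName(pad));
                SDL_GameControllerClose(pad);
            }
        }
    }
    SDL_Quit();
    return 0;
}
```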
 
It isn't as widely known as it should be today, because a lot of people in the biz don't have experience that goes back decades...😉 Since I'm remembering back so long ago, I hope my memory is intact here. Anyway, the console "wars" are nothing new and are always open to any company wanting to bid. In fact, the first Xbox was made using Intel CPUs and Nvidia GPUs, IIRC, but then Nvidia did a no-no and decided to sue Microsoft. Microsoft paid it and then dropped Nvidia for good after that. Gates was still CEO at the time and apparently Nvidia pissed him off...😉 So Nvidia made a short-term profit but took a long-term loss. You can't blame Microsoft at all, as it felt--with some justification--that it could no longer trust Nvidia, and that was that. But ever since, for Xbox and for PS, the companies have been open for competitive bids, and from what I recall, both Intel and Nvidia and some others regularly bid on new console contracts and seem to always lose out to AMD--an extremely tough competitor.
The OG Xbox was a bit of a show, and there is a fun bit of history regarding AMD, Intel, and the original Xbox. Microsoft was originally going to go with AMD for the CPU; then, at the last minute, Bill Gates called up Intel and they ended up making a quick backroom deal. AMD had worked with Microsoft and designed the original Xbox hardware with them from the ground up. However, on the day of the Xbox reveal, Microsoft stated that they were using an Intel CPU. The AMD engineers who had worked on the project were all in the audience, and Microsoft told them nothing about the change before the event. The hardware they used for the demo was running an AMD CPU, and Microsoft just passed it off as an early prototype. It was a pretty big betrayal after all the hard work AMD had done on the project.
 
This is exactly what Nintendo Switch did!


The original Playstation's SoC came in at 1M transistors.

The original Intel 8086 CPU had 29k transistors.


The Nintendo 64 devoted an entire building, on the SGI campus, to design its Reality Coprocessor (RCP), from what I recall. That was already just over 30 years ago.
And the original Playstation SOC was several generations in. I'm talking about the first generations like the Ataris and Colecos. The chip set in the Colecovision was entirely existing off the shelf parts and the transistor counts then were in the thousands. Yes, the 8086 was a higher count and it was also very expensive for an application like a game console in the time frame when it was considered a flagship product.

Yes, Nintendo disdained continuing the expensive merry-go-round of custom chip sets that wasn't delivering the wins they wanted. Their biggest success before the Switch, the Wii, was an upgraded Gamecube chip set. This kept the cost down and let them focus on creativity on a platform with a low cost of entry. They forgot the cost-of-entry part when it came to the Wii U because they'd been distracted by the success of the iPad. If they had shipped the Wii U as a Wii HD using the existing Wiimote and Pro controllers for $200-250 and offered the GamePad as an accessory, they'd likely have seen far more success with this approach. The Switch was an instance of Nintendo remembering what they're good at. Competing in tech horsepower arms races has never gone well for them.
 
Nintendo disdained continuing the expensive merry-go-round of custom chip sets that wasn't delivering the wins they wanted. Their biggest success before the Switch, the Wii, was an upgraded Gamecube chip set. This kept the cost down and let them focus on creativity on a platform with a low cost of entry. They forgot the cost-of-entry part when it came to the Wii U because they'd been distracted by the success of the iPad. If they had shipped the Wii U as a Wii HD using the existing Wiimote and Pro controllers for $200-250 and offered the GamePad as an accessory, they'd likely have seen far more success with this approach. The Switch was an instance of Nintendo remembering what they're good at. Competing in tech horsepower arms races has never gone well for them.
Nintendo seemed to do extremely well with the N64, which as I said before was a massive custom design effort.

Likewise, Gamecube used another custom-designed chip, this time by a company called ArtX (which ATI would later acquire). As you say, Wii used essentially an overclocked version of that chip, meaning Wii naturally retained backward compatibility with Gamecube games. Certainly, if you take Gamecube and Wii together, Nintendo did extremely well on that custom chip, as well!

As far as I can tell, Nintendo did pretty well at competing in the horsepower arms race, for about as long as they stayed in it. Once they saw how well Wii did, with substantially worse specs than PS3 or XBox 360, that's probably when they decided they could do fine with an alternate strategy that focused less on eye candy and more on games with broad appeal.
 
Nintendo seemed to do extremely well with the N64, which as I said before was a massive custom design effort.

Likewise, Gamecube used another custom-designed chip, this time by a company called ArtX (which ATI would later acquire). As you say, Wii used essentially an overclocked version of that chip, meaning Wii naturally retained backward compatibility with Gamecube games. Certainly, if you take Gamecube and Wii together, Nintendo did extremely well on that custom chip, as well!

As far as I can tell, Nintendo did pretty well at competing in the horsepower arms race, for about as long as they stayed in it. Once they saw how well Wii did, with substantially worse specs than PS3 or XBox 360, that's probably when they decided they could do fine with an alternate strategy that focused less on eye candy and more on games with broad appeal.
It's always shocking to me how hard we tend to push for graphics, graphics, graphics. I'm as guilty as any, I know. But the Switch has a freaking ShieldTV processor, which wasn't even the fastest thing around when it came out in 2014 or whatever. I mean, it's a refined Shield Tablet chip basically. But if you target that chip specifically and design your games around it? Yeah, you can make a pretty sweet device, as the Switch proves.

Incidentally, I'm not a console gamer — never have been. But a friend gave us a Wii with a bunch of games, and my family still plays Mario Kart Wii on it on a regular basis. The only real issue is that 3- and 4-player mode (with four different camera views being rendered) drops the framerate to 30 fps. It's very noticeable. Going from 2-player or single-player to four people is painful. And the Switch is fast enough to not have this problem. Anyway, my main point was to say that great game design trumps hardware every time.

The thing with Sony and MS is that they're basically selling very similar hardware, so the games and core specs are the only thing to differentiate them. Going after the motion controllers with Wii and now the joycons on the Switch was a brilliant move by Nintendo.
 
a friend gave us a Wii with a bunch of games, and my family still plays Mario Kart Wii on it on a regular basis.
Yeah, but wasn't it limited to analog output at something like 480p resolution? And that's if you even have the component video cable for it. That'd be my main beef with it, and it's one reason I got a 1st gen PS3: upscaling + HDMI output for PS1 and PS2 games.

The PS3 had some very good looking games on it. Towards the end of its run, especially. I was impressed with how detailed and fluid Wipeout HD was, locking in 1080p @ 60 fps. The last installment of Mortal Kombat for it was also gorgeous (if you could say that about such a gory game). There were a few others for it that were quite impressive, but most of them only ran at 720p.

The thing with Sony and MS is that they're basically selling very similar hardware, so the games and core specs are the only thing to differentiate them. Going after the motion controllers with Wii and now the joycons on the Switch was a brilliant move by Nintendo.
I got a PS5 last year, having skipped the PS4 generation. I don't use it for much other than streaming (which is the main reason I got it + UHD bluray player), but I don't really see a night-and-day difference between some of the PS4 games I have for it vs. some of the PS5 exclusives.
 
Yeah, but wasn't it limited to analog output at something like 480p resolution? And that's if you even have the component video cable for it. That'd be my main beef with it, and it's one reason I got a 1st gen PS3: upscaling + HDMI output for PS1 and PS2 games.
Yeah, it is... and I had to buy some no-name Wii to HDMI adapter. But it works, so far for maybe five years. LOL.
 
Nintendo seemed to do extremely well with the N64, which as I said before was a massive custom design effort.

Likewise, Gamecube used another custom-designed chip, this time by a company called ArtX (which ATI would later acquire). As you say, Wii used essentially an overclocked version of that chip, meaning Wii naturally retained backward compatibility with Gamecube games. Certainly, if you take Gamecube and Wii together, Nintendo did extremely well on that custom chip, as well!

As far as I can tell, Nintendo did pretty well at competing in the horsepower arms race, for about as long as they stayed in it. Once they saw how well Wii did, with substantially worse specs than PS3 or XBox 360, that's probably when they decided they could do fine with an alternate strategy that focused less on eye candy and more on games with broad appeal.
One of the reasons the Wii was just a modified Gamecube is that the motion controller started out as a Gamecube accessory. But the Gamecube got stomped badly by Sony in its generation. First and second party games did well, but third party was mostly a slaughter, which was also the case on the N64. Both generations also trailed in installed base, which limited the potential of the strong first and second party titles. Not as painfully as on the Wii U, but it was readily noticeable. Nintendo has become accustomed to making immense profits off third party with much less expensive hardware development costs. The Wii was where they chose to get off that track and focus on creativity.

It should also be noted that the Gamecube design was a reversal from the Ultra 64 project: custom, but with a focus on cost. Thus the Gamecube launched at an MSRP a third lower than the PS2 and Xbox and was able to go lower still, until their hardware sales stopped justifying further die shrinks for that platform within the generation, as opposed to the Wii.

ArtX was a company I first encountered at Comdex when they were pitching a PC chip set with IGA that included T&L, which was still considered a high end feature for discrete graphics.
 