"I'm not convinced Zen 2 is the bottleneck for console games right now, and people don't seem too excited about graphics at the moment."

It absolutely is. Here's an example I saw today, but it's hardly the only one:
https://www.eurogamer.net/digitalfo...impresses-on-xbox-series-x-s-but-flags-on-ps5

"In testing Space Marine 2 on PC, it's clear that the game's encounters with alien swarms push the CPU hard - and at least as of patch 1.02, that results in lop-sided performance on consoles in the 60fps speed mode. Series X typically outperforms PS5 by around 10 to 15fps, despite both machines using the same settings and resolution targets. The lowest drops come when taking a flamethrower to the encroaching horde, where PS5 drops to the mid-30s while Xbox Series X is in the mid-40s."
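Those framerate numbers are easier to reason about as frame times. A quick sketch (Python, using 35/45/60 fps as rounded stand-ins for the mid-30s, mid-40s and target figures quoted above, so the exact values are assumptions) shows how large the gap really is in CPU terms:

```python
# Frame rate to frame time: how many milliseconds of CPU/GPU work each frame
# is allowed to take at a given fps. Values below are rounded stand-ins for
# the figures quoted from the Digital Foundry piece, not measured data.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available per frame at a given framerate."""
    return 1000.0 / fps

for label, fps in [("PS5 worst case (mid-30s)", 35),
                   ("Series X worst case (mid-40s)", 45),
                   ("60fps performance target", 60)]:
    print(f"{label:30s} {fps:3d} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# ~28.6 ms vs ~22.2 ms vs ~16.7 ms: the PS5 is spending roughly 6 ms more per
# frame than Series X in the same scene, and nearly 12 ms more than the 60fps
# budget, which is why a CPU-heavy swarm scene exposes the difference so clearly.
```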
The biggest problem this generation is that Microsoft's design absolutely couldn't leverage the performance advantage it should have had against the PS5 to any meaningful degree. The Series X should have been faster in pretty much every game, but the overall design seemingly hampered what it could do, so it typically ended up at parity with the PS5. This is similar to the problem Intel's first-generation Arc cards have had: their performance is fine, but the silicon size versus performance is huge compared to AMD/Nvidia. The only other time Microsoft has had the fastest console was the One X, which seemed to do relatively well for them, though that generation was already done.

V-Cache could maybe be a good reason to stick with AMD, and AMD would be likely if Microsoft wants to wait a couple more years and have a repeat of this generation, presumably with even bigger-er, more power-consuming, and more expensive-er consoles.
...
Microsoft should be realizing by now that their strategy of having the most powerful console has never come remotely close to working.
Gimmicks instead of games is what murdered the Xbox One in the first place, because gimmicks aren't cheap. If Microsoft tries to put out a console that's no better than what they already have and only has a bunch of gimmicks, the hardware side of Xbox is dead and buried. The Xbox One/PS4 era ruined the hardware side for Microsoft, which in turn did the same with the current generation (without any must-have software you have to win by offering a better experience, which they didn't have). If they want to gain back market share it has to be by providing something people can't get from the competition, and gimmicks alone aren't that. They could probably get away with some sort of handheld, but it's absolutely impossible to do a regular console without bringing real improvements.

If the next Xbox is going to break generations, then I'm not envisioning a flagship. The Xbox Series X is selling like a GameCube; Microsoft needs a Wii. Make a smaller, cheaper, "cute", colorful console that casts a wide net and brings the fun. Simplify to one model at launch. Ditch the current ad-first interface. They should also strongly consider letting people play online for free.
Throw in all the gimmicks. Maybe do Kinect again, which will hopefully benefit a lot from AI, or anything else new they have in the works. Maybe a VR-chat-type thing for their avatars.

Maybe make and integrate a Twitch-like platform so kids can buy JPEGs with real money and run baby's first livestream with overpriced first-party accessories. Whatever it takes to look fresh. Doesn't matter.

Get it on the market first, start accumulating some wins, and holy cow, get some system-selling games on the thing. The "flagship" can wait a couple years and be the Pro/Elite/X model - but only if absolutely necessary.
I would say that Nintendo for the first time has finally embraced third-party software and designed a storefront that matches that goal. I've long been a Nintendo fan; the N64 was my first system and I have purchased every one of their follow-ups, including portables. This is the first time that Nintendo has shown a desire to make it easy to showcase third-party software. The eShop on the Switch has made a world of difference for Nintendo and has turned third-party software sales into the Golden Goose for Ninty.

The Switch has been a knockout, but that's been solely driven by Nintendo IP.
Such as Vampire Survivors!

$700 for a console that requires subscription fees to play online and has a catalog mostly consisting of AAA games seems like a huge waste of money to me.
I do own a PS5 disc version. It gets used a lot for 4K Blu-rays and somewhat for a few single player games. To have the disc add-on as an additional cost seems absolutely ridiculous to me. Graphics and FPS are nice, but the explosive growth of indie and retro games should remind everyone that visuals aren't everything.
Not failing to deliver, but rather not being the core target of PC game developers when the installed base is a tiny fraction of the market. Devs will seek to make their products utilize the high end, but not to the exclusion of the other market segments, and not just for the current generation but for recent generations as well.

I think you misunderstood my meaning: games built with the Pro spec in mind will also have to run on the PS5, but at a lower spec, and I see it happening. As for your conjecture about the future, we will have to see how things turn out. The 4090 sold like hot cakes though, so I'm not sure why you are bringing that to the conversation as evidence of something failing to deliver.
Going to a more recent Zen version is pretty much a given. It comes down to what process node they want to reserve production slots on at TSMC. AMD will have a Zen version that already lives on it, along with an RDNA version. Since architecture generations don't get die shrinks for the most part anymore, the architecture dictates the process node and vice versa.

The current CPU cores are Zen 2, which has become a massive problem the longer the generation goes on. Zen 4c/5c are definitely an obvious path forward on the x86 side of things, as they're faster, more efficient and have great density. It's also possible that AMD could do a shared V-Cache for an APU, which would cover both the CPU and GPU. I'm not sure whether or not that would end up being cheaper or more efficient than a wider memory bus, though.
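The cache-versus-wider-bus question in that last sentence is really a bandwidth question, so here is a minimal back-of-the-envelope sketch. The 256-bit/14Gbps figures match the launch PS5's memory setup; the wider bus and the cache hit rate are purely hypothetical numbers for illustration:

```python
# Peak GDDR6 bandwidth scales linearly with bus width, while a large on-die
# cache mainly keeps traffic off the bus. Crude model, illustrative numbers only.

def gddr6_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * gbps_per_pin

def effective_bandwidth(raw_gbs: float, extra_hit_rate: float) -> float:
    """Very rough: requests served from cache never reach DRAM, so the same
    DRAM bandwidth covers 1 / (1 - hit_rate) times as much demand."""
    return raw_gbs / (1.0 - extra_hit_rate)

base = gddr6_bandwidth_gbs(256, 14.0)    # launch PS5: 256-bit @ 14 Gbps = 448 GB/s
wider = gddr6_bandwidth_gbs(320, 14.0)   # hypothetical wider 320-bit bus
cached = effective_bandwidth(base, 0.30) # hypothetical 30% of traffic absorbed by cache

print(f"256-bit bus:            {base:.0f} GB/s")
print(f"320-bit bus:            {wider:.0f} GB/s")
print(f"256-bit + big cache:    ~{cached:.0f} GB/s effective (assumed 30% absorption)")
```

Both routes land in the same ballpark of headline numbers here; the real trade-off is die area and packaging cost for the cache versus board complexity and more memory chips for the wider bus, which is exactly the part nobody outside AMD and Sony/Microsoft can price out.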
The only chance of Intel getting back into the console game would be if they were doing all of the manufacturing, and I'm not sure there's a timeline which would work.
Arm is certainly a possibility, but you run into a graphics IP problem pretty rapidly. I'm not sure AMD, nvidia or Intel would license their IP if they didn't get something else out of it.
Neither one has a custom Arm SoC architecture, and the standard Arm cores are not competitive (so far) with current x86 cores when it comes to largely single-threaded workloads like gaming. Apple and Qualcomm are the only two companies with Arm designs that can compete on this level, and Apple is obviously not on the table here.

AMD and Nvidia already have ARM licenses.
...
Nvidia, of course, has been using ARM as the CPU for their SoCs for many years, most famously in the Nintendo Switch and its anticipated successor. It remains to be seen if they can deliver an ARM implementation competitive with what Apple and Qualcomm are doing, but if Microsoft or Sony came knocking, they'd surely try to make a case for being able to deliver.
This is also ignoring that AMD is a business, and if they were to license the GPU IP it would have to be at a rate which makes financial sense to them. Using multiple companies to build a gaming SoC is asking for trouble, which is a driver behind why we saw both Sony and Microsoft go single source for the PS4/Xbox One and beyond. Neither company wants to take a loss on selling hardware anymore, which makes it even harder to partner with multiple companies. Microsoft could potentially be developing an in-house Arm SoC which could very well be applicable to this use case, but nobody really knows for sure.

Also, AMD has already demonstrated a willingness to place their GPU IP on another company's ARM SoC, as they've done with Samsung. ARM is also used for subsystems within AMD products; it's only visible to those coding system drivers, and end users need never give it any thought. AMD, and the portion formerly known as ATI, have worked with Qualcomm in the past, eventually selling them the ATI mobile products division. This is how the first Snapdragon SoC had a GPU derived from the one originally created for the Xbox 360. Qualcomm's Adreno GPU line is alright for its intended market, but for an ARM-based game console it could work well for Qualcomm to partner with AMD to have the best combination of CPU and GPU.
Of course most gamers will be on cheaper GPUs and the majority of cars will be the affordable ones; you are not spilling some secret wisdom here.

Not failing to deliver, but rather not being the core target of PC game developers when the installed base is a tiny fraction of the market. Devs will seek to make their products utilize the high end, but not to the exclusion of the other market segments, and not just for the current generation but for recent generations as well.
The 4090 sold well for a premium product, but just as far more cars on the freeway are going to be Honda Civics and Toyota Corollas than $100K Mercedes-Benzes and similar models, the 4090 isn't going to sustain a publisher by itself. The most recent Steam survey shows the 3060 as the most common discrete GPU. Not even the 3090, but the 3060. Devs will always want more processing power, but their continued employment hinges on serving the broadest market.
It's a mystery simply because things change over time and Sony isn't saying what node it's using. Yes, Zen 2 was made for N7 and later ported to N6. Does that mean that the PS5 Pro with a new GPU has to stick with N6? Most definitely not. It could, but it just as easily might not. It also doesn't absolutely have to be a monolithic chip.

I don't know why the authors are acting like the process node is some great mystery. It is 6NP in its entirety, or at least a large portion of it is. We already know Sony ported the PS5 APU to 6NP late in the PS5's production and then used the newer APU version in the Slim. At the same point in the PS4 and Xbox One life cycles, the engineers assessed how much ceiling for more transistors they had. Adding CUs was the obvious first choice, then making tweaks that offered improvement while maintaining compatibility.
6NP is the default here, as it exists to offer designers of 7NP devices a means of refinement while staying closely compatible with the older process. While the use of chiplets opens up the possibility of mixing process nodes on the same package, it still has its own problems. Taking the whole APU to 5NP or denser is very unlikely, as the cost would be prohibitive. AMD has zero interest in porting Zen 2 or RDNA 2 to those nodes for its own products, so all of the expense would fall on Sony. (Microsoft too, if they went this route, but it appears they aren't bothering with a mid-gen upgrade this generation.) When Sony and Microsoft wanted die shrinks in the previous generation, AMD had already done the work on their IP for their own use. Thus the cost for Sony and Microsoft was fairly low by comparison.
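For a sense of scale on that "how much ceiling for more transistors" question, here is some napkin math. The ~18% logic-density gain is TSMC's own headline figure for N6 over N7, and the die sizes are approximate numbers reported for the two PS5 APU revisions, so treat everything as a rough estimate rather than anything official:

```python
# Napkin math: what moving the same design from N7 to 6nm buys you, either as
# a smaller (cheaper) die or as reclaimed area for more CUs, cache and RT logic.
# All figures are approximations for illustration, not official specs.

N7_DIE_MM2 = 308.0        # launch PS5 APU on N7 (approx., as reported)
N6_DIE_MM2 = 260.0        # revised 6nm PS5 APU (approx., as reported)
DENSITY_GAIN = 1.18       # TSMC's quoted logic-density improvement, N6 vs N7

# Option 1: shrink the same design and pocket the wafer cost savings.
ideal_shrink = N7_DIE_MM2 / DENSITY_GAIN
print(f"Same design shrunk: ~{ideal_shrink:.0f} mm^2 (reported ~{N6_DIE_MM2:.0f} mm^2)")

# Option 2: keep the original area budget and spend the reclaimed space on
# extra CUs or cache instead, which is the Pro-style play described above.
reclaimed = N7_DIE_MM2 - ideal_shrink
print(f"Or keep ~{N7_DIE_MM2:.0f} mm^2 and free up ~{reclaimed:.0f} mm^2 "
      f"({reclaimed / N7_DIE_MM2:.0%}) for additional logic")
```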
Ergo why I said 'a large portion'. AMD has zero interest in putting Zen 2 on anything denser than 6NP and Sony wouldn't want to shoulder the cost on its own, so I'm fairly confident that little has changed on the CPU side from the PS5 Slim APU. RDNA 3 has an N6 implementation, so borrowing from that to upgrade the GPU wouldn't be too painful or expensive.

It's a mystery simply because things change over time and Sony isn't saying what node it's using. Yes, Zen 2 was made for N7 and later ported to N6. Does that mean that the PS5 Pro with a new GPU has to stick with N6? Most definitely not. It could, but it just as easily might not. It also doesn't absolutely have to be a monolithic chip.
Look at what AMD has done in the past few years. We've had CPUs using chiplets for PCs since Zen 2. Take everything except for the GPU and put it on one chiplet, then put the GPU on a different chiplet. Would that be more difficult or expensive? It's a bit of both I think. RDNA3 pioneered GPU chiplets as well (though only for Navi 31/32).
What we do know is that the PS5 uses RDNA2, which was also designed for N7. However, all the stuff about AI and two to three times the ray tracing performance suggests this isn't just standard RDNA2. So if AMD is building a new GPU design that takes elements from RDNA3 (or something even newer), well, it can again go either way.
Bottom line: Until Sony (or AMD) states definitively what node is used, we can't just assume it will be the same as the PS5 or PS5 Slim chips.