alextheblue
Distinguished
[citation][nom]A Bad Day[/nom]From what I researched, there were no mentions of a separate GPU. Besides, it would make sense to have a single GPU than a more cumbersome dual GPU system (think micro-stuttering).[/citation]Well, I also haven't seen any mention of an external GPU, but that doesn't mean it doesn't have two iGPUs. After all, I haven't seen any 8-core Jaguar designs announced. It's probably two custom Jaguar+GCN APUs working together.
With that being said, your assumption that two GPUs in a console would mean microstuttering is way off base. Microstuttering on PC comes from how multiple GPUs have to be used to render there - alternating whole frames between cards (AFR), for compatibility purposes - which makes frames come out at uneven intervals. On a console you're not going to be using traditional SLI or Crossfire, except maybe as a fallback for lazy devs. Even then, there's a lot they could do to minimize or eliminate microstuttering on fixed hardware like a console, such as pacing frame submission. You could even change how you render entirely - the two GPUs could render independently, each handling different workloads.
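To show what I mean about AFR being the culprit, here's a toy simulation (made-up timing numbers, not real hardware) of two GPUs alternating frames. When the CPU submits frames back-to-back, you get the classic 3 ms / 27 ms "heartbeat" intervals; evenly pacing the submissions on the same two GPUs smooths it out completely:

```python
# Toy model of AFR micro-stuttering. All numbers are hypothetical.
GPU_FRAME_MS = 30.0   # assume each GPU takes 30 ms to render one frame
SUBMIT_GAP_MS = 3.0   # CPU submits consecutive frames only 3 ms apart

def present_times(pacing_gap_ms, n_frames=6):
    """Alternate frames between two GPUs (AFR). Each frame starts once
    it has been submitted AND its GPU is free."""
    gpu_free = [0.0, 0.0]   # time each GPU becomes idle
    submit = 0.0            # submission time of the current frame
    times = []
    for i in range(n_frames):
        g = i % 2                          # even frames -> GPU 0, odd -> GPU 1
        start = max(submit, gpu_free[g])
        done = start + GPU_FRAME_MS
        gpu_free[g] = done
        times.append(done)
        submit += pacing_gap_ms
    return times

def intervals(ts):
    """Gaps between consecutive frame completions, in ms."""
    return [round(b - a, 1) for a, b in zip(ts, ts[1:])]

# Unpaced AFR: frames arrive in uneven bursts -> micro-stutter.
print(intervals(present_times(SUBMIT_GAP_MS)))     # [3.0, 27.0, 3.0, 27.0, 3.0]
# Paced submission (one frame every GPU_FRAME_MS/2): perfectly even.
print(intervals(present_times(GPU_FRAME_MS / 2)))  # [15.0, 15.0, 15.0, 15.0, 15.0]
```

Same hardware, same throughput - only the submission timing changed. On a fixed console platform that kind of pacing can be baked in, which is exactly why the PC microstuttering argument doesn't carry over.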
Basically, having two GPUs on a console means more bang for the buck, without the drawbacks of having SLI or Crossfire. Especially if you're putting two APUs together, for twice the CPU and GPU punch, while still maintaining a low cost and thermal envelope! I think this is the way to go to keep costs down this generation, and I wouldn't be surprised if MS did something very similar. Jaguar CPUs have better IPC than their predecessors, and the consoles get carefully tuned and tailored software, with lower overhead. So performance should be comparable to a reasonably high-end system.
The lower costs could also mean a more reasonable lifespan (4-5 years?), and using more-or-less standard AMD tech means the successor to these consoles likely won't have backwards compatibility woes.