CPUs never bottleneck a GPU. Sure, either the CPU or the GPU can bottleneck a game's performance, but the CPU does NOT bottleneck the GPU. There is no direct "bottlenecking" relationship between a CPU and a GPU. This is a myth created by people who just repeat what they hear (sheeple).
Really, when it comes down to it... it's all about [shitty] programming in the SOFTWARE.
Take EverQuest II, for example. Despite its intense graphics, it's primarily CPU-driven and largely ignores the GPU.
Add a second graphics card (SLI/CrossFire) and you will actually LOSE performance. That's a big deal in 2009.
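To put "CPU-driven" in concrete terms, here's a minimal toy C++ sketch of how you can see where a frame's time actually goes: time the game-logic half of the frame against the render/present half and see which one dominates. The update_simulation() and render_and_present() functions are hypothetical stand-ins (just sleeps here), not anything taken from EQ2's code.

```cpp
// Toy sketch: measuring where a frame's time actually goes.
// update_simulation() and render_and_present() are hypothetical stand-ins
// for a real game's CPU-side logic and its draw submission / present.
#include <chrono>
#include <cstdio>
#include <thread>

// Pretend CPU-side work (game logic, AI, physics).
static void update_simulation() {
    std::this_thread::sleep_for(std::chrono::milliseconds(12));
}

// Pretend render submission plus waiting on the GPU/present.
static void render_and_present() {
    std::this_thread::sleep_for(std::chrono::milliseconds(4));
}

int main() {
    using clk = std::chrono::steady_clock;
    for (int frame = 0; frame < 5; ++frame) {
        auto t0 = clk::now();
        update_simulation();
        auto t1 = clk::now();
        render_and_present();
        auto t2 = clk::now();

        double sim_ms    = std::chrono::duration<double, std::milli>(t1 - t0).count();
        double render_ms = std::chrono::duration<double, std::milli>(t2 - t1).count();
        std::printf("frame %d: simulation %.1f ms, render/present %.1f ms -> limited by %s\n",
                    frame, sim_ms, render_ms,
                    sim_ms > render_ms ? "game code" : "rendering");
    }
}
```

If the simulation half dominates frame after frame, a faster (or second) graphics card changes nothing, which is exactly the point.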
Take SupCom, for example. It's a wonderful game with fine performance at normal scale, but the game's actual scope goes far beyond any CPU's capacity to handle all the calculations, because nothing in the code diverts any of that work to the very powerful and often under-utilized GPU. This isn't that big of a deal, but the developers could have put a limit on the game, or kept performance at a realistic level at [x] amount of units/calculations.
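One way to "put a limit on the game" without hard-capping the unit count is to budget how much simulation work each frame is allowed to do and pick up where you left off next frame. This is a minimal sketch of that idea, assuming a hypothetical Unit type with a cheap update(); it is not how SupCom actually schedules its simulation.

```cpp
// Toy sketch of time-slicing unit updates: instead of touching every unit
// every frame, spend at most a fixed budget and resume from the same spot
// on the next frame. Unit and its update() cost are made-up stand-ins.
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <vector>

struct Unit {
    float x = 0.0f, y = 0.0f;
    void update() { x += 0.1f; y += 0.1f; } // pretend pathfinding/AI work
};

// Update as many units as fit in `budget`, starting at `cursor`;
// returns the index to resume from on the next frame.
std::size_t update_with_budget(std::vector<Unit>& units, std::size_t cursor,
                               std::chrono::microseconds budget) {
    using clk = std::chrono::steady_clock;
    const auto deadline = clk::now() + budget;
    std::size_t done = 0;
    while (done < units.size() && clk::now() < deadline) {
        units[cursor].update();
        cursor = (cursor + 1) % units.size();
        ++done;
    }
    return cursor;
}

int main() {
    std::vector<Unit> units(50000);            // "epic scale" unit count
    std::size_t cursor = 0;
    for (int frame = 0; frame < 10; ++frame) { // fake game loop
        cursor = update_with_budget(units, cursor, std::chrono::microseconds(2000));
    }
    std::printf("resumed at unit %zu\n", cursor);
}
```

The per-frame cost stays roughly constant no matter how many units the player spams; what degrades gracefully is how often each individual unit gets updated.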
When programming their software, video game developers need to take performance into consideration. If you want to make a game that can go from a small skirmish to epic scale, you need to design for that range of performance from the start. If you want to make a video game at all, you need to utilize modern technology instead of ignoring it.
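And "utilize modern technology instead of ignoring it" can start with something as simple as using all the cores you already paid for. Here is a minimal sketch of splitting per-unit updates across hardware threads with std::async; the Unit type is again a made-up stand-in, and a real engine would use a proper job system (or GPU compute) rather than spawning tasks like this every frame.

```cpp
// Toy sketch of spreading per-unit work across all available cores with
// std::async, instead of leaving them idle. Unit is a hypothetical stand-in.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <future>
#include <thread>
#include <vector>

struct Unit {
    float x = 0.0f;
    void update() { x += 0.1f; } // pretend per-unit simulation work
};

void update_all_parallel(std::vector<Unit>& units) {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (units.size() + workers - 1) / workers;

    std::vector<std::future<void>> jobs;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end   = std::min(units.size(), begin + chunk);
        if (begin >= end) break;
        // Each worker updates an independent slice, so no locking is needed.
        jobs.push_back(std::async(std::launch::async, [&units, begin, end] {
            for (std::size_t i = begin; i < end; ++i) units[i].update();
        }));
    }
    for (auto& j : jobs) j.get(); // wait for every slice before the next frame
}

int main() {
    std::vector<Unit> units(100000);
    update_all_parallel(units);
    std::printf("updated %zu units on %u hardware threads\n",
                units.size(), std::thread::hardware_concurrency());
}
```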
I say "Bottleneck is a Myth" to reiterate the simple fact that the problem is rarely the hardware, but the lack of foresight (or too much foresight to "future proof" a game) in the actual Software.
As anyone can see, there is software made with absolutely wonderful performance, and software made with horrid performance.
There are video games that look stunning and run smoothly on crappy systems, and video games that run horribly on even the best systems while still having shitty graphics.
The bottleneck is in the software, infinitely more than the hardware, so grab your pitchforks and torches and tell your game developers (actually... their producers and advertisers $$$$$$$$...) to get some better performance out of your games.