matt_b :
What happened to the days when the technology sector was all headed toward efficiency? More powerful, more efficient, smaller in size, less heat - where did the lower energy requirements go?
Mostly, it's an unfortunate side-effect of how fiercely the companies compete. Moore's law means hardware makers have to wait on each new fabrication process before they can cram in more transistors, so in the meantime the only other way to scale up performance and beat the competitor is to jack up the clock speed. And while smaller fabrication processes CAN cut power consumption, those savings haven't kept pace with the extra power the higher clocks demand. As a result, chips have gotten hotter over time; you can see it over just the past decade, going from passive cooling being sufficient for both mainstream CPUs and GPUs to requiring massive active coolers. (And water cooling has gone from being a very 'underground' mod to being commonplace for those who can afford it.)
Also, the prospect of needing to upgrade the household circuitry to run a PC makes me flinch just thinking about it.
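As a back-of-the-envelope illustration of why clock speed is the expensive knob: dynamic switching power scales roughly as P = C * V^2 * f, and higher clocks usually need extra voltage on top, so the cost compounds. The sketch below uses made-up capacitance and voltage figures purely for illustration, not real chip specs:

```python
# Rough sketch of the classic switching-power approximation P = C * V^2 * f.
# The effective capacitance, voltages, and clocks below are invented numbers,
# just to show how quickly power climbs when you chase frequency.

def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """Dynamic (switching) power in watts: P = C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

baseline = dynamic_power(30e-9, 1.20, 2.0e9)   # hypothetical 2 GHz part
hot_clock = dynamic_power(30e-9, 1.35, 3.0e9)  # 3 GHz, needing a voltage bump

print(f"baseline:  {baseline:.0f} W")
print(f"hot clock: {hot_clock:.0f} W ({hot_clock / baseline:.1f}x the power)")
```

With those made-up numbers, a 50% clock increase plus a modest voltage bump roughly doubles the dynamic power, which is why the extra heat shows up much faster than process shrinks can absorb it.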
siuol11 :
When Nvidia "ramps up production"? News flash: they ramped up a while ago. There are no cards because the yields are low.
No, they mean once they buy out and refit all the fabs in the world to make Fermi.
😀
ohim :
I truly wish this PhysX thing ends up like Creative's EAX, implemented even in onboard sound cards.
Actually, at this rate, I think that PhysX will simply wind up not being used by developers; last I knew, Havok held the dominant market share, particularly owing to better cross-platform compatibility, since it runs equally well on both ATI and nVidia cards.
antilycus :
NVIDIA's chipset is brand new. NVDA has the experience, expertise, and intelligence to make the best chipsets available. All AMD/ATI does is copy what NVIDIA does and put more crap in it, like pipelines, faster RAM, faster shaders, etc. I have yet to have one machine (out of the roughly 35 here at work) that runs correctly with an ATI display driver... I have been around long enough, and have enough experience with 3D accelerators/GPUs, to know that NVIDIA is top of the line. The rest are just playing house.
Well, it's also worth noting that ATI is about eight years older than nVidia. They don't exactly "copy"; they've invented plenty of things on their own, and they've been plenty aggressive at times about adopting things before nVidia, such as DirectX 10.1 and, later, DirectX 11 support. Similarly, they've introduced features of their own, such as Eyefinity.
It's unfortunate to hear of anyone having that much trouble getting their computers to work... though I would note that just because you had those issues doesn't mean everyone else does. Personally, I've run into about equal numbers of problems with ATI and nVidia drivers.