Read it: http://www.amd.com/us-en/Corporate/VirtualPressRoom/0,,51_104_543~103048,00.html
Maybe that was the reason they were delaying 65nm till next year. :lol:
5GHz with diminishing returns! On paper everything is perfect. If perfection were only required on paper, Intel's P4s would excel beyond 5GHz.
No, but their new process is looking very good right from the beginning.
In fact, Intel was looking up and up until they switched to the 90nm process. So maybe their suckiness was only for that process.
That is funny. A Prescott on any process is still a big leaker. Well, maybe not on FD-SOI, but even then, at 65 nanos it would still leak badly. Just too many interconnects, with too many dissimilar charges next to each other.
Intel still has the advantage. They've already migrated over to 65nm in Oregon and I believe they've started in Dresden.

No flame - just a correction - Intel has no FAB (or any other presence that I see listed) in Dresden. That's that other company... And we've got at least 3 (or is it 4) FABs ramped, or ramping, on the 65nm 300mm process.
If AMD is already 65nm capable, what's stopping them from switching over?

Aside from what you mentioned, Fab 39 is not their production facility yet. That task still rests with its next-door neighbour.
I remember when Prescott was supposed to be king shit, and Spud at the time had listed an entire page of improvements off the top of his head (he had really high hopes). Then a few days before release the pipeline-increase info was leaked, and the final product turned out to actually be slower overall, and especially bad in games. :lol:
Yes, the possibilities are endless, but unfortunately money and time are not, and game developers need to get this stuff out the door on time and on budget. Most serious gamers are AMD users anyway, so why bother?

http://www.extremetech.com/article2/0,1697,1895945,00.asp
It seems that game developers, in a rush to get the game out the door, usually fail to optimize their code for the latest instruction sets. Many games are only optimized for the Pentium III generation, which means only SSE support. This means the code penalizes the Pentium IV by not taking into account its higher latencies or its support for SSE2 and SSE3. Now people may feel that optimizing code for the Pentium IV would penalize AMD, but that may not be the case.
"This is unlikely to penalize AMD specifically, though unrolling loops and other P4-specific operations might possibly penalize the Athlon 64, but it's hard to know without actually trying it. But using SSE/SSE2 shouldn't adversely affect AMD. Even Fred Weber, AMD's former chief technology officer, acknowledged that SIMD was the way to go with floating point as we move into the future."
It seems that AMD has no problems with game developers optimizing code for the Pentium IV generation as AMD processors likewise support SSE, SSE2, and SSE3.
What's even more interesting is that in many cases, game developers don't even enable SSE support, which even AMD recommends. They use only x87 FPU code, which runs slower on the Pentium IV.
If game developers spent a bit more time optimizing their code for SSE, SSE2, and SSE3, as AMD's Weber suggests, Intel's processors would see better performance in games. It probably wouldn't be enough to dethrone AMD at the very top, but it offers a free performance improvement to everyone, AMD and Intel users alike, by making full use of the processor.