ovaltineplease :
While it was entertaining to read your essay (I hate to say it, but I couldn't be bothered; not trying to be rude) - it is unlikely that the MSRP would be any higher than $649.99 - otherwise they wouldn't have a hope of keeping up with the 4870x2, and they obviously know this, otherwise the GTX 260/280 wouldn't have dropped in price like they did. Doubling the count of the card's physical statistics certainly makes it more expensive, but it's not going to throw it into the $1,000 range anyway.
I'm not so much bothered by your impatience with what I wrote as by the fact that you didn't seem to realize I'm not FrozenGpu. While I argued against points made against them, I wasn't necessarily in agreement with them.
I do agree that their follow-up card couldn't possibly cost US$1,000. nVidia knows that they probably wouldn't be able to get even US$600 for whatever it is, just about no matter how powerful it is, even if it costs them more than that to produce.
Mainly, what I was getting at is how preposterous the numbers being presented are; nVidia is hurting right now, and such a GPU would only hurt them more.
ovaltineplease :
If nVidia made a GTX 280x2, it wouldn't use GDDR5; it'd use GDDR3, as there is no rhyme or reason for them to move to GDDR5 when it's questionable whether the architecture supports it.
Building graphics cards might seem as simple as "throw a bunch of crap on a PCB" to some people, but it's not - it's obviously much more complex than that, and making large, sweeping architectural changes like switching to GDDR5 obviously requires a different type of construction and design.
Exactly the point I was alluding to; I wasn't saying so much that the next card would use GDDR5 and cost more, but rather that the added cost was a reason it WOULDN'T use GDDR5. While I have my doubts that much redesign of the PCB would be necessary to accommodate GDDR5 (as far as I know, it has the same pin count; the PCB would just need to take EMI/crosstalk considerations a tad more seriously), it would necessitate a pretty serious redesign of the GPU's memory controller array, which would be nothing to sneeze at.
dos1986 :
No offence, John, but we don't know what nVidia or ATi can and can't do. We are just enthusiasts who sometimes hear from a source; we are not engineers.
At the rate nVidia's going, I'd wager that a lot of us enthusiasts might happen to know nVidia better than nVidia's own people.
(I'd note that a number of engineers are among the enthusiasts; while I don't work as an engineer, I have a bit of education as one.)
dos1986 :
And what do you mean by power envelope?
There is no envelope for enthusiasts with pockets full of cash. These babies may need fuel, and they will fund it/find a way.
No, no amount of cash thrown at buying a PC can allow it to exceed the laws of physics; I'm sorry, but you just overran your own knowledge there.
It remains a fact that you can't break the limits of how much electricity you can feed the parts. One major way nVidia dug a hole for themselves, as I mentioned before (if you'd bothered to read my post; long as it may be, it's still part of the topic), is that the GTX 280 comes dangerously close to the absolute maximum power that can be provided to a video card: it draws some 234 watts out of a practical maximum of 250 watts. If you crank the speeds higher, it's going to require much more power, close to proportional to the increase in clock speed. Likewise, GDDR5 consumes a lot more power than GDDR3; that's a major factor in why the 4870 consumes some 37% more power than the 4850.
THAT is what is meant by "power envelope." It's simply how much electrical power can be provided.
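To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python. The 234 W draw and 250 W practical ceiling are the figures above; the assumption that board power scales roughly linearly with clock speed (at a fixed voltage) is a deliberate simplification, and the constant and function names are just my own illustration:

# Back-of-the-envelope check of the GTX 280's power headroom.
# Assumes board power scales roughly linearly with clock speed at a
# fixed voltage -- a simplification; real scaling is worse once the
# voltage has to rise as well.

CARD_DRAW_W = 234.0   # approximate GTX 280 board power, per the figures above
ENVELOPE_W  = 250.0   # practical delivery ceiling cited above

def scaled_power(base_watts: float, clock_increase_pct: float) -> float:
    """Estimate board power after a given percentage clock increase."""
    return base_watts * (1.0 + clock_increase_pct / 100.0)

headroom = ENVELOPE_W - CARD_DRAW_W
print(f"Headroom: {headroom:.0f} W ({headroom / CARD_DRAW_W:.1%} of current draw)")

for bump_pct in (5, 10, 15):
    estimate = scaled_power(CARD_DRAW_W, bump_pct)
    verdict = "fits" if estimate <= ENVELOPE_W else "exceeds the envelope"
    print(f"+{bump_pct}% clock -> ~{estimate:.0f} W ({verdict})")

Run it and you'll see there's only about 16 W of headroom, under 7% of the current draw; even a 10% clock bump would push the estimate to roughly 257 W, past the ceiling. That's exactly why "throw more cash at it" doesn't work.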