Kepler news and discussion

Page 30
Status
Not open for further replies.


Nothing strange IMO, you already have the fastest GPU on the market, and at a lower price than your competitor. Releasing another one just gives you a sure place in the market while you finish building your mid-range GPUs.
 
I'm loving that the world has actually given me a blessing in disguise. Between crashing my car and the holiday I'm saving for, I estimate I won't be purchasing a new card till late September, right after all the cards are expected to be released, so I'll know the perfect one to get xD 😀
 
Releasing the 690 before the 7990 would give AMD an opportunity to tweak their clocks to be more competitive. I wonder if Nvidia knows something about the 7990 making them comfortable to release the 690 first.

It will be interesting to see if the 7990 ends up being really noisy in order to compensate for the higher clock speeds that will likely be necessary to compete with the 690. The 7970 at 925 MHz is already a tick louder than the GTX 680 at a turbo-boosted 1100+ MHz.
 
Well, here's a thought: if NVIDIA wasn't already confident, they wouldn't release it until they knew how AMD's card did on the market. NVIDIA may not even see much competition from AMD now, since they're ahead on price/performance as well as heat, noise, and power usage.
 


That could be a problem; releasing the GTX 690 before the HD 7990 can mean a few things:

1) AMD/ATI can't find a way to beat that combination of clocks, performance, power use, and price.
2) AMD/ATI wants to wait for the GTX 690 release so they can work on something better that gives nVidia problems.

In either case, I think AMD/ATI will have to rework the current Tahiti design to reach similar performance, power use, and price to the Kepler architecture.
 
Of course, the aces that AMD holds are slightly better dual-GPU scaling and slightly better clock scaling. Maybe those are the equalizers in the equation. Still, the noise factor will be an issue either way. They will want to avoid the epic noise situation that plagued the 6990.
 
The past two dual GPU models attempted to force at least some of the heated exhaust outside the case. Maybe they are trying to play it safe since not everyone who owns one will have a well ventilated case.

The Asus Mars 2 had a nice internally exhausting setup with two full GTX 580 cores. I haven't seen any report on how that affected internal case temperatures, but I don't expect there were too many reported problems. The card itself ran quite cool.

 


That's why you wait for the non-reference coolers....

Nvidia isn't going to spend the extra time and money on an extravagant cooling solution from the factory when AIB manufacturers are just going to change the design anyway. The 3rd parties rely on their own modifications and features to make their products stand out. Aside from that, the card is designed with certain specifications: power consumption, heat tolerance, and stability are among those considerations. To spend extra money on cooling, when it isn't required, just doesn't fit into the market plan.
 
The biggest problem with the dual GPUs is going to be availability. If history is any indication, regardless of when AMD and Nvidia do their "paper" releases, stock is going to be pretty scarce. I think the $699 price tag is a bit conservative for the GTX 690. Personally I expect it will be a $749 card, but we'll have to wait and see. :)

I think, in the dual-GPU market, people are more likely to stick with their preferred manufacturer. Meaning, a person who typically buys Nvidia isn't going to switch to AMD unless there are some very compelling reasons. If the GTX 680 and HD 7970 are any indication, there isn't going to be a huge difference between the dual-GPU cards. Each card will have its strengths and games in which it outperforms the competition; really, it's just a matter of brand preference and card features. Nvidia has certainly narrowed the gap in multi-monitor performance, plus they have PhysX, TXAA, Adaptive V-sync, etc. I'm not looking for a dual GPU, but if I were, it would be Nvidia, hands down!
 


yup makes sense lmao
 