Honestly, I'm glad NVIDIA made this feature available to older GeForce cards. lol, some people even say there's now more reason to keep their current card than to spend money on a more powerful one 😀
Well yeah, they would still have TSMC in the future. And yes, bringing this ability to older GeForce GPUs is a great feature. I'm gonna hold on to my cards unless there's a real need to get the new Kepler cards for upcoming games.
Honestly, I do want to replace my card this generation, but seeing the prices I think I'll skip. I want something that doubles the performance of my current GTX 460 at the same price I paid for that single 460. Also, at 1600x900 my GTX 460 eats most of the games I play for breakfast, lol, except a few known GPU killers like Metro. With that considered, I have less reason to upgrade. Maybe I'll do it once Microsoft comes out with DX12.
Yeah, plus if you notice, at every event like the most recent one, PAX East, they had 100 GTX 680s just sitting around, either for giveaways or whatever else.
Dual-chip cards are usually semi-paper launches anyway. With how much a dual-Kepler card would likely cost ($800?), not many would sell even if they had stock.
I want to know: Tom's did a review about image quality (the HD 7800s). Does the current GTX 680 have better image quality, or does the 7970?
I've learned that performance and image quality are sometimes two different things.
For me, I don't care about massive FPS, but if a card lacks great image quality then it's not worth the buy.
There may be a clue in this article, but it was written before the GTX 680 and the new driver enhancements came out.
http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/12
I'm not good with GPUs, but what's your take on the article?
I actually don't think this is a valid point, but they're the experts at AnandTech.
Quote: "As a result for the time being there will continue to be an interesting division in image quality between AMD and NVIDIA. AMD still maintains an advantage with anisotropic filtering thanks to their angle-independent algorithm, but NVIDIA will have better anti-aliasing options in DX10+ games (ed: and Minecraft). It’s an unusual status quo that apparently will be maintained for quite some time to come."
I've always read that NVIDIA has better texture filtering. At the end of the day, I think they're both very similar, although NVIDIA is making a jump with its FXAA and TXAA modes.
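For anyone curious what FXAA actually does under the hood: unlike MSAA, which supersamples during rasterization, it's a cheap post-process pass over the finished frame that looks for high-contrast luminance edges and blends across them. Here's a minimal Python sketch of that idea; the threshold value and the simple 4-neighbour blend are my own assumptions for illustration, not NVIDIA's actual shader.

```python
# Toy sketch of the core FXAA idea: anti-aliasing as a post-process on the
# final image. NOT NVIDIA's real algorithm -- the threshold and the plain
# neighbour blend are simplified assumptions for illustration.
import numpy as np

def luma(rgb):
    # Perceptual luminance; real FXAA also detects edges in luma space.
    return rgb @ np.array([0.299, 0.587, 0.114])

def toy_fxaa(img, contrast_threshold=0.1):
    """img: float32 array of shape (H, W, 3), values in [0, 1]."""
    out = img.copy()
    L = luma(img)
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            samples = [L[y, x], L[y-1, x], L[y+1, x], L[y, x-1], L[y, x+1]]
            contrast = max(samples) - min(samples)
            # Only smooth pixels sitting on a high-contrast (aliased) edge;
            # everything else passes through untouched, which is why FXAA
            # is so cheap compared to MSAA.
            if contrast > contrast_threshold:
                out[y, x] = (img[y, x] + img[y-1, x] + img[y+1, x] +
                             img[y, x-1] + img[y, x+1]) / 5.0
    return out
```

The takeaway is that FXAA costs roughly the same no matter how complex the scene is, since it only sees the final pixels, which is why it barely dents your FPS compared to MSAA.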