jdwii : old vs new, and when does cost come into the equation?
tourist : absolutely not, future products are always coming and it's a bit sad to always say such things. Even more so when both are built on the same 28nm process.
jdwii : old = last gen, new as in next gen; we will have to wait for the 300 series vs. Maxwell.
tourist : what is sad is AMD's DX11 driver implementation in the benchmark.
http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/5
"Meanwhile it’s interesting to note that largely due to their poor DirectX 11 performance in this benchmark, AMD sees the greatest gains from DirectX 12 on a relative basis and comes close to seeing the greatest gains on an absolute basis as well. The GTX 980’s performance improves by 150% and 40.1fps when switching APIs; the R9 290X improves by 416% and 34.6fps."
I always joke with my friends about tech and say we live in a capitalist country: what matters most is what's out right now. Now I hear AMD might be doing stacked memory, and even people who actually know things seem to hint at it, cough "Tek Syndicate" cough cough, but I really want a 200 watt card, not a 300 watt one. It'd be nice to have my whole rig keep using 300 watts or less when gaming. Funny, the PS4 uses 140 watts or so when gaming.
Like I said, I expect I'm in the minority; I just like efficient stuff. I just replaced my 100 watt monitor with a 22 watt one that is superior in quality.
I'll say this for the 101st time: AMD needs efficient GPUs (more so than Nvidia or Intel) for their APU series. Intel isn't joking around with this; I used their HD 4600 graphics and it is around Llano level, something I never thought was possible. I was going to make videos, since 720p medium on games from 2012 and earlier was very playable on it. I read comments all the time, and most claim Intel graphics are good enough now. AMD honestly has nothing else to give to this market if Intel beats them in this.
Edit: when I say this market, I mean mainstream products, not the gaming market itself.
Edit again: on the bad DirectX 11 optimization you mention, gamer hinted at this 100 pages ago (or more). AMD might care more about Mantle in these games and possibly optimize more for it for marketing purposes. It's something I'd suspect Intel or Nvidia would do as well.
At $0.11/kWh, I am buying a product I want over something that uses less electricity... "just because".
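To put a rough number on that rate (a quick sketch; the daily gaming hours are my assumption, so I sweep a few values):

```python
# What a 100 W difference between two GPUs costs at $0.11/kWh.
# Hours of gaming per day is an assumption, so try a few values.
RATE_PER_KWH = 0.11
DELTA_WATTS = 100  # e.g. a 300 W card vs a 200 W card

for hours_per_day in (2, 4, 8):
    kwh_per_year = DELTA_WATTS / 1000 * hours_per_day * 365
    print(f"{hours_per_day} h/day -> ~${kwh_per_year * RATE_PER_KWH:.2f}/year")
```

Even at 8 hours a day, a 100 W difference is only about $32 a year at that rate.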
I tried the "super efficient" light bulbs that are supposed to last 2 years or something like that a while back. You know what happened? Within the same 6 months my $3 pack of light bulbs would have lasted, I was replacing the $12 light bulbs.
Did it save me money? Honestly, I replaced 14 light bulbs in my house, and my electric bill was perhaps $3/mo. cheaper, if even that... and that was a ~50-60% reduction in power consumption for all the bulbs in my house, from 60W down to ~18-22W per bulb. Over those 6 months, I saved ~$18; we will call it $20 to be generous.
However, the bulbs cost me ~$10-12 per 3 bulbs instead of ~$3 per 4 bulbs, so the cost of the bulbs was so high it actually *cost* me money to "save money" on electricity through efficiency.
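Putting those figures into a quick sketch (the $11 three-pack price is my midpoint of the ~$10-12 range, and pack counts are rounded up since you can't buy a fraction of a pack):

```python
# Break-even math for the bulb swap, using the figures from this post.
import math

BULBS = 14
cfl_cost = math.ceil(BULBS / 3) * 11          # ~$10-12 per 3-pack, call it $11
incandescent_cost = math.ceil(BULBS / 4) * 3  # ~$3 per 4-pack
extra_upfront = cfl_cost - incandescent_cost
savings_per_month = 3.0                       # observed drop in the bill

print(f"extra upfront: ${extra_upfront}")
print(f"months to break even: {extra_upfront / savings_per_month:.1f}")
```

That is roughly 14 months to break even, and the bulbs were dying inside 6.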
The reality of all this "more efficiency" garbage is simply this: you might save money somewhere; however, you will not come out ahead overall because of initial costs. It is the same thing with hybrid cars. Would you spend an extra $6,000.00 to buy a car that would save you money on gasoline and be marginally better for the environment? What if you knew the fuel savings in dollars would require you to drive that car until it had 160,000 miles on it just to break even? Or if I told you that the difference in emissions was ~10 ppm between a PZEV and a ZEV?
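The hybrid math works the same way; here is one set of assumed figures (the gas price and mpg numbers are mine, for illustration) that lands on that 160,000-mile break-even:

```python
# Miles until an extra $6,000 purchase price is repaid by fuel savings.
# Gas price and mpg figures are illustrative assumptions.
PREMIUM = 6000.0                  # extra upfront cost of the hybrid
GAS = 3.50                        # assumed dollars per gallon
MPG_REGULAR, MPG_HYBRID = 28.0, 40.0

savings_per_mile = GAS / MPG_REGULAR - GAS / MPG_HYBRID
print(f"break-even: ~{PREMIUM / savings_per_mile:,.0f} miles")
```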
So, I am a power user. In the end, I run rigs maxed out, straining components. What a card does under "average load" or "consumer load" is irrelevant to me. I want to know what it does when utilization stays at 90%+ for hours on end.
In that scenario... there is no Nvidia card that makes enough difference for me to not buy AMD.