AMD Radeon HD 7790 Review: Graphics Core Next At $150

I just read HardOCP's review of this card too. It really looks like a sweet-spot card, especially after prices settle. I don't regret the GTX650Ti I bought (I wanted to test AMD vs. nVidia differences), but this card looks like it will be the one to get for "good" settings on new games, with the HD7770 quickly dropping down to the "sufferable" range.

...the real agony is recalling how much I paid for a HD7870 a year ago.
 
All those people moaning about overpriced cards from the likes of AMD really need to check AMD's profits for the last couple of years.

It's not like they are making billions a year in profit; financially they are doing quite badly, so how can you expect them to lower the cost of their cards? They need to milk their graphics cards as much as they can to counter poor CPU sales.

I'm sure everyone would love to live in a world where we could have the best graphics cards for half their current price, but grow up and actually think about what you're saying.
 
[citation][nom]Article[/nom]but considering that the Radeon HD 7790's TDP is just 10 W more than PCI Express' electromechanical specification,[/citation]

I don't think the mechanical specs have anything to do with power consumption
 

The point is that the 6-pin PCIe connector only needs to supply 10 W for the card to reach its TDP. Drawing 10 W through molex plugs adapted to a 6-pin PCIe connector is not going to skew loads on the PSU to an unhealthy degree, and it's not going to overload the power supply either.
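For concreteness, here's a back-of-the-envelope sketch of that power budget. The 75 W figures are the PCI Express spec limits for the slot and a single 6-pin connector; the 85 W TDP is inferred from the article's "10 W more than the slot spec" remark, so treat it as an assumption:

```python
# Back-of-the-envelope PCIe power budget for the HD 7790.
# 75 W slot / 75 W 6-pin are PCI Express spec limits; the 85 W TDP is
# assumed from the article's "10 W more than the slot spec" statement.
PCIE_SLOT_W = 75    # maximum power deliverable through the x16 slot
SIXPIN_W = 75       # rating of a single 6-pin auxiliary connector
CARD_TDP_W = 85     # assumed Radeon HD 7790 board power

from_connector = max(0, CARD_TDP_W - PCIE_SLOT_W)
print(from_connector)                           # -> 10 (watts the 6-pin supplies at TDP)
print(round(100 * from_connector / SIXPIN_W))   # -> 13 (percent of the connector rating)
```

So even at full TDP, the adapter-fed 6-pin connector is loaded to a small fraction of its rating.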
 

Getting free games is NOT a gift card. A gift card lets you choose something from a store up to some amount of money. I wouldn't let a gift card rot either, but forcing me to sell or play three particular games isn't anything close to a gift card.

Yeah, AMD slotted the 79xx series into their GPU lineup as if it were part of the 6xxx series, and priced it accordingly, instead of replacing some of the 69xx...

We obviously disagree on what targets Nvidia had set for the 6xx series. You believe they wanted to use a smaller die in order to make more profit. But they had planned to release GK100/110 as the GTX 680, and that's a fact; you can find it in Nvidia's roadmaps for the (then future) 6xx products. Some leaked numbers back then pointed to a GPU that was on average twice as fast as the GTX 580, and there is a single GPU nowadays that is almost twice as fast. But AMD released the 7970 non-GHz Edition. Then GK104 (a product which was supposed to sit in the $200-$300 price range) was suddenly named the GTX 680... The year ended with Nvidia posting record fiscal-year profits, especially in the GPU division:
"Nvidia's GPU division, meanwhile, continues to dominate Nvidia's earnings, pulling in $3.2 billion in revenue for the financial year."
You choose to believe in coincidence. I don't.

Like the six-core Intel 990X versus the six-core 3960X? Correct me if I'm wrong, but the previous generation had a $999 price tag. The 3960X has the same price while offering more performance. By your logic the 3960X should be around $1600 MSRP.

Price/performance is what matters, indeed. That's why the 6950/570/580 offer better value for money than the 7950/670/680.
 
Hey Don/Igor, do you think you could add the 6870 and 560 to the OpenCL benchmarks? Especially the AutoCAD one.

Actually, I'd like to see which cards are capable of handling Photoshop AND AutoCAD. Does Nvidia win in both (since Photoshop doesn't use CUDA anymore), or does someone using both have to buy one card from each vendor...
 


You are confusing PhysX with physics.
PhysX is the SDK that allows the GPU to offload physics calculations from the CPU.
Source: clicky
 


Thanks, super link.
 
[citation][nom]Memnarchon[/nom]
Getting free games is NOT a gift card. A gift card lets you choose something from a store up to some amount of money. I wouldn't let a gift card rot either, but forcing me to sell or play three particular games isn't anything close to a gift card.[/citation]


[citation][nom]Memnarchon[/nom]
Yeah, AMD slotted the 79xx series into their GPU lineup as if it were part of the 6xxx series, and priced it accordingly, instead of replacing some of the 69xx...[/citation]

AMD priced them to compete with Nvidia. Blame Nvidia for not being able to compete at the time. Once Nvidia was competitive, AMD brought pricing down, and now they're priced excellently. Would you have preferred that AMD completely upset the whole market at the time instead? Besides, again, considering the yield issues, supply couldn't keep up with demand even at the high launch prices, so pricing them lower would literally have been AMD throwing money away, and that's not something AMD can afford to do.


[citation][nom]Memnarchon[/nom]
We obviously disagree on what targets Nvidia had set for the 6xx series. You believe they wanted to use a smaller die in order to make more profit. But they had planned to release GK100/110 as the GTX 680, and that's a fact; you can find it in Nvidia's roadmaps for the (then future) 6xx products. Some leaked numbers back then pointed to a GPU that was on average twice as fast as the GTX 580, and there is a single GPU nowadays that is almost twice as fast. But AMD released the 7970 non-GHz Edition. Then GK104 (a product which was supposed to sit in the $200-$300 price range) was suddenly named the GTX 680... The year ended with Nvidia posting record fiscal-year profits, especially in the GPU division:
"Nvidia's GPU division, meanwhile, continues to dominate Nvidia's earnings, pulling in $3.2 billion in revenue for the financial year."
You choose to believe in coincidence. I don't.[/citation]

I can also find leaked slides where a Radeon 7970 had XDR2 memory. That doesn't make them right.

Nvidia never had plans for GK100 as a gaming chip. Like I said, they even gave the GPU a generation number outside the GTX 600 series.

I can also find sources claiming that the GTX 680 was as fast as three GTX 580s. Not even Titan is that fast.

I don't believe in coincidence in this whatsoever. Nvidia had record profits, as far as the graphics goes, because like I said, they specifically optimized the GTX 600 cards to be cheap to make.


[citation][nom]Memnarchon[/nom]
Like the six-core Intel 990X versus the six-core 3960X? Correct me if I'm wrong, but the previous generation had a $999 price tag. The 3960X has the same price while offering more performance. By your logic the 3960X should be around $1600 MSRP.[/citation]

You misunderstood what I said. My point was that comparing nothing but price for a part is irrelevant without other key information. Using a single CPU comparison as an example changes nothing, and in fact it proves my point: if you only compared them by price and core count, you'd miss many of the big changes, such as the updated AVX support and much more.

Here's an example: if I had paid $180 for my Phenom II X6 1100T back in 2011 and then complained about how the FX-8350 was more expensive for a while despite both being AMD's flagship CPU of their time, I'd be full of shit, because the FX-8350 is so much better that it delivers more performance for the money despite having been more expensive. The i7-3970X and even the i7-3960X were also actually a little more expensive than the i7-990X, so they prove my point there too, albeit only to a small degree.


[citation][nom]Memnarchon[/nom]
Price/performance is what matters, indeed. That's why the 6950/570/580 offer better value for money than the 7950/670/680.[/citation]

Instead of a useless picture, let's look at real prices.
http://pcpartpicker.com/parts/video-card/#c=39,40,37,38&sort=a5&xcx=0

Of all the 570s, 580s, and 6950s in the entire database, only a single GTX 570 offers decent performance for the money, and that one is from Microcenter with a $30 MIR. The rest of the dozens of cards are crushed by current cards in price/performance.
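The comparison both posters are making boils down to a performance-per-dollar ratio. A minimal sketch of that arithmetic, where the relative-performance indices and prices are made-up placeholders rather than benchmark results or real listings:

```python
# Illustrative performance-per-dollar comparison.
# "perf" is a relative performance index and "price" a street price;
# both numbers are hypothetical placeholders, NOT benchmark data.
cards = {
    "last-gen card": {"perf": 100, "price": 250},
    "current card":  {"perf": 105, "price": 160},
}

for name, c in cards.items():
    value = c["perf"] / c["price"]
    print(f"{name}: {value:.3f} perf/$")
```

The point is that the ranking depends entirely on current street prices, which is why a chart built on launch-era pricing can flip once the older cards stop being discounted.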
 

It's from the Core 2 days; I need to get something more recent.

I'm actually almost sure that PhysX can run on the CPU, just at a lower setting level (at any higher level, the performance is awful). I also remember an article about how they MADE the code run less efficiently on the CPU.

They also had PhysX running on an AMD card for a bit, but you know as well as I do that Nvidia will not let this happen; they go as far as disabling GPU PhysX in the presence of non-Nvidia cards.

Either way, I can never see such a monopoly as a good thing; it's only a handful of games running it anyway.

I say this as someone who will run whatever card offers more performance per dollar (and better power consumption/noise is high on the list too).
 

I am not a fanboy towards anyone. Of course I blame Nvidia too. Everyone has their share of the blame.

Sure, but the major difference between GK110 and a 7970 with XDR2 memory is that GK110 is real...
Also, AMD has a track record of overhyping their products (Bulldozer, anyone?).


Nvidia, Titan and I, deny this.

What does this have to do with the gaming existence of GK110/100?

Hmm, I never heard or read anything like this. Can you provide a link?

So it was intended from the beginning? And what if AMD had had a Titan-class GPU ready (you said they didn't know what AMD had to offer)? Nvidia wouldn't have been able to compete in the high-end segment? I don't think they would have withheld GK100 from the market. Do you? Remember that we both blame Nvidia for letting AMD play alone in the market with overpriced products for 3 months. 3 months is enough for Nvidia to cancel GK100/110 as the flagship and bring in GK104 as the flagship.

Sorry if I misunderstood something, but I think the 990X and the 3960X have the same price. I searched a lot of sites and they still list both at $999 MSRP.

Useless picture??? It's the performance-per-dollar chart at the most-used resolution (1920x1080/1200), benchmarked across 18 games/benchmarks by TPU. Unless money doesn't mean anything to you, that chart is far from useless...

And thanks for the conversation. I rarely find such a great interlocutor. :)
 

No, I'm not sure. It just seems like the logical way to distribute the load. And of course, actual maximum power consumption may deviate from TDP a little.
 
[citation][nom]Article[/nom]It looks like OpenGL performance is a sore spot for AMD's current Radeon cards, at least in Windows. Aside from the 2D test, where driver optimizations look like they're needed, the Radeon HD 7790 falls exactly where we'd expect it to within AMD's product stack.[/citation]

As long as the Radeon cards can get their act together in gaming and OpenCL, I'm buying.
 

Actually, it does make some difference, because ideally you want to put as little load on the molex connectors as possible (when using a molex-to-PCIe adapter). They're meant to power fans and other things that draw only small amounts of power; putting a heavy load on them with a graphics card may prove problematic, especially since we're already talking about rather low-end power supplies (ones without PCIe power connectors).
 
[citation][nom]Memnarchon[/nom]Can you provide me a link for a $159 7850? Also, free games are nice but can't give you performance. If you manage to sell them and get some of your money back, it's OK. But wouldn't it be great if the price range of the 68xx had been taken over by the 78xx, since they were supposed to be their replacement? And to be honest, I don't want to pay extra for something that in the previous generation I paid far less for. Same goes for Nvidia, and especially for Titan: 2nd-tier GF110 (GTX 570) $330; 2nd-tier GK110 (Titan something) Nvidia says MSRP $899. I don't think there are a lot of people who don't think this generation is overpriced...[/citation]

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150641

This HD 7850 2GB version is currently $160 after rebate and includes BioShock and Tomb Raider.

Also, the game coupons can easily be sold on eBay or Craigslist. I have sold three sets of these coupons on eBay ranging from $55 to $75.

My HD 7870 cost me $135 and 1GB 7850 cost me $80 after selling the games.
My latest GTX 660 purchase will likely end up around $100 after rebate and games. So yes, games do have an effect on overall value.
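The "effective cost" arithmetic this poster is doing can be sketched as follows. The function and the example dollar figures are hypothetical illustrations (chosen to sit within the $55-$75 coupon-resale range reported above), not quotes of any actual listing:

```python
# Effective out-of-pocket cost once a mail-in rebate and resale of the
# bundled game coupons are counted. All dollar figures are hypothetical.
def effective_cost(card_price, rebate=0, coupon_resale=0):
    """Sticker price minus mail-in rebate minus what the coupons fetch."""
    return card_price - rebate - coupon_resale

# e.g. a $180 card with a $20 rebate whose coupons sell for $60
print(effective_cost(180, rebate=20, coupon_resale=60))  # -> 100
```

This is why the bundled games affect real-world value even for a buyer who never plays them: the coupons have a resale price, so they function like a partial rebate.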
 
[citation][nom]Sakkura[/nom]The point is the 6-pin PCIe connector only needs to supply 10W for the card to get to its TDP. Drawing 10W from molex plugs converted to a 6-pin PCIe is not going to skew loads on the PSU to an unhealthy degree. It's not likely to just overload the power supply either.[/citation]

I understand, but this is all about electrical specs and nothing else.

Still, looks like AMD is playing the sweet spot game again.
 
[citation][nom]Memnarchon[/nom]I am not a fanboy towards anyone. Of course I blame Nvidia too. Everyone has their share of the blame.
Sure, but the major difference between GK110 and a 7970 with XDR2 memory is that GK110 is real...
Also, AMD has a track record of overhyping their products (Bulldozer, anyone?).[/citation]

That's not the point. The point was that the slide was faked too. GK110 is real, but a GTX 680 with a GK110 is not, because that never happened nor was it even the plan. Furthermore, the slide with XDR2 on the 7970 was not from AMD, and it wouldn't have made a huge difference even if AMD had used XDR2 anyway.

[citation][nom]Memnarchon[/nom]
Nvidia, Titan and I, deny this. [/citation]

There is no GK100, so it is only you who denies it, not Nvidia and Titan. GK110, again, isn't even part of the GTX 600 series, nor is it used in anything except very limited-production cards, at least as of yet.

[citation][nom]Memnarchon[/nom]
What does this have to do with the gaming existence of GK110/100?
[/citation]

There is no GK100. GK110 is used in a different card than what the earlier slides described spec-wise. Heck, most of the supposed GK110 specs from most such slides (perhaps all of them; I'd have to check to be sure) weren't even accurate.

[citation][nom]Memnarchon[/nom]
Oo never heard or read anything like this. Can you provide a link? [/citation]
I could take a look, but if you Google Unreal 4 Kepler, it'll probably be on the first page of results.

[citation][nom]Memnarchon[/nom]
So it was intended from the beginning? And what if AMD had had a Titan-class GPU ready (you said they didn't know what AMD had to offer)? Nvidia wouldn't have been able to compete in the high-end segment? I don't think they would have withheld GK100 from the market. Do you? Remember that we both blame Nvidia for letting AMD play alone in the market with overpriced products for 3 months. 3 months is enough for Nvidia to cancel GK100/110 as the flagship and bring in GK104 as the flagship.[/citation]

AMD would never have had a Titan-class GPU. AMD doesn't make massive GPUs, and I don't think they or ATI ever did. The only thing Nvidia could have accurately guessed is that AMD would keep its tradition of only designing GPUs it can afford to make.

Furthermore, GK110 still seems to have poor yields even now. It's a luxury product, if even that, a limited production run to grab the record. Heck, now they're making a cut-down version of it, undoubtedly for at least a partially related set of reasons. Such massive GPUs simply aren't economical to manufacture and many dies are made with faults.

[citation][nom]Memnarchon[/nom]
Sorry if I misunderstood something, but I think the 990X and the 3960X have the same price. I searched a lot of sites and they still list both at $999 MSRP.[/citation]

I admit that I haven't kept up with pricing and maybe it's gone down a little, but the i7-3960X and the i7-3970X both launched at $1049.99 IIRC.

[citation][nom]Memnarchon[/nom]
Useless picture??? It's the performance-per-dollar chart at the most-used resolution (1920x1080/1200), benchmarked across 18 games/benchmarks by TPU. Unless money doesn't mean anything to you, that chart is far from useless...[/citation]

The picture is useless because it's wrong; it has nothing to do with current pricing. The link I gave you shows current pricing in real time, as of this very day, and it proves beyond a doubt that only a single GTX 570 was selling for a decent price, and that one was from Microcenter. All other models of the Radeon 6950, Radeon 6970, GTX 570, and GTX 580 were priced very high for their performance, much worse than the current generation. A better example for your argument might be the GTX 480, which is still around $200 IIRC, but even then I'd have to check to be sure, and that card's power consumption ruins any such recommendation anyway. The chart is useless because, to put it simply, it doesn't line up with reality as far as pricing goes.
 
[citation][nom]johnsonjohnson[/nom]I thought the HD 7850 1GB is good value at $150 after rebate and 2GB at $180 after rebate.[/citation]
Very true. In fact, you've got 2GB 7850s going for $165 after MIR (NCIX, I believe).
 
To me the 7790 is kind of pointless. You can get a 7850 for $160 right now after a $20 mail-in rebate, so for about $10 more you can get a 7850 1GB. The most video memory I ever use gaming at 1920x1080 is 700-800 MB, and that's playing Far Cry 3, Crysis 3, Metro 2033, Borderlands 2, and CODII.

I have a 660 Ti. And for those of you who say there's no reason for me to get Nvidia: I have many reasons. Yes, the drivers are one reason. Another reason is that the 660 Ti performs very well in all the games I play. I can run Crysis 3 with everything on Very High with 2xSMAA, High Motion Blur, and Lens Flares on while getting 40-70 FPS. Those are the MAX settings; why would I need anything more powerful?

Another reason I like Nvidia is their awesome 3D Vision 2 technology. I know AMD has 3D too, but most of the gaming monitors that have 3D only support Nvidia 3D Vision 2, at least all the really nice ones I know of, like the BenQ XL2420TX 24-inch that I currently have. It seems electronics makers prefer to team up with Nvidia for some reason.

Yet another reason is because I like Nvidia's control panel. I'm familiar with it and favor it over AMD's.

Nvidia also has PhysX technology, which looks absolutely amazing in games like Borderlands 2, my favorite game. I'd hate to not have PhysX in that game and any other offering it. AMD has no such technology that I've ever seen; I know they team up with game makers, but I haven't seen any PhysX-like effects come from that.

I realize a 7870 performs fairly close to a 660 Ti when the 660 Ti costs $270 and a 7870 LE costs $240. But the settings Tom's Hardware uses favor the 7870, because they sometimes use unnecessary amounts of AA, and they admit it in their conclusion. And with the 660 Ti's 192-bit memory bus, it doesn't perform as well as it should. Check out "Crysis 3 Benchmarked" and you'll find that Nvidia's 660 Ti beats the 7870 LE by a decent amount (a few FPS).

So I definitely have my reasons for choosing Nvidia. To go so far as to say they are irrelevant is simply not true. They compete very well for the price, and for all the reasons I gave, this is why Nvidia is still numero UNO!

Oh yeah, and power consumption, although personally I couldn't care less.
 
[citation][nom]ericjohn004[/nom]To me the 7790 is kind of pointless. You can get a 7850 for $160 right now after a $20 mail-in rebate, so for about $10 more you can get a 7850 1GB. The most video memory I ever use gaming at 1920x1080 is 700-800 MB, and that's playing Far Cry 3, Crysis 3, Metro 2033, Borderlands 2, and CODII. I have a 660 Ti. And for those of you who say there's no reason for me to get Nvidia: I have many reasons. Yes, the drivers are one reason. Another reason is that the 660 Ti performs very well in all the games I play. I can run Crysis 3 with everything on Very High with 2xSMAA, High Motion Blur, and Lens Flares on while getting 40-70 FPS. Those are the MAX settings; why would I need anything more powerful? Another reason I like Nvidia is their awesome 3D Vision 2 technology. I know AMD has 3D too, but most of the gaming monitors that have 3D only support Nvidia 3D Vision 2, at least all the really nice ones I know of, like the BenQ XL2420TX 24-inch that I currently have. It seems electronics makers prefer to team up with Nvidia for some reason. Yet another reason is that I like Nvidia's control panel; I'm familiar with it and favor it over AMD's. Nvidia also has PhysX technology, which looks absolutely amazing in games like Borderlands 2, my favorite game. I'd hate to not have PhysX in that game and any other offering it. AMD has no such technology that I've ever seen; I know they team up with game makers, but I haven't seen any PhysX-like effects come from that. I realize a 7870 performs fairly close to a 660 Ti when the 660 Ti costs $270 and a 7870 LE costs $240. But the settings Tom's Hardware uses favor the 7870, because they sometimes use unnecessary amounts of AA, and they admit it in their conclusion. And with the 660 Ti's 192-bit memory bus, it doesn't perform as well as it should.
Check out "Crysis 3 Benchmarked" and you'll find that Nvidia's 660 Ti beats the 7870 LE by a decent amount (a few FPS). So I definitely have my reasons for choosing Nvidia. To go so far as to say they are irrelevant is simply not true. They compete very well for the price, and for all the reasons I gave, this is why Nvidia is still numero UNO! Oh yeah, and power consumption, although personally I couldn't care less.[/citation]

I don't know, this nVidia fanboy rant seemed rather pointless as well... Competition is good, move along.
 