AMD Radeon HD 7790 Review: Graphics Core Next At $150


Well that pricing could easily change. Chances are AMD will take the opportunity to bump up the price tag on the 7850.

I think the only reason they made the 1 GB version and priced it so low was to counter the GTX 650 Ti. Narrowing the price gap there would entice a lot of people to move up to the 7850.

Now, with a direct answer to the 650 Ti, they no longer need to do that.
 
Fanboy rant? I like AMD GPUs. I'd buy an AMD GPU if someone made a monitor that supported AMD 3D, which is impossible to find.

I saw quite a few "AMD fanboys," I guess you'd call them, talking about how Nvidia's not even in competition as far as price/performance. So I figured I'd give all my reasons for going with Nvidia over AMD.

I'm neither a fanboy nor was I ranting. I was just stating my opinion. Go somewhere else with that, man; if you don't have anything to communicate, why even say anything... GFYS
 
[citation][nom]ericjohn004[/nom]Fanboy rant? I like AMD GPU's. I'd buy an AMD GPU if someone made a monitor that supported AMD 3D. Which is impossible to find.I saw quite a few "AMD Fanboys" I guess you'd call them. Talking about how Nvidia's not even in competition as far as price/performance. So I figured I'd give all my reasons for going with Nvidia over AMD.I'm neither a Fanboy nor was I ranting. I was just stating my opinion. Go somewhere's else with that man, if you don't have nothing to communicate why even say anything... GFYS[/citation]

Yeah, I mean if I was building a budget rig, I'd definitely go with the 7790 over the 650 Ti. That is, if I could find a monitor that supported AMD's 3D. I would go for the 7790 for sure if it were priced as it should have been, at say $139.99. I think that's probably the perfect price point. The 650 Ti can be found for $129.99, and the 7790 looks to be about 10 bucks better.
 

I don't understand why you put "budget rig" and "3D" in the same post. 3D is a tiny, pricey niche. And even if it is Friday night, you could be a little nicer.
 
Yeah, I totally agree. Who would pay $200 for a 7850 when they could get a GTX 660? Not to mention I can find a Diamond 7870 GHz Edition for $194.99 with free games and stuff. To me, if you're going to spend even $160 on a graphics card like the one in this article, you might as well save up just a little more and go for a GTX 660 or 7870, or at the very least a 7850 for the same price. Then you'll have a real graphics card. I can see only having $130 and going for a 650 Ti, though. To me the 7790, 7850, 7870, and 660 are all jam-packed into a tiny $40 price window. Granted, you have to look for deals on the 7870 and 660 to find them around $200, but you can and will find deals everywhere, so it makes little difference.

This is why I just can't see this 7790 selling for $160 as it is in this article. The price will have to be $130-140 for it to make sense. So I don't see why some people are saying "this will be my next graphics card." I guess if you're just so strapped for cash that you can't afford the extra money it'll cost for a 7850, then it makes sense. Because if the 7850 moves up in price, it'll get killed by the GTX 660. Lose-lose for AMD. I'm glad the card came out, though; it gave me something interesting to read, and it'll be a good card for a cheap build at $130.
 

That's launch pricing, and it doesn't include the rebates and such you'll inevitably see at retail. Plus, you're manipulating the quote to suit your agenda; the article says $150, not $160.
 
[citation][nom]tourist[/nom]That is a dangerous game to play in the gpu market, the 7850 is now caught between the Rock (gtx660) and a hard place (gx650ti/hd7790) leaving little room for it to move up for any length of time without rebates and games. GTX 660's for under 200 will keep that in check . To be fair Nvidia is about to do the same thing with the gtx650ti 192 bit , filling a gap in a narrow margin.[/citation]
Considering that the 7870 has been floating around the $200 AR price point recently (with the LE/XT versions not much higher), the logical step would be to phase out the 1GB versions of the 7850 and price the 2GB 7850 around $170 (around $150 with rebates). Also, considering that the 650 Ti is now frequently seen around $110 AR, the $150 MSRP of the 7790 is definitely inflated. [citation][nom]Sakkura[/nom]That's launch pricing AND the lack of the rebates and such you'll inevitably see at retail. Plus you're manipulating the quote to suit your agenda; the article says $150, not 160.[/citation]
Actually, the article suggested that this non-reference design they were showcasing would sell for $160.
Update: Sapphire's card should sell for $160, $10 more than reference-class 7790s, according to company representatives
 
[citation][nom]Memnarchon[/nom]Nvidia, Titan and I, deny this.[/citation]

Whoa there. Stop that thought train. Look up a block diagram of the GK110 architecture and count how many double-precision ALUs there are. There are 64 in each SMX shader cluster, alongside the 192 single-precision cores. Good.

Now look at GK104. How many are there? Eight per cluster, which works out to 1/24 the single-precision rate versus GK110's 1/3 (not taking into account drivers and bus bandwidth). GK104 was never designed to be good at double-precision math, which isn't necessary for games and general consumer usage. GK110 was designed for environments where it would crunch through data sets that required the extra accuracy, which is why Nvidia supplied so many GK110-based Tesla units for the Titan supercomputer project.
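Those unit counts can be turned into a back-of-the-envelope throughput comparison. A minimal sketch, assuming the public spec figures for the GTX 680 (8 SMX at 1006 MHz) and the Titan (14 SMX at its 837 MHz base clock, which applies when full-rate FP64 is enabled), and counting two FLOPs per fused multiply-add:

```python
# Theoretical peak FP64 throughput in GFLOPS:
# SMX count x FP64 units per SMX x 2 FLOPs per FMA x clock in GHz.
def peak_fp64_gflops(smx_count, fp64_units_per_smx, clock_ghz):
    return smx_count * fp64_units_per_smx * 2 * clock_ghz

gtx680 = peak_fp64_gflops(8, 8, 1.006)    # GK104: 8 FP64 units per SMX
titan = peak_fp64_gflops(14, 64, 0.837)   # GK110: 64 FP64 units per SMX
print(f"GTX 680: ~{gtx680:.0f} GFLOPS, Titan: ~{titan:.0f} GFLOPS")
```

That's roughly 129 vs 1500 GFLOPS on paper, which is why the gap is far more than a simple doubling.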

GK110 was destined from the beginning for the Quadro and Tesla families, whereas its guise as the Titan is a way to sell off extra stock to consumers who can't buy a Tesla K20, but want better compute power than what the Quadro lineup can currently offer, even if it means using the unoptimized GeForce drivers. Nvidia purposely strangles double-precision throughput on its GeForce cards because the GTX 480 and GTX 580 were the preferable choice over their overpriced Quadro equivalents, and I'm sure they'd like to avoid a repeat of that.
 
Quote:
"When AMD launched its Radeon HD 7970 in December 2011, it appeared for a brief moment as though AMD was set for 2012. Brief, because there was more than just arrogance in NVIDIA's dismissal of AMD's new flagship GPU and the architecture that drives it. NVIDIA's "Kepler" GPU architecture was designed under the assumption that the HD 7970 would be much faster than it ended up being, so the company realized its second best chip, the GK104, had a fair shot against the HD 7900 series. The GK104 really was just a successor of the GF114 that drives the performance-segment GeForce GTX 560 Ti. What followed was a frantic attempt by NVIDIA to re-package the GK104 into a high-end product, the GeForce GTX 680, while shelving its best but expensive chip, the GK110 (which drives the GTX Titan we're reviewing today). The gambit paid off when the GTX 680 snatched the performance crown from the HD 7970 in March."
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/

Quote:
"So here's a small secret, initially roughly a year ago we expected the GK110 chip to be launching in the GeForce GTX 680, but the GK104 currently in use for GeForce GTX 680 was, simply put, just too good and yielded so much better."
http://www.guru3d.com/articles_pages/geforce_gtx_titan_review,1.html

Quote:
"Nvidia's Kepler architecture debuted a year ago with the GeForce GTX 680, which has sat somewhat comfortably as the market's top single-GPU graphics card, forcing AMD to reduce prices and launch a special HD 7970 GHz Edition card to help close the value gap. Despite besting its rival, many believe Nvidia had planned to make its 600 series flagship even faster by using the GK110 chip, but purposefully held back with the GK104 to save cash, since it was competitive enough performance-wise."
http://www.techspot.com/review/644-nvidia-geforce-titan/
 

I have one too; it just does not get much use. Even my gaming PC does not get much use nowadays.
 

He didn't say anything about the non-reference one.
 


660 vs 7850

Mostly because the 7850 never costs $200, and trust me, the performance isn't too far off.
I have a 570; it has roughly the same power as the 660. The 7850 clowns the 570 even with a hefty OC, beating it by up to 20% in 3DMark 11, and depending on the game (not TWIMTBP titles) it actually puts out more frames, though with less consistent frame times.

 
[citation][nom]johnsonjohnson[/nom]I thought the HD 7850 1GB is good value at $150 after rebate and 2GB at $180 after rebate.[/citation]

I thought so too. I got my 7850 at $164, and that price didn't move for months. Now the 1GB 7850s are $180 at Newegg. Wow.
 
Making value judgements based on MSRP ±$10 is pretty much meaningless. There are always deals to be had the day you decide to buy; it's just a matter of finding them. I paid just $140 for my eVGA SSC-version GTX 650 Ti (1071 MHz) because I waited. This new HD 7790 looks like a great card, but only when the price is right.
 


Nvidia would've needed to lose all measure of competence in their field to think that AMD would have made a chip to fight with GK110. AMD and ATI never competed against Nvidia's big chips because that never made sense, and it still wouldn't today. It's completely nonsensical, especially with how bad yields were on TSMC's 28nm process. Even Nvidia still has significant trouble making GK110s in high volume, well after the process should have matured enough for Nvidia to churn them out like they did with their previous big GPUs. AMD couldn't afford to deal with that then, they most certainly couldn't now, and this was all common knowledge.

I don't know about you, but I don't think that Nvidia could possibly be so incompetent as to think that AMD would bankrupt themselves trying to do something they've never done before nor had any reason to do, especially when it wouldn't make sense to try. The only way Tahiti could have been much more powerful, since AMD couldn't afford a much larger GPU and we all knew that, would be for Tahiti to be more powerful at a similar size to what it is now. If Tahiti could compete with GK110 despite its much smaller size, then Kepler clearly wasn't made to compete with AMD, unless Nvidia screwed up their so-called optimized-to-beat-AMD architecture.

AMD in no way released the Radeon 7970 GHz Edition to close any value gap. That was purely in dropping the 7970's price to compete.

You can post as many marketing BS links as you want to; it changes nothing. Even better is Nvidia's public dismissal of AMD's architecture, a link to which is found in your third link, where Nvidia claims that Kepler was designed to beat a far superior architecture than GCN because Nvidia thought that AMD would do much better with it. That raises the questions of why the Kepler lineup is so messed up (ridiculous memory bandwidth bottlenecks leaving some cards spaced too far apart, such as the 650 Ti and the 660, and others too close, such as the 670 and the 680), why GK110 wasn't given a GTX 6xx series GPU model number, why Kepler isn't even beating GCN on equal terms, why Nvidia nerfed the hell out of compute like you'd expect of an architecture designed to be cheap to make for gaming cards, why all other aspects of the cards are made cheaply, and, on a related note, why Nvidia artificially nerfed the hell out of overclocking despite the architecture clearly being capable of huge overclocks had Nvidia not used weak VRMs and cheaped out on components. I could go on and on. The objective evidence certainly doesn't help the marketing BS.
 
[citation][nom]Sakkura[/nom]He didn't say anything about the non-reference one.[/citation]

He didn't? I wasn't aware that the Sapphire HD 7790 Dual-X in this article was a reference design.
[citation][nom]ericjohn004[/nom]To me, if your going to spend even 160$ on a graphics card like the one in this article you might as well save up just a little more and go for a GTX 660 or 7870 or at the very least a 7850 for the same price. [/citation]

Now... who is it that is manipulating quotes?
 


That would be partially incorrect. PhysX is a (proprietary) realtime physics engine middleware SDK, but it is not one that runs only on GPUs. In fact, it originally ran only on PPUs, not GPUs, and it can also run on CPUs using the old x87 FPU instructions (often speculated to be an intentional hindrance, because modern CPUs, in theory, could be more than capable of handling PhysX in modern games if it ran on modern floating-point instructions). PhysX processing can be accelerated by GPUs/PPUs (granted, no PPU can handle modern workloads anymore, so the acceleration falls to Nvidia's GPUs nowadays). Most of this information is in your link, just an FYI.

It's also worth stating that PhysX is not the only physics processing implementation that allows for running physics processing on a GPU.
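To make the GPU angle concrete: the core of any such physics engine is a per-body update applied to thousands of independent objects every frame, which is exactly the data-parallel shape GPUs are built for. A toy sketch (not PhysX's actual API, just an illustration of the pattern) using semi-implicit Euler integration:

```python
import numpy as np

# Each row is one particle; the same update applies to every particle
# independently, so the work maps naturally onto GPU threads.
def step(pos, vel, dt, g=np.array([0.0, -9.81, 0.0])):
    vel = vel + g * dt    # integrate velocity first (semi-implicit Euler)
    pos = pos + vel * dt  # then advance positions with the new velocity
    return pos, vel

pos = np.zeros((1000, 3))  # 1000 particles at the origin
vel = np.zeros((1000, 3))
for _ in range(60):        # simulate one second at 60 steps/s
    pos, vel = step(pos, vel, 1.0 / 60.0)
```

After 60 steps each particle has fallen about 4.99 m, close to the analytic ½gt² ≈ 4.905 m; the slight overshoot is the integrator's bias, and a real engine would add collision detection and constraint solving on top of this loop.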
 

The article is about the 7790 as a whole.
 
[citation][nom]Sakkura[/nom]The article is about the 7790 as a whole.[/citation]

Granted, the article is about the 7790, but the card that was tested was the non-reference Sapphire HD 7790 Dual-X. Reading comprehension is really not your strength, is it?

AMD didn't ship out reference designs for this launch, instead opting to have partners send out their own boards.
 

So? The article is still about the 7790, not the Sapphire 7790.
 




Courtesy goes even further. You could have merely pointed out that Sakkura may have misinterpreted the post being replied to; instead, you assumed that your interpretation was correct and Sakkura's was wrong, and you mocked Sakkura throughout the whole argument. Your interpretation was probably the correct one, IMO, but that's not really the point by now, especially since we're not likely to know for sure unless ericjohn004 confirms which was correct.
 