AMD CPU speculation... and expert conjecture



I think the sad thing is though, *even if the hype is true* AMD won't get out of it what they should...

Nvidia (like Apple) are brilliant at *marketing* their products. I mean, had AMD released the 970 / 980 under Radeon branding, I can guarantee the coverage would have been much less positive.

I like Nvidia cards (I've owned a few), and I also like AMD kit (I recently moved up from a GTX 560 to an R9 280, as it was on offer for a stupidly low price plus 3 free games). I have no special allegiance.

However, from where I'm sitting, the 970 and 980 aren't the 'home run' they're being lauded as. Sure, they're very fast at 1080p, but that lead quickly erodes as the resolution goes up, and to be honest, if gaming at 1920 x 1080 is your goal, then anything from a GTX 660 / R7 270 upwards has you covered. For a 'next gen' pair of cards, the performance gains we're talking about aren't enough to justify upgrading if you already own an equivalent high-end card from the last generation (on either side), imo.

They are very efficient, which bodes well for a future 'big Maxwell' chip, though there's no telling if and when that will turn up, and when it does it will be a workstation part for at least 6 months.
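To put rough numbers on the resolution point, here's a back-of-the-envelope sketch (pure pixel-count arithmetic, not a benchmark of any particular card) of how much more work each resolution demands per frame:

```python
# Pixel-throughput arithmetic: relative rendering load per frame at
# common resolutions, versus 1920x1080. Shader and bandwidth costs
# don't scale perfectly with pixel count, so treat this as a floor.
resolutions = {
    "1920x1080": (1920, 1080),
    "2560x1440": (2560, 1440),
    "3840x2160": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP/frame, "
          f"{pixels * 60 / 1e6:.0f} MP/s at 60 fps, "
          f"{pixels / base:.2f}x the pixels of 1080p")
```

1440p works out to ~1.78x the pixels of 1080p and 4K to exactly 4x, which is why a lead that shows up at 1080p can evaporate once fill rate and memory bandwidth become the bottleneck.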

The press (and general 'wisdom' on the internet) currently dictates that, all things being equal, Nvidia > AMD, and I really can't see why (from first-hand experience with both vendors). If AMD release a new generation of card that is faster, more power efficient and quieter than Nvidia's best, you know the coverage they'll get? "A good card and a BIG improvement from last gen, but needs better drivers, so wait for Nvidia's next card"...
 

szatkus



So the R9 3xx isn't GCN 1.2 (like Tonga)? Then why were they working on 1.2?

I remember how Jaguar+ suddenly changed into Puma after the Bay Trail release :D
 

cemerian

AMD will never release a part that beats Nvidia by 2 or 3 generations, just as Nvidia will never release a part that beats AMD by 2 or 3 generations. They will simply hold back the high-end part and rebrand a mid-range part as the high-end one until the other side has something competitive. Nvidia already did that with Kepler: the GTX 670 and 680 were never originally planned as high-end cards, but because GK110 was far ahead of anything AMD had at the time (also the costs, but mostly the lack of competition), they held it back for the 7xx series. This has happened before and it will always be thus; neither of the two wants to crush the other. Therefore, even if GCN 2.0, as you put it, were as fast as you claim, it would not be released to the public until Nvidia had something that could compete.
 

8350rocks

They will this time, because they have something up their sleeve that I doubt any of you could honestly expect. Nor Nvidia, for that matter. Nvidia will be unable to compete for reasons outside of either company's control.

@cemerian: AMD wants to be the top seller of dGPUs. If they could produce a card capable of 50 TFLOPS and sell it tomorrow, they would... even if Nvidia would be noncompetitive for 10 years trying to catch up. Seriously...
 

con635

I did read a rumor, can't remember where (it was a link on here, iirc), that said Nvidia has licensed the use of HBM from AMD due to delays with HMC, and that AMD had 1 year of HBM all to themselves. Could there be any truth in that?
Have to agree with cdrkf.
 

szatkus


I expect it. There's nothing surprising about a redesign being much better than the competition.


That's not how things work. Making a much better product is usually much more expensive (materials, R&D, engineering etc.), which is why companies usually target just above the competition.
 


I doubt they'd be doing that again. Remember the lawsuit against nVidia and ATI?

If a company finds a way to make a quantum leap over the competition, they'll use it. Just look at Intel over AMD: after they got Conroe out, the jump was huge, and AMD took 1 or 2 years to make a decent comeback with the Phenom IIs. I don't have other examples in memory, but I'm sure there are more.

Cheers!
 


Whilst I agree to an extent, I think many incorrectly interpreted the delay to the 780 (GK110) as 'Nvidia waiting for AMD to catch up'... The thing is, GK110 is one of the biggest GPUs ever built. It's hugely complex and very expensive to produce, and it's quite likely that yields of that chip were terrible when it first launched. So the chip was first used in the high-end workstation / HPC market, where it could recoup as much money as possible.

As yields improve, they can start releasing it more cheaply as a gaming chip. I really don't think Nvidia had any GK110 chips available back when the 680 launched. I think the R9 290 / 290X prompted them to release the full version of the SKU rather than relying on the slightly cut-down one, but that's really the only change Nvidia made to their plans because of AMD.

The same is true this generation: if they are going to release big Maxwell, it will be down the road, and only after the chip is already available as a PRO part. I'm not even certain they'll release it on 28nm at all, though I guess with the efficiency of the 980 there's the thermal headroom to do so.
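On the yield point, here's a toy sketch using the classic Poisson yield model, yield ≈ e^(−D·A). The defect density below is invented for illustration, and the die areas are the commonly quoted figures for GK104 (~294 mm²) and GK110 (~561 mm²):

```python
import math

# Toy Poisson yield model: yield = exp(-D * A), with D the defect
# density (defects per cm^2) and A the die area (cm^2). Real fabs use
# more elaborate models, but the die-size trend is the same.
def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

D = 0.5  # hypothetical defect density for an early 28nm process
for name, area_mm2 in [("GK104, ~294 mm^2", 294), ("GK110, ~561 mm^2", 561)]:
    print(f"{name}: ~{poisson_yield(area_mm2, D):.0%} good dies at D = {D}/cm^2")
```

At those assumed numbers the big die's yield fraction is roughly a quarter of the small die's, on top of fitting fewer candidates per wafer, which is exactly why it debuts in high-margin workstation / HPC parts first.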
 

jdwii



Well, we both know engineers love their own work like a parent raising their own kid.

AMD has a poor, poor history of overestimating its own products. Pretty sure you're talking about 20nm and stacked memory for the GPUs? I can see that putting AMD far ahead of Nvidia, but it's almost an unfair advantage.
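A rough sketch of the bandwidth math behind the stacked-memory excitement, using the first-gen HBM figures being reported at the time (a 1024-bit interface per stack at ~1 Gbps effective) against the 290X's 512-bit GDDR5 at 5 Gbps; treat these as reported figures, not confirmed specs:

```python
# Back-of-the-envelope memory bandwidth: bus width in bits / 8, times
# effective data rate per pin in Gbit/s, gives GB/s.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

gddr5_290x = bandwidth_gb_s(512, 5.0)   # R9 290X: 512-bit @ 5 Gbps
hbm_stack = bandwidth_gb_s(1024, 1.0)   # one HBM1 stack: 1024-bit @ 1 Gbps

print(f"290X GDDR5:   {gddr5_290x:.0f} GB/s")
print(f"4 HBM stacks: {4 * hbm_stack:.0f} GB/s")
```

That's ~512 GB/s versus ~320 GB/s, which is why exclusive access to it for even a year would read as an almost unfair advantage.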
 

jdwii



The 8800 GTX called, and it wants you to read the reviews from that time.
 

jdwii



I owned both too, and I just prefer Nvidia right now: adaptive V-sync and forcing FXAA (just tried it with Far Cry 3, love it).

However, I agree with you 100%. AMD never gets enough attention from average users. I love reading articles on architectures, and Nvidia keeps tweaking their design while AMD just adds more cores; it's kinda sad to me, but I guess it doesn't matter as long as they're still competitive.
I rate the 970 highly since it barely uses any power, and you just know the Ti should be way more impressive in terms of pure performance. Not to mention that for the first time (at least since 2008, when I got into this stuff) Nvidia was cheaper.
 

szatkus



I didn't mean the price of the end product. When you design a chip you have to have a clear target. Tell your engineers that you want an architecture 40% faster than the last generation, and they will deliver it in, let's say, 3 years. Tell them that you want a 100% faster architecture, and the time can stretch to 5 years or so. If the product comes out better than planned, it's usually pure luck.
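For what it's worth, the compound arithmetic on those hypothetical targets looks like this:

```python
# Compound annual improvement implied by each hypothetical target:
# annual_rate = total_speedup ** (1 / years) - 1
for speedup, years in [(1.4, 3), (2.0, 5)]:
    rate = speedup ** (1 / years) - 1
    print(f"{speedup:.1f}x in {years} years -> ~{rate:.1%} per year")
```

The 100% target still implies a faster annual pace (~15% vs ~12% per year) even with the stretched schedule, which is the point: more ambitious targets cost disproportionately more time.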
 


I'd say most of the time (if not 100% of it), the final price reflects all that R&D plus margin. And while I do agree that most of the great discoveries that enabled quantum leaps came out of luck (just ask Pfizer :p), in the world of tech that isn't the norm. I don't think CEOs have to tell their engies "I want 100% better computing power in X time". I'm sure they all know how to get to port, but they also know the roadblocks ahead. In Intel's case in particular, they developed Conroe in X time frame and it was a surprise to the world. I don't think those engies were ordered to come up with a breakthrough, TBH; it was just a polished step up of something already tried, and part of Intel's R&D roadmap at the time.

Point is, when you're talking about "leapfrogging" in tech, it's not something companies are unwilling to do. They will when they can; whether or not it was intended or researched for is irrelevant. Having a "leapfrogging" product is a cash cow, plain and simple.

Cheers!
 

szatkus



That's exactly how projects are started (well, almost: it's not the CEO, but marketing and managers). It's called "project requirements": make it X% faster and Y% more energy efficient in fewer than Z billion transistors, or the project gets delayed and you don't get your bonus. They can't just say "do something better", because you can work on something better forever :)
 

That's not how it works. The development of any computing technology happens years and years in advance, and companies have many, many projects in flight. A lot of it is done in academia years before it ends up in a consumer chip. The companies all invest in different techniques and technologies; the most promising projects are then tested further, and the pros and cons are weighed in the designs for different chips. They are tested against the available tech and then analyzed financially. Any new chip will have been in development for years before the tech to even make it becomes possible.
 

jdwii



I have to say that isn't how it's done, if I understand what you were trying to say. They first come up with the best design they can and then start manufacturing it; if AMD wanted 4 times more performance in 2017, that wouldn't mean they could do it. A company makes the best design it can in a given time frame; the performance can't be known until the final stages, only best-guessed.

Things also change during a project. I think Bulldozer was being designed back in 2005, and the CPU engineers knew it wasn't a good idea; not sure why they brought it back.

Edit: nicely said, esrever.
 

jdwii

It's been one year, and the power of the PS4 / Xbox One has already been pushed to its limits:
http://www.gamespot.com/articles/assassins-creed-unity-uses-every-ounce-of-power-fr/1100-6423486/

I remember when the 360 came out; things were different.
 

szatkus



Where do you see a contradiction between what I wrote and what you wrote?
 

con635


The new consoles are just PC tech, though, so you would think they'd be maxed out faster. In that article it says they've gone from 200 on-screen AI characters last gen to 5000-10000, which doesn't exactly sound like a 'poor showing'. Ubi have had a bit of bad press in recent weeks, but I've been looking forward to the new AC ever since the PC specs came out.
 

szatkus



They can't just guess performance; they need to know it, as early as possible.
 

jdwii



Tell your engineers that you want an architecture 40% faster than the last generation, and they will deliver it in, let's say, 3 years.

 

jdwii



This is simply untrue, and you have no way of knowing something like this beforehand except in some rare cases; only best guesses. Bulldozer, for example, was supposed to be much more than it turned out to be, and of course that didn't happen; the same goes for AMD's Phenom with its "true" quad-core design.
 

szatkus



They have simulations.

And about Bulldozer... http://semiaccurate.com/2011/10/17/bulldozer-doesnt-have-just-a-single-problem/
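For anyone wondering what "they have simulations" means in practice: long before silicon exists, architects run performance models against design parameters. Here's a heavily simplified sketch of the idea (all numbers invented for illustration; real tools are cycle-accurate simulators and RTL emulation):

```python
# Toy pre-silicon performance estimate: runtime = instructions / (IPC * clock).
# Real architectural simulators model caches, branch predictors and
# memory latency in detail; this only illustrates predicting relative
# performance from design parameters before a chip exists.
def runtime_s(instructions: float, ipc: float, clock_ghz: float) -> float:
    return instructions / (ipc * clock_ghz * 1e9)

workload = 50e9  # hypothetical instruction count for a benchmark trace

current_gen = runtime_s(workload, ipc=1.2, clock_ghz=3.5)
next_gen = runtime_s(workload, ipc=1.7, clock_ghz=3.8)
print(f"projected speedup: {current_gen / next_gen:.2f}x")
```

The catch, as the Bulldozer link shows, is that the model is only as good as its assumptions about IPC and clocks, which is where the best guesses come in.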
 

szatkus



Not a contradiction. Esrever just described what happens later :)
 