AMD CPU speculation... and expert conjecture

Page 722
We don't really know the answer to that. Certainly, lessons from past projects will figure prominently. Whether Zen is from scratch, or derived from a previous arch, will be revealed at some future date.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
The flagship product is the K12 core, which is based on a "blank sheet of paper" according to Jim Keller. Zen is then derived from K12 via the Skybridge project. Of course, starting from scratch is not incompatible with reusing experience:

In our new generation, we need to take the DNA of both, the best of both. We know how to do high-frequency design, we know how to do dense design, and ARM gives us an efficiency advantage

http://www.vrworld.com/2014/05/05/amd-announces-new-amdextrous-strategy-skybridge-custom-64-bit-arm-cores/
http://www.slashgear.com/amd-chip-guru-jim-keller-arm-left-team-a-little-daunted-05327681/
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


Yeah, so you didn't read my post where I said the comparison between Q6600 and FX is pointless because FX supports newer instruction sets.

Intel always gets a free pass on this, yet AMD gets slaughtered. This thread is basically "lol, AMD can't run these 30-year-old instructions very well, they're terrible" followed by "well, Haswell's lack of IPC improvement is justified because it has more instructions than older Intel CPUs." Conroe, even Nehalem, doesn't support the same instructions FX does; they support far fewer. My point earlier was that even if IPC is similar, it doesn't matter, since the new CPUs support more instructions. I also mentioned that x86 has hit a wall and AMD and Intel need to increase performance by adding instructions. IPC comparisons against old products don't take that into consideration most of the time. You can see when they do, because the older parts land much further apart in the results; when the results are close, the parts are using the same instructions. And you notice how far apart they are for most of them?

I would appreciate it if some of you read my posts. I do a bit more than just parrot marketing slides that fit my agenda.

Also, lol at that Tweaktown review. I guess when Nvidia cuts you off from the cushy "we give you free products and marketing material if you play by our rules" deal, you get to see Nvidia products in a far weaker light. Fiji with HBM is going to humiliate Maxwell at high resolution, and Nvidia isn't even going to be viable for in-home VR.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Fantastic news! Thanks to using 8GB, the AMD card is better at a resolution used by zero percent of gamers, whereas the card is poor at the resolutions gamers actually use. I already suspected that when the 390X was released, the marketing department would emphasize how good the new card is at 4K or 8K.



My point is not about supporting more instructions or newer instructions. It is about supporting instructions from a scalable ISA, in contrast to only supporting instructions from an ISA that is not scalable. Whereas AMD is limited to 5-15% gains per generation by relying on x86-64 (AMD64), Intel can increase IPC by up to 70% (Ivy Bridge → Haswell) by supporting AVX2.
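To make the "scalable ISA" point concrete, here is a minimal C sketch contrasting a scalar loop with an AVX2 loop; it assumes an AVX2-capable CPU and a compiler flag like -mavx2 (GCC/Clang), and the function names are illustrative, not from any source in this thread:

```c
/* Same array addition twice: one element per iteration, then eight
 * 32-bit integers per AVX2 instruction. Requires AVX2 hardware and
 * a flag such as -mavx2. */
#include <immintrin.h>
#include <stddef.h>
#include <stdint.h>

void add_scalar(const int32_t *a, const int32_t *b, int32_t *out, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = a[i] + b[i];                      /* one add per iteration */
}

void add_avx2(const int32_t *a, const int32_t *b, int32_t *out, size_t n) {
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {                   /* 8 lanes per instruction */
        __m256i va = _mm256_loadu_si256((const __m256i *)(a + i));
        __m256i vb = _mm256_loadu_si256((const __m256i *)(b + i));
        _mm256_storeu_si256((__m256i *)(out + i), _mm256_add_epi32(va, vb));
    }
    for (; i < n; i++)                             /* scalar tail */
        out[i] = a[i] + b[i];
}
```

Each AVX2 iteration retires eight 32-bit adds in a single instruction; that width jump, rather than higher scalar IPC, is where per-generation gains of the size being claimed come from on vectorizable code.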

The x86-64 (AMD64) ISA invented by AMD is not scalable. AMD engineers did a short-term 'fix' of x86-32, but failed to develop something with a future. Intel tried to develop something for the future with IA-64 and failed miserably, but is trying again with AVX and other extensions that will be announced for the Skylake release.

The other point of the discussion was to mention how Intel can support the full AMD64, whereas AMD has problems supporting parts of it.
 


I bet most neanderthals would love your argument when applied to global warming and all that... Why look forward when the past and present are so darn good!

Juan, this argument is a big pile of warm manure. Plus, AMD has been pushing 4K (and now "8K") since the launch of the 7970 (IIRC). And why wouldn't they? In the 2000s, 1080p was being dismissed in favor of 1680x1050 and 1280x1024, remember?



You do know why AMD dropped support for "3D Now!", right?

Cheers!
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


No problem with pushing higher resolutions. The problem is when you advertise new cards at resolutions used by 0.03% of gamers, or at resolutions that nobody uses, because the new cards are not competitive enough at the mainstream resolutions gamers actually use.

There are some analysts predicting that the 300 series won't be the game changer that AMD needs to turn around its finances, because 4K gaming represents a niche market inside a niche market, whereas Nvidia will continue to serve the needs of the remaining 99.97% of gamers well:

http://www.fool.com/investing/general/2015/02/02/will-advanced-micro-devices-amd-bet-on-super-fast.aspx

3D Now! is an extension to x86. You can support it or not; it is optional. But if you claim to support AMD64, then you have to support everything that was included in the ISA by AMD engineers. You cannot implement some instructions and reject others depending on how smart your engineers are or how popular the software is. Either you build an x86-64 processor or you don't.
 

jdwii

Splendid


De5, now come on, we both know that is not the argument. Juan simply stated 8GB would beat 4GB at 8K, a res no one uses. It's not like either card had playable performance anyway.

Let's not pretend to know what fallacious arguments are.
 


Every new technology has something called an "adoption rate". I'm not saying 4K is like the second coming of Jesus or anything, but you do have to understand that "yeah, this is new, but no one is using it!" is a *very* bad argument when discussing the future.

If AMD has an edge on a given future technology, give them that at least. You're just taking the cake away 'cause you're angry at AMD for something.

In any case, "3D Now!" is an extension... Well, sorry about being Wikipedia, but...

"Software written to use AMD's 3DNow instead of the slower x87 FPU could execute up to 4x faster, depending on the instruction-mix."

Why support older stuff when the new stuff you support offers such gains? Same went for MMX and SSE1. Point is, AMD was *forced* to remove it when they noticed how few programs were using it, compared to those using older x87 or SSE and MMX. AMD never wanted to increase their x87 performance, and I bet neither did Intel. In the context of Skyrim, there has to be something else at play. Maybe compiler flags?
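For context on that "up to 4x" figure: 3DNow! packed two single-precision floats into one register and operated on both at once, versus one at a time through x87. Since 3DNow! intrinsics are long gone from current compilers, the sketch below uses SSE (four floats per op) as a stand-in for the same packed-vs-scalar idea; a rough illustration, not the original 3DNow! code, and the function names are mine:

```c
/* Packed-SIMD vs scalar float math. SSE does 4 floats per operation;
 * 3DNow! did 2. Buildable with any x86 compiler that has SSE. */
#include <xmmintrin.h>
#include <stddef.h>

/* one float at a time -- roughly what x87 code paths did */
void scale_scalar(float *v, float k, size_t n) {
    for (size_t i = 0; i < n; i++)
        v[i] *= k;
}

/* four floats per multiply -- the packed idea 3DNow! brought to x86 FP */
void scale_sse(float *v, float k, size_t n) {
    __m128 vk = _mm_set1_ps(k);                    /* broadcast k to 4 lanes */
    size_t i = 0;
    for (; i + 4 <= n; i += 4)
        _mm_storeu_ps(v + i, _mm_mul_ps(_mm_loadu_ps(v + i), vk));
    for (; i < n; i++)                             /* scalar tail */
        v[i] *= k;
}
```

Once SSE/SSE2 offered the same idea with wider registers and a richer instruction set, compilers and games stopped targeting 3DNow!, which is the abandonment being described above.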



When 1080p was being introduced, I remember the 9800PRO and GF5900FX having crappy performance at it. That lasted until nVidia's 8000 generation and ATI/AMD's HD 4000 series were introduced; those had acceptable performance at it.

Again, the point is: it's dumb to even say that, because this has always been the case with technology (VGAs in particular). New resolution -> current gen barely manages -> companies start optimizing -> decent VGAs come along and we're all happy.

Let me rephrase it: AMD tries hard to move forward, whereas nVidia is happy with 1080p because it is outselling AMD hard. You need to put the shenanigans of each company into context here. You think physics was hip and cool before nVidia bought PhysX? You think 3DFX imagined 32-bit color depth was important when the GF2 had HW support for it? AMD is just doing what everyone does when they see an edge: hype it. At least when nVidia pushed for 32 bits, the industry liked it and people accepted it quickly. If you ask me about resolutions, I really don't know, but at least it's a steady movement towards higher res all the time, and history proves me right.

In any case, you're right, 4K today is unplayable with single-card solutions unless low detail or something else is traded away. But we're discussing trends and the future, right? That's why the 390X will be important. If it enables decent 4K gaming, you'll be ready to make the switch. Hell, I love bigger screens not only because of games, but because of the screen real estate.

Cheers!

EDIT: Typos & I forgot the link :p http://en.wikipedia.org/wiki/3DNow!
 

that's only the first part of the straw man argument. the full straw man argument is: "Thanks to using 8GB, the AMD card is better at a resolution used by zero percent of gamers, whereas the card is poor at the resolutions gamers actually use. I already suspected that when the 390X was released, the marketing department would emphasize how good the new card is at 4K or 8K."
does the r9 290x perform poorly at 1080p? no, and with newer driver updates, it'll likely get better. its performance at 4K doesn't look bad either.
what is "resolutions used by gamers"? that's a stupid blanket statement. it can be from 768p to 1600p and 4K or various multimonitor high res. display configurations.
oh and here's the red herring: "I already suspected that when the 390X was released, the marketing department would emphasize how good the new card is at 4K or 8K."

gfx cards like the r9 290x/290x 8GB/390x are flagships. they're always aimed at hyping good performance at the highest resolutions advertisable. their target demographic is usually high-paying users with bleeding-edge performance in mind, and these people can actually run multiple displays or.. single hi-res ones. if you look at the tweaktown link you'll see exactly how they're deriving their "8K performance" numbers. it is quite possible for current pc users to do the same. not to mention benching just for the sake of curiosity.
 

jdwii

Splendid
^ What's most important is making a card that performs the best at the resolutions people actually play. It's like making a CPU that performs OK in some things and well in others, and then pricing it like a CPU that does great in everything.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


It is nice to support the future needs of gamers, but that was not the point. My point is being ignored again. My point is that AMD is advertising its support for the future needs of gamers because it fails to support the needs of gamers today.

You cannot say that my argument is "*very* bad" unless you don't understand it. AMD needs money to develop future products, and that money is obtained by selling products today. And you sell products today when you satisfy the needs of your customers today. The above article concludes that the 300 series will not significantly change AMD's finances, because the new cards will target the needs of 0.03% of gamers while ignoring the rest.

AMD is pushing the same strategy on gaming cards that it pushed on CPUs and APUs before: support niche or nonexistent markets and ignore the mainstream. Check the numbers, please, and tell me how well that strategy is working. FYI, AMD is losing market share in CPUs, APUs, and now GPUs.

About 3D Now!, well, I already explained to you the difference between supporting something optional and supporting something obligatory.
 


It's your position that APUs are not mainstream? I would love to hear you defend that position. Mainstream is about all they are, and they are very competitive in their target markets. The fact that Intel continues to dominate has more to do with other considerations than with the competitiveness of the APU chips.

 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Raw performance: at 1080p the 290X is about 20% slower than the 980; at 4K it is about 17% slower. The gap is much bigger when looking at performance per watt:

https://www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_980_G1_Gaming/27.html

The AMD card performs poorly at the resolutions used by most gamers. Releasing an 8GB version of the 290X and comparing it with Nvidia's 4GB version at 8K, a resolution used by 0.00% of gamers, is not unfair; it is ridiculous.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Exactly.



Indeed! AMD is copying into the GPU division the same failed strategy used in the CPU division: selling hype.

Can AMD make a comeback?
The investment thesis for AMD during the past few years seems to be something along the lines of "AMD isn't doing well right now, but when it launches so-and-so product, it will blow Intel/NVIDIA away." This was the promise of AMD's APUs, which combined its CPUs with graphics that are more powerful than Intel's integrated graphics solutions. This has been the promise of every new graphics card that AMD has launched in the past few years, and the chart above shows that AMD has been losing ground, not gaining it.

The only thing keeping AMD afloat today, besides the hope that the future will be brighter, is its deal to supply the game consoles with SoCs, a lucrative endeavor, although it masks the weakness in every other part of AMD's business. AMD is still losing market share in the CPU market to Intel, despite Intel already having a near-monopoly market share, and it continues to lose market share to NVIDIA in the GPU market.

Despite rumors about AMD's upcoming graphics cards, which will supposedly blow NVIDIA out of the water, there's no reason to believe that these new cards will be any different than the last new cards AMD released. At this point, NVIDIA has a technical advantage, able to spend more on R&D, and push both performance and efficiency to new levels. All AMD has is hype, and that's just not enough.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


No. My position is that AMD has pushed APUs to niche or nonexistent markets instead of pushing them to mainstream markets.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860
Gotta love reviews that pit a factory-OC'd Nvidia card against a reference AMD card and then claim to make a valid argument.

What resolution would you use with a $500 video card? 640x320? I don't think gamers are buying that card to run minimalist specs.

[chart: performance per dollar at 1920x1080 (perfdollar_1920.gif)]
 


Still running a Q6600 here on my old Asus board ... creaking along ... works fine. Nothing to write home to mum about tho ... :)
 

truegenius

Distinguished
BANNED


re-read the review you posted here,
you will see that even the 290x has maxed out every game at 1080p (constant 60fps)
so what do we do now, just improve efficiency and keep 1080p as the gamer's choice forever?
no, indeed, push it to the next level, that is 4k (8k using vsr/dsr)
so what amd is doing is progressing, trying to jump to the next level, while what nvidia is doing is 300fps on a 1080p 60hz monitor
and amd is not alone, all display manufacturers are pushing higher resolutions, even on mobiles, where we have reached the limit of the eye and battery matters more than the display
even current 1080p owners can do 4k on their 1080p monitor using vsr/dsr
and the 290x wasn't meant to compete with the 980, but it is still competing

btw i posted that 8k* review link to see what you would say about it, because you want us to game on an apu
now it looks like you want us to stick to 1080p so that your apu looks good and can kill dgpus, because dgpus would sit idle at 1080p
 

jdwii

Splendid

For a min i thought that was a performance chart, but nope, just a performance-per-dollar one. Like that's ALL that matters when one card uses what, 30-40% more power? I love having all the fans in my PC turn OFF at idle (edit: and even during light gaming).
 

check the perf/price charts on the link juan provided. he debunked his own claims.

are you lying by omission? why didn't you link to the perf/$$ chart where it shows the r9 290x's rather significant advantage? people don't get these cards for free. despite being the older gen flagship, the r9 290x's perf/price advantage cannot be overlooked. i skimmed through the article and noticed that the r9 290x actually outperforms (in terms of avg fps) the reference gtx 980 (the gigabyte card was oc'ed) in several benchmarks. when the difference between the 290x and 980 was significant, both cards were already delivering good, playable performance, including at the display resolutions in your stupid blanket statement of "resolutions gamers use". your claim of "performs poorly" is debunked by your own source. if you hadn't seemingly failed to check the actual benches, you woulda noticed how the final tally was being skewed.. although that's not my concern. :LOL:

but we're getting sidetracked from further proving that your argument is a straw man. here it goes:
you are attacking user preference instead of arguing the data. in this case, user preference and card performance are, and can be, mutually exclusive.
in TT's tests, the gtx 980 sli wasn't too bad though. makes me wonder how it woulda done if nvidia and partners had released an 8GB version. according to tweaktown's test, the gtx 980 combo did pretty well in the bf4 bench. and from your tpu link, the single r9 290x 4GB does quite well against the gtx 980 4GB at 4K, outperforming it in some games, and pretty well beating it in world of warcraft :LOL:.
when 4k goes mainstream, we'll have updated and optimized hardware and software platforms e.g. higher bandwidth ports, dx12 and compliant games and gfx cards, better 4K monitors and so on. tweaktown's testing was to show how those cards perform now when they're pushed to rendering 8K. they didn't even try to hide that. it has nothing to do with your so-claimed gamers' preferences. TT's testing can be called ridiculous, but in a good way, unlike the way you're insinuating.
in addition to that, the article says that TT might update their test benches and bench again. they'll likely bench again when/if 8GB gtx 980s come out. you're getting your panties in a twist for no reason.

edit: fixed wording.
 

jdwii

Splendid


Again, it's the same argument: why buy something like a 290X or 980 for 4K and expect it to work? You need SLI or CrossFire, and a 290X uses 300 watts vs 190 for a 980.

https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/perfrel_3840.gif

https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/perfrel_1920.gif

Also, a 970 is just about equal to it at 4K, a res only 1% of gamers play at; at 1080P it's within 5% or so. Overall, about the same.

Here is power consumption. As stated before, neither card alone is good enough for 4K; you would need two:
https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/power_peak.gif

here is a comparison of SLI vs CrossFire; some games are faster with 290X CrossFire, others are faster with SLI (although the 290X looks like the overall winner in performance):
http://www.guru3d.com/articles_pages/geforce_gtx_970_sli_review,21.html

Now again, keep in mind the power consumption of the systems:
https://tpucdn.com/reviews/MSI/GTX_970_Gaming/images/power_peak.gif
970 = 181 watts
290X = 294 watts
You would want 200 more watts for the 290X CrossFire setup. A 750 watt unit would handle the 970 SLI setup fine, but the 290X setup would need a 950 watt unit (rough sizing sketch below). Factor that in: would you need to buy a new power supply just to do it? Myself, i would never do either; i would rather game at 1440P like most here, i think; less of a hassle. HECK, take a look at this, guys:
http://store.steampowered.com/hwsurvey

A lot are still on 720P (laptops?).
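As a back-of-the-envelope check on those PSU classes, here is a minimal C sketch. The 181 W and 294 W card figures are the TechPowerUp peaks quoted above; the ~250 W platform budget and ~25% headroom are assumptions of mine, not numbers from the post:

```c
/* Rough PSU sizing: two cards plus an assumed platform budget, with
 * headroom on top. Card figures are the TechPowerUp peaks quoted
 * above; platform draw and headroom are assumptions. */
#include <stdio.h>

int main(void) {
    const double platform_w = 250.0;  /* CPU, board, drives, fans (assumed) */
    const double headroom   = 1.25;   /* ~25% safety margin (assumed) */

    double sli_peak = 2 * 181.0 + platform_w;   /* 970 SLI:  ~612 W */
    double cf_peak  = 2 * 294.0 + platform_w;   /* 290X CF:  ~838 W */

    printf("970 SLI:        ~%.0f W peak -> ~%.0f W PSU\n",
           sli_peak, sli_peak * headroom);      /* ~765 W, the 750 W class */
    printf("290X CrossFire: ~%.0f W peak -> ~%.0f W PSU\n",
           cf_peak, cf_peak * headroom);        /* ~1048 W, 950 W and up */
    return 0;
}
```

Under these assumptions, the ~200 W card-to-card delta is exactly what pushes the CrossFire build into the next PSU class, which is the point being made above.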
 

truegenius

Distinguished
BANNED


if 1080p is enough, then an hd7950 + i5-2300 is enough; its stated tdp is 200w but it will consume <150w at max (thus saving on the initial cost and the electricity bill too)
or just buy a ps4
something to think about: if you are using any AA, then you are not satisfied with your current resolution/pixel density and you want more resolution/density, or vsr/dsr

if we consider 4 hours of gaming a day, then we will use 292 units (kwh) extra over 365 days (290x crossfire vs 980 sli at max power, a ~200w difference). that means Rs1460 extra per year @ Rs5/kwh (INR; Rs1460 equals ~24 US$, or around $50 if i take $0.17 per unit), so we would need to game for several years before the extra electricity cost amounts to anything next to the cards' cost (quick math below)
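a quick check of those figures; a minimal C sketch using only the numbers from the post above (200 W delta, 4 h/day, Rs 5/kWh and $0.17/kWh):

```c
/* sanity check of the post's figures: 200 W delta between the two
 * dual-card setups, 4 hours of gaming per day, two tariffs. */
#include <stdio.h>

int main(void) {
    double extra_kwh = 200.0 * 4.0 * 365.0 / 1000.0;  /* = 292 kWh/year */
    printf("extra energy: %.0f kWh/year\n", extra_kwh);
    printf("at Rs 5/kWh:  Rs %.0f/year (~$24)\n", extra_kwh * 5.0);  /* 1460 */
    printf("at $0.17/kWh: $%.2f/year\n", extra_kwh * 0.17);          /* 49.64 */
    return 0;
}
```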

btw the point was not whether nvidia is more efficient or not; the point was that our top single-gpu cards have maxed out 1080p, so either we stop producing powerful cards and stick to 1080p, or we catch up with (and get ahead of) tv and mobile displays and get 4k-8k* (since we have the most powerful hardware)
the point was also to see what juan would say about the resolution increase and how his apu will handle it

and every new thing takes time for adoption, and currently 4k monitors are very expensive, so it will take time
even windows 8 took time to surpass ages-old xp, and 8 doesn't cost as much as a 4k monitor. and since we are not losing any support for our 720p/1080p displays, we will see a quick adoption rate
 

jdwii

Splendid
PS4 is a joke, a lot of games can't even run at 1080P, and sorry, but that's the res i like to play at. Heck, i'm using a 27 inch monitor at 1080P and the text is small enough. I'm not even talking about the money for the watts, more the fact that all the fans in my case turn off and it's silent, something i love to death. I hope Amd can make even more efficient stuff than Nvidia or Intel; well, i hope so in a few years. i think i'm done buying things for a while, at least for my PC.

Again, i will say one thing: for 4K gaming, Amd is the way to go on price/performance, and even juan doesn't disagree at the moment. It's just that it is an extremely small market, guys! I'm not in denial about that, but i guess i'm more into quiet machines at the moment; the brute-force method makes me feel like they offer sloppy engineering.

Edit: i miss the old Amd from 2000-2005, when they were more efficient and faster! That is Intel today.

For old brothers' sake:
https://www.youtube.com/watch?v=EIPggCgYK38&list=LLjWqY6p95Cy3xLaZY1vAIRg&index=38
 