AMD Clarifies Why It Uses Intel Core i7 In Its Project Quantum Gaming PC


Usually about a year behind everyone else on performance, with some of the worst gaming drivers on the market, right alongside Matrox. ATI could be nightmarish too, especially in the early Direct3D days when some games used palette-based textures but only specific driver versions supported them.
 
Having said that, am I the only one NOT impressed by the size of this supposedly SFF box? Look at it in his hands. It is rather large on its own, and then their external unit has to be figured in too. Doesn't it look about as wide as he is, and not exactly short to begin with?

I can build my own system and probably take up less space, or at least save a chunk of money, I'd guess. I thought SFF was supposed to take up a SMALL SPACE. Why would you want all that wasted space inside it, plus an external unit too? Storing this anywhere makes it a PITA IMHO. Not quite sure why they wasted R&D on this; it wouldn't be a massive money maker no matter how you slice it. Let someone else figure out how to put two Furys in a box with water. Get back to concentrating on CPU/GPU/DRIVERS, period. No more APUs (thankfully the APU stuff that can't make a dime got delayed so Zen could be pulled up; they should've listened to Dirk Meyer ages ago and stuck to CORE PRODUCTS), no more stuff like this or consoles, just PURE CPU (with high IPC and plenty of die space to hurt Intel with) and pure GPU/drivers.

Well, you could certainly build one, and maybe take up less space than Quantum. But the aesthetics, and more importantly all those watercooled parts inside (with maybe the added advantage of overclocking them), really put that space to use. I mean, it's not too big, not too small; it's a good size to catch the eye even if it sits in the corner of a room. Having said that, AMD did make use of its R&D. And as we know today, chip companies like Nvidia and Intel aren't sitting on only CPUs, GPUs, and drivers. They're making SHIELD, NUCs, and what not. They are expanding their portfolios and it's benefiting them a lot. It's pleasant to see AMD trying to reimagine a living room + gaming SFF design, even if it isn't exactly small, and trying to expand its own portfolio. And it packs all that HPC muscle into a ready-made product, which really impresses me.

True, AMD needs solid footing on support, GPU, and CPU architecture, but that's not going to happen overnight. I presume AMD knows the ground is slipping under its feet and is taking initiatives quietly, away from the crowd and away from rivals.
 
Kitguru was banned from the Fury X launch for comments about AMD products that AMD perceived as negative.

http://www.kitguru.net/site-news/announcements/zardon/amd-withdraw-kitguru-fury-x-sample-over-negative-content/

Guess AMD is just like its fanboys and can't stand the truth about AMD products.
 
For all you younger folk, Radeon and AMD were separate companies. AMD acquired Radeon, if I recall correctly. Intel really never came out with any video cards to talk about gaming-wise. It's not easy. I own stuff from both, so no bias here.

You do not recall correctly, old man. AMD acquired ATI.
 
Even though the GPU is more important for gaming, AMD's CPUs can't keep up or fit the thermal requirements. It's not news that you can use an Intel CPU and AMD GPU, so this is just pure embarrassment for the company.

Excavator is a joke, Zen can't come fast enough.

Not sure why you got downvoted so much, nothing you said there is inaccurate. Maybe a bit mean-spirited. And I say that as a 9590 owner.
 
It's not them being honest or them admitting defeat.
It's a clever business decision. They want to sell high-margin GPUs, two of them at a time in this case. If that means pairing them with a range of CPU options to appeal to every prospective purchaser, then so be it.
AMD are in the business of making money for their shareholders.
 
Hope AMD have plans for full-fat Zen cores using HBM2 and skip DDR4 for CPUs and APUs to save more space, like Fury aptly demonstrates!

HBM is no replacement for DDR unless AMD or someone else makes it into an open standard. Right now, HBM is possible only inside an APU, but if you put in, say, 2GB of it and ditch conventional RAM, you have an expensive APU, a weak GPU with too much graphics buffer, and a system with too little RAM.

But an APU with 1GB of HBM could destroy so many lower-mainstream GPUs :)

I feel like that sort of defeats the purpose of the APU though... the GPU having its own memory means that it's almost literally a dGPU thrown on the same package as a CPU. Unless I misunderstand something. I suppose that it's still different in the sense that the GPU still has direct access to main system memory, so that 1GB HBM would basically act as a huge cache to the iGPU.
 
Well, there might be a background deal with Intel: AMD uses high-end Intel CPUs in some of its products, and Intel uses AMD's new GPUs in some of its products. So both companies get something out of it.
 


Yeah, I think that would unveil the era of "Fast-n-Furious" embedded on-die memory in the PC market. 😀
 
"It's not them being honest or them admitting defeat.
It's a clever business decision. They want to sell high margin GPU's, 2 of them at a time in this case. If that means pairing it with a range of CPU options to appeal to every prospective purchaser then so be it.
AMD are in the business of making money for their share holders."

Yes, that would work, except that AMD (as a whole) depends on CPU and GPU sales, so if the CPU division goes off a cliff, it affects the whole company. If you admit that your #1 competitor and rival trounces the best CPU you have to offer in the consumer market, you'd better either be able to turn it around with Zen or leave the x86 market altogether. And it's not like AMD doesn't sell a direct competitor to the 4790K: the 9590 sells for around the same price (with the required water cooler).
 


I doubt they will ever put memory only on the CPU and not have an expansion option. HBM2 will allow up to 8 stacks, or 8GB of total memory. Even with that, there needs to be room for expansion, and RAM on the board will be much slower than on-package memory, eliminating the benefits whenever the system has to fall back to that RAM.

I can see a use for it in APUs, but not in enthusiast-class CPUs until they can offer enough capacity to satisfy the range of memory people actually use.

What if they used 2 or 4GB of HBM as L3/L4 cache but still allowed you to install your own DDR4 memory?
 


With HSA and hUMA too??? Hmmm... I am afraid that much cache would be left unused, IMO, but if it really does come into use, that would demolish everything around it 😀 (yeah... right, the CPU & GPU need to be better too, LOL).
 

That much memory would not be practical to manage as cache. It would make more sense to let the OS manage it as NUMA/hUMA: move the most frequently accessed memory pages to on-package RAM, dump the rest off-chip. System memory would effectively become a live swapfile.
 

That much memory would not be practical to manage as cache. It would make more sense to let the OS manage it as NUMA/hUMA: move the most frequently accessed memory pages to on-package RAM, dump the rest off-chip. System memory would effectively become a live swapfile.

What you are describing is, in fact, a type of caching. So it would not be inaccurate to call it a cache.
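
To spell out what that OS-managed placement could look like, here's a toy sketch (purely an illustration, not anything AMD has announced; the capacities and names below are made up): the OS counts page accesses and keeps the hottest pages in the small on-package pool, leaving everything else in ordinary system DRAM.

```python
# Toy sketch of frequency-based page placement (hypothetical, illustration only).
from collections import Counter

FAST_PAGES = 4             # made-up capacity of the on-package HBM pool, in pages
access_counts = Counter()  # per-page access counts, as the OS might sample them

def record_access(page):
    """Note that a page was touched (real hardware would sample access bits)."""
    access_counts[page] += 1

def place_pages():
    """Split pages into (hot pages kept in HBM, cold pages left in system DRAM)."""
    hot = {p for p, _ in access_counts.most_common(FAST_PAGES)}
    cold = set(access_counts) - hot
    return hot, cold

# Simulate a workload that hammers a few pages and rarely touches the rest.
for page in [1, 1, 1, 2, 2, 3, 4, 5, 1, 2, 3, 3, 6]:
    record_access(page)

hot, cold = place_pages()
print("kept in HBM:", sorted(hot))    # frequently used pages
print("left in DRAM:", sorted(cold))  # everything else -- the 'live swapfile'
```

Whether you call that a cache or NUMA page migration, the mechanism is the same: promote hot pages to the fast pool, demote cold ones to system memory.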
 
It boggles my mind that AMD thought they'd ever get Mantle off the ground. One more thing that robbed R&D from CPU/GPU/drivers. On top of that, they knew there were two competitors not far away, Vulkan and DX12.

Revisionist history at its finest. That is NOT how it went down. When they started working on Mantle, there was no DX12. Indeed, there were no signs of a new low-level API from MS at all. Vulkan didn't exist in any form. Mantle's announcement and subsequent development forced MS to advance their DX timetable AND pushed them to make a full and proper low-level API to truly rival and supplant Mantle. Mantle was the catalyst, no pun intended.

And as far as Vulkan being a "competitor"? Where do you think the code for Vulkan CAME from so fast? Khronos is slow as crap; do you think they pulled it out of thin air? The very basis for Vulkan is Mantle! I repeat, Mantle forms the foundation of Vulkan. Look it up.

Oh, and last but not least, the development of Mantle (an API) did NOT rob resources from CPU or GPU development. If you think it cost too much money, fine. But it didn't slow down their hardware teams. Personally, I think the rapid rise of DX12 and the creation of Vulkan were worth it.
 
It was just announced on the news that AMD is exploring the idea of selling off the CPU division.

Quoted from Ars Technica:
AMD spokesperson Sarah Youngbauer issued a statement over the weekend denying Reuters' report. She wrote, "AMD provided official confirmation that we have not hired an outside agency to explore spinning-off/splitting the company... We remain committed to the long-term strategy we laid out for the company in May at our Financial Analyst Day, which encompasses all parts of the business."



Kitguru was banned from the Fury X launch for comments about AMD products that AMD perceived as negative.

http://www.kitguru.net/site-news/announcements/zardon/amd-withdraw-kitguru-fury-x-sample-over-negative-content/

Guess AMD is just like its fanboys and can't stand the truth about AMD products.

You may want to watch this video to understand the reason behind their loss of trust in Kitguru: https://www.youtube.com/watch?v=QFWgc8qjQwk
 
Lol, that Kitguru guy is a joke. Almost everything he speculated was wrong.
He has even blocked comments on his video. He must have gotten slammed so badly for those inaccuracies that he had to disable the comments.
 


Yeah, we all know AMD can't compete with Intel's high end, but it's still a shock coming from their own words.



Well, it's a bad move for CPU marketing... but I think their goal is to win sympathy, and it looks like it's working.
Something like:
AMD: underdog, value, listens to the consumer
Intel: big top-tier company, greedy, has pulled lots of underhanded tactics...

Aside from all that, as a human being, IMO it's good to see some people still willing to admit their mistakes.
I honestly question the people bashing this good conduct.

 
"I could say about Intel fanjerks"

Except I own, and am currently typing this reply on, a 9590-based system with a 290X video card.....

Well done.

Edit: spelling
 
AMD will probably incorporate HBM into their CPU architecture. TBH, 8 gigs alone is enough for 99% of the consumer market. By the time we see this integration, I expect we'll see 8 and 16 gig variants of HBM CPUs. HSA and APUs would greatly benefit from it, since they can use the cache/RAM to help store and transfer data back and forth.
 
When AMD claims a 40% IPC increase per core, I get suspicious that they're dropping the "module" nomenclature they tried to sell us on. So, as an eight-core, four-module Bulldozer chip transitions into a quad-core Zen chip, the IPC doubles like magic.
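
To spell out the arithmetic behind that suspicion (with made-up numbers, purely for illustration, not AMD's actual figures): if throughput that used to be credited to a two-core module gets credited to a single core, the per-core number looks far better before any real architectural gain.

```python
# Made-up numbers, illustration only: how relabeling modules as cores can inflate per-core figures.
module_throughput = 2.0   # work per clock for one Bulldozer module (two integer cores)
cores_per_module = 2

per_core_old = module_throughput / cores_per_module  # 1.0 when the module counts as two cores
per_core_new = module_throughput                      # 2.0 if the same unit is called one core

print(per_core_new / per_core_old)  # 2.0x apparent "per-core" gain purely from renaming
```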
 