AMD CPU speculation... and expert conjecture


amdfangirl

Expert
Ambassador


Nvidia just doesn't want game devs to abandon PhysX, so that it can continue to sell it as a feature in the PC space.
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810


sounds mildly innuendo-ish :D
 

Oh, I understand this; of course they want this.
It's just that the HW it'll be running on will be the exact same HW found in other PCs, where it won't be able to run.
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810


Not to mention that almost exactly the same hardware will be in the Xbox 720, where PhysX will be able to run.
 

jdwii

Splendid



Again it happens. Why can't they do both compute and gaming performance, you know, like AMD?



Kinda miss the old AMD days, when they made a new Phenom, put it at the same price as the current Phenom, and lowered the lesser chip by around $10.
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810


If we believe S|A, Nvidia has issues with their design, which makes fabricating their GPU chips difficult and causes a lot of wastage.
So Nvidia is forced to cut down on the chip size, which means removing the compute stuff. This also has the advantage that power consumption gets reduced. But all this doesn't matter, because Nvidia has a huge fanboi base which will pay ridiculous prices for gimped cards.
 
Understand, though, that the HPC market demands low power usage, as thousands of these chips are used in one setup. I believe Charlie was right about GF100, but he really hasn't mentioned much about GK110 being this way.
And as the market matures, and Nvidia, or AMD for that matter, pursue it, it has to fit within certain thermal designs.
Since this is AMD's first approach in this direction, they too will learn to get more bang per watt while still being capable of handling decent GFX workloads.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860



It was about money and the Quadro sales, not the card design.

Nvidia argues it’d rather sell you a workstation-oriented Quadro card or dedicated Tesla-based board.
http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-15.html

Kepler's actual compute power is a pretty impressive gain over Fermi.
http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-k5000_4.html#sect1

The GTX 680 was purposely compute-gimped so they could sell the $2250 cards.
 

jdwii

Splendid



Agreed, but that happens with every brand. Hell, I was like that; I almost bought Bulldozer after the benchmarks. I guess I eventually learned, and did not.
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810


Though one has to wonder if that perf/watt win is because of the design, or because of the switch to 28nm, which allows moar transistors to be added.

Looking at the power usage of the HD 79xx cards, and the relatively lower power use of the GTX 680/670, it appears that the HPC/compute part of the chip uses quite a lot of power and die area.
With AMD having a design that was easier for TSMC to fabricate, they could afford to design bigger chips. Plus, AMD being the underdog, they had to make a master-of-all chip. The biggest criticism of the HD 79xx is the power usage.

The GTX 680 was purposely compute-gimped so they could sell the $2250 cards.
That depends on whether you are a gamer or an HPC engineer. Gamers should be happy that the GTX 680 uses less power than the HD 79xx for equal perf in games (larger/smaller depending on the game and drivers).
As Nvidia has ~90% of the GPU HPC/Pro market, they can afford to sell the "full" chips at that ridiculous price. AMD is the underdog in HPC too, so it can't price its chips at the same level.

Same story as Intel and AMD. Intel can sell insanely gimped chips at high prices because they have almost a monopoly. It's just how economics works.

Never, ever believe that AMD/ATI wouldn't do the same thing if they had a bigger share of the market.
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810


Don't ever call BD "not worth my money". It's the best proc ever.
It's just that the software doesn't like it.... :lol:
 
^^ Yeah. Credible, unbiased hardware enthusiast sites who are not paid off by AMD said it's worth gold awards. That's you, eTeknix and AMDZone. Windows Blue's BD patch will solve the CPU performance issues when it comes out.
 


But TressFX is an AMD patent, so you would expect Radeons to excel at this more than Nvidia regardless. And as AMD and its new partners develop games with TressFX support, it will improve.



CUDA and PhysX will lose their product support because they are mutually exclusive and create a closed market. DirectCompute, OpenCL, etc. encourage more software support; Nvidia has a bit of a problem.
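To illustrate what I mean (a toy sketch of my own, not from any shipping codebase): at the kernel level the two dialects are close cousins, so the lock-in isn't really the kernels themselves. Take a simple vector add:

// CUDA C kernel -- compiles with nvcc and runs only on Nvidia GPUs.
__global__ void vec_add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)
        c[i] = a[i] + b[i];
}

/* The OpenCL equivalent is nearly identical source, except it gets
   compiled at runtime by whatever driver is installed -- AMD, Nvidia,
   or Intel alike:

   __kernel void vec_add(__global const float *a, __global const float *b,
                         __global float *c, int n)
   {
       int i = get_global_id(0);
       if (i < n)
           c[i] = a[i] + b[i];
   }
*/

The closed-market part is everything around the kernel: the CUDA runtime, Nvidia's libraries, and PhysX itself only target Nvidia hardware, while the OpenCL version runs wherever a driver exists.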


 

amdfangirl

Expert
Ambassador


Do we have an open standard for a physics API?
 
I would also add, as it's often forgotten here as well as elsewhere, that the node shrinkage was a full node, not the half we are used to seeing. That is why Nvidia's power/perf ratio looks so good compared to last gen, and why AMD could afford the first rough GCN/compute solution as well.
 

truegenius

Distinguished
BANNED
^bullet :??: ! ?
[images of bullets]

:whistle:
 


We have done this to death, so to avoid the TL;DR: AMD do use 2, 4, 6, and 8 cores across the varying product lines. So the FX 8000 family is an octo-core, regardless of what you have been reading on the internet.
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810


CUDA is unlikely to die soon.
For the same reason that the PowerPC arch is still alive: it's easier to buy newer hardware than to convert and optimise existing code for a new architecture.
Basically, code that's already on CUDA will remain on CUDA for a long time. Newer code will likely remain x86 (Xeon Phi, anyone?)
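To make that porting cost concrete, here's a minimal sketch of a complete CUDA program (again my own toy example, nothing from a real HPC codebase). Every runtime call below has an OpenCL counterpart with a different API, plus platform/device/context/queue setup that CUDA doesn't need, so moving a big codebase means rewriting and re-validating all of this plumbing:

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Simple element-wise add; one thread per element.
__global__ void vec_add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // All of the calls below are CUDA-runtime specific. An OpenCL port
    // replaces each with clCreateBuffer / clEnqueueWriteBuffer /
    // clEnqueueNDRangeKernel / clEnqueueReadBuffer, and that's before
    // the runtime kernel-compilation step.
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    vec_add<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost); // implicit sync
    printf("hc[0] = %f\n", hc[0]); // expect 3.000000

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

And that's just a vector add; multiply it across years of tuned production code and the "easier to buy new hardware" point makes itself.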
 