AMD CPU speculation... and expert conjecture


colinp

Honorable
Jun 27, 2012


No, HDMI is not "the" replacement for DVI / VGA. It is a proprietary technology that incurs a licence fee. VGA, DVI and DisplayPort are all royalty free, and DisplayPort is an open VESA standard. You were wrong on both counts, I know it's hard to admit it, gamerk316.

Displayport is NOT going away. There, I used caps just like you, so it must be true.

By the way, if you look on Newegg, roughly a third of monitors have HDMI. A similar number have DisplayPort. The latest generation of GPUs from both Nvidia and AMD mostly have both. Of course they would, since you can't daisy-chain monitors together with HDMI, but you can with DisplayPort.
 

8350rocks

Distinguished




Actually, the R9 295X2 runs at a higher maximum clock speed on both GPUs, and uses water cooling to reduce throttling. In all the tests that TH ran on the new card, there was never a single time that the 295X2 did not outpace 2x 290X cards. There were a few situations where 780 Tis in SLI outran the card @ 1440p, but at 4K it killed everything Nvidia every time... usually by a LARGE margin too.
 
On a completely off-topic tangent, Fujitsu's SPARC64 X+ CPUs are now out and they look insane.

600 mm² die size on TSMC 28 nm
16 cores, each core can run two threads
Up to four sockets without glue circuitry
Two DDR3 memory channels per chip
22~24 MB of L2 cache per chip
Up to 512 GB of memory per CPU
1,500-pin socket
And ECC support for the CPU and I/O buses

http://www.tomsitpro.com/articles/fujitsu-m10-servers-sparc64-processor,1-1857.html
http://www.fujitsu.com/global/services/computing/server/sparc/products/m10-1/spec/index.html
http://www.fujitsu.com/global/services/computing/server/sparc/products/m10-4/spec/index.html

An interesting thing about the M10-4S is that you can link them together within the appropriate rack system and they will act as a single system while being fully hot-pluggable. Also, the M10-4 has a built-in closed-loop water-cooling system for the CPU and memory/I/O chips.

That is an insane amount of computing power for modeling.
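
Just for a sense of scale, here's a quick back-of-the-envelope sketch in Python. The per-chip figures are the ones listed above; treating the four-socket box as straight multiplication is my own simplification, not a Fujitsu number.

# Rough totals for a four-socket SPARC64 X+ box, using the per-chip
# figures quoted above and nothing more than multiplication.
cores_per_chip = 16        # 16 cores per chip
threads_per_core = 2       # each core can run two threads
sockets = 4                # "up to four sockets without glue circuitry"
memory_per_cpu_gb = 512    # up to 512 GB of memory per CPU

hw_threads = cores_per_chip * threads_per_core * sockets
total_memory_tb = memory_per_cpu_gb * sockets / 1024

print(f"Hardware threads in a 4-socket box: {hw_threads}")        # 128
print(f"Max memory in a 4-socket box: {total_memory_tb:.0f} TB")  # 2 TB

128 hardware threads and 2 TB of RAM in one chassis, before you even start linking M10-4S units together.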
 

cobra2

Distinguished
Oct 5, 2011
If people are just going to delete my valid comments for no reason... I don't know why I even bothered commenting on this site.
 


They took it one step further. Oracle and Fujitsu have been working on embedding code into the CPUs themselves that accelerates common DB and OWLS instructions. This lets Oracle RDBMS, OWLS and some of their Fusion Middleware run even faster. I work on both x86 and SPARC systems, and whenever I have to do anything on the x86 systems I end up really missing ILOM.
 

juanrga

Distinguished
BANNED
Mar 19, 2013


R9 295X2 != 2 R9 290x because 5>0

The problem with the slide was not that it reported only "up to" numbers, but that the Nvidia numbers were artificially inflated with stuff like "our new driver boosts performance by 71%… if you also throw in a second graphics card!".

Check the link again. This is why the modified slide contains new (corrected) "up to" numbers.
 


I did, because the Linux vs. Windows debate is way too off topic now. If you have a problem with it, well, that is just too bad.

Let us all just agree to disagree. Some like Linux and its customization, others prefer Windows, and then some strangely prefer OS X.

Now back to AMD, as they seem to be giving us news again for once.
 


The failure here is thinking monitors, rather than TVs, drive the market. They don't. And when you look at TVs, it's 90%+ HDMI.
 

colinp

Honorable
Jun 27, 2012


No, the failure here is the assertion that DisplayPort is intended to replace HDMI. It's not; it's intended to replace (and is replacing) DSUB and DVI.

Modern graphics cards have simply swapped out the DSUB for DisplayPort. And that costs nothing, since contrary to your incorrect assertion, DisplayPort is royalty free.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
Nvidia being caught inventing performance numbers for their new drivers ("our new driver boosts performance by 71%… if you also throw in a second graphics card!") fits with news like this:

3D Mark’s Hall of Fame is one of the most prestigious proving grounds for Overclockers all over the world and recently it has been dominated by Nvidia products. However the pros over at OcUK have landed the top spot with AMD’s R9 290X.

http://wccftech.com/r9-290x-reclaims-top-position-top-categories-3dmark-hall-fame/
 


It's NOT free; you forget the cost to add in the circuitry alongside HDMI. And personally? Having a DSUB is preferable in case I ever need to hook up to an old analog display without needing a ton of conversion cables to get the job done. No reason to support two digital standards that do the same exact thing.
 

colinp

Honorable
Jun 27, 2012


Gamerk, there's an old saying that goes, "If a hole is in the wrong place, no amount of digging will put it in the right place."

You said specifically that there was a licence fee associated with Displayport. That was incorrect. Period.

There is a cost associated with Displayport due to the parts, just as there is for all the other ports. Well, duh. It'll be less than $1 per port, I wager.

And if you need Dsub, then you'd better not be in the market for a current generation graphics card, as none of them have it any more. Or alternatively, you could use one of the readily available converter cables.

Really, it's starting to sound like you are trying to argue the moon is made from cheese. We all know how modest you are when you claim to have been proven right about something. Well, on this occasion you are wrong. Just admit it or drop the argument, and then we can move on to discussing the merits of FreeSync on the facts rather than falsehoods.

So, to keep things on topic, I can't see how it could be a bad thing that an open, royalty free standard has had an optional feature added to it.
 

vmN

Honorable
Oct 27, 2013
Well, the specs of GTX 880 have been released.


20 nm GM204 silicon
7.9 billion transistors
3,200 CUDA cores
200 TMUs
32 ROPs
5.7 TFLOP/s single-precision floating-point throughput
256-bit wide GDDR5 memory interface
4 GB standard memory amount
238 GB/s memory bandwidth
Clock speeds of 900 MHz core, 950 MHz GPU Boost
7.40 GHz memory
230W Minimum Power
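
For what it's worth, those rumored numbers at least hang together. Here's a quick sanity check in Python using the usual arithmetic (2 FLOPs per CUDA core per clock for FMA, bus width in bytes times effective data rate for bandwidth); the inputs are just the leaked figures above.

# Sanity-check the rumored GTX 880 throughput and bandwidth figures.
cuda_cores = 3200          # rumored shader count
base_clock_ghz = 0.900     # rumored 900 MHz core clock
bus_width_bits = 256       # rumored memory interface width
mem_rate_gtps = 7.40       # rumored effective GDDR5 data rate

# Single precision: cores * 2 ops per clock (FMA) * clock
sp_tflops = cuda_cores * 2 * base_clock_ghz / 1000
# Bandwidth: bus width in bytes * effective transfer rate
bandwidth_gbs = bus_width_bits / 8 * mem_rate_gtps

print(f"SP throughput: {sp_tflops:.2f} TFLOP/s")  # ~5.76, matches the quoted 5.7
print(f"Bandwidth: {bandwidth_gbs:.1f} GB/s")     # ~236.8, close to the quoted 238

So the TFLOP/s and bandwidth lines are at least consistent with the core count, clock and bus width, for whatever a leak is worth.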

TSMC is on the 20 nm wagon.
I don't get Nvidia sometimes, like giving this thing only a 256-bit bus.

At least let's hope they get it out before Broadwell, just to make fun of Intel.

Hm, Excavator will surely be out on 20 nm too; gonna be interesting, and with AVX2 support we will see great SIMD improvements. But Skylake will have AVX-512 at that point, so AMD stays one step behind.

 

Kulasko

Honorable
Jun 13, 2013
I saw many positive reviews of the AM1 platform, such as this one in German:
http://www.planet3dnow.de/cms/9323-kabini-fuer-sockel-am1-athlon-5350-im-test/subpage-fazit/

It also says that retailers are seeing high demand for AM1 platforms, which might be good for AMD, even though it's a low-price/low-margin platform.
 


Those are rumored specs. I guarantee that it has a 512-bit bus.

As for TSMC getting 20nm before Intel gets 14nm, Intel and Micron already have (and have had) 20nm NAND. Not the same thing, but I don't think Intel worries about that stuff. They will have 14nm out later this year easily.



I will have to check, but when I was working for a local shop here, they didn't sell as much AMD as they did Intel, and when they did it was either to the super die-hard AMD fans or to people who wanted the cheapest option possible.

As well, due to the insane price gouging, the highest-end AMD GPU they have kept in stock is the 270 series.

We will see if this becomes high demand or just another niche market like a lot of stuff has become.
 

blackkstar

Honorable
Sep 30, 2012
gamerK man holy ***, how old are you? You whine about every new technology that comes out.

The list of technology gamerK hates

1. Alternative APIs to DirectX
2. CPUs with multiple cores
3. Replacements for DSUB
4. low IPC, high frequency designs
5. Any OS besides Windows
6. Any GPU company besides Nvidia
7. Any CPU company besides Intel
8. Any IDE that isn't Visual Studio
9. Any programming language that isn't directly supported by MS
10. Linux
11. Royalty free, open standards
12. HSA
13. Anything AMD does

Things gamerK loves

1. Proprietary Windows
2. PhysX
3. CUDA
4. G-Sync
5. Visual Studio
6. DirectX 11
7. Single core CPUs
8. Anything Intel does that resembles HSA
9. DSUB
10. Hooking really old monitors up to expensive graphics cards
11. Never admitting he's wrong.
12. Forum sliding
 

colinp

Honorable
Jun 27, 2012
Well, G-Sync needs DisplayPort as well, and additional circuits in the form of the G-Sync module in the monitor. It'll take some cognitive dissonance to find an upside to that vs. FreeSync.
 

jdwii

Splendid
FreeSync > G-Sync
Anyone notice how Nvidia always finds ways to lock things down and charge a lot of money for them? When I heard the Nvidia Shield was $200 I was amazed; I thought they would charge $400 for it.
 

Lessthannil

Honorable
Oct 14, 2013


FreeSync won't exactly be free, either.

1. It's an optional part of the standard. It isn't mandatory for all of DisplayPort. Since monitors that come with DisplayPort are a minority already, monitor OEMs can limit the VRR-capable ones to "gaming" monitors that cost more.
2. FreeSync will still need an ASIC designed specifically for that monitor to work. Either AMD or the monitor manufacturers will have to fund it, and they can and will pass the expense on to you.
3. You are going to need a graphics card with DisplayPort 1.3.

Either way, you have to buy new hardware in order to have VRR. At least NVIDIA was up front about the nature of VRR, whereas AMD told us that getting FreeSync or whatever would come at little to no cost to the consumer, which just isn't true.

 