AMD CPU speculation... and expert conjecture


jdwii

Splendid


And we are not surprised you are still sticking up for a design that is competing with bottom-line Intel processors, as it always has. Funny how people like to claim ARM is simple when Project Denver took forever to be made; I'm not even sure Bulldozer took that long. Like I said, I remember Nvidia fanboys (any fanboy is delusional) talking trash about Bobcat a while ago. In business, having the product out is more important than claims.
 

UnrelatedTopic

Honorable
Nov 4, 2013
22
0
10,510
Let's take a break from the usual Juan vs. The World type of stuff. I know it's entertaining, but let's switch over to more current events, like what Roy will talk about tomorrow.

NDA guys, any hint? You don't even need to tell us; you can blink once for CPU or blink twice for GPU.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Exactly one year ago they unveiled Hawaii: the Radeon R7 and R9 series.

They need a response to Maxwell.
 

ADAMMOHD

Honorable
Nov 12, 2013
32
0
10,540
Noob2222, read carefully before replying. Tegra K1 has two variants: the quad-core 2.2 GHz part is 32-bit and is already released in Nvidia's tablet and the MiPad. We're just waiting for the Tegra K1 with the dual-core 2.5 GHz Denver, which is 64-bit, OK?

If you guys want to compare it to an i5, wait till Qualcomm releases its 64-bit beast. Qualcomm has been ahead of Nvidia on the CPU side.
 

etayorius

Honorable
Jan 17, 2013
331
1
10,780


Let me guess... AMD announces Mantle support for AAA titles, possibly adds them to the Never Settle bundle, and maybe even makes some huge price drops on the current R9 200 series.

I'll just wait for the 370X; it won't be long till they answer nVidia's super-aggressive prices.
 

logainofhades

Titan
Moderator
My HD 7970 is still sufficient for my needs. The GTX 970 is awfully tempting, though. If I had the $$$, I would probably get it. Waiting on the 950/960 to come out. My old HD 5850s in my other rigs could stand to be replaced with faster, more efficient GPUs.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


1. Claim to support FreeSync and G-Sync
2. Intentionally cripple FreeSync on your products
3. Start talking about how bad FreeSync is
4. Have everyone compare FreeSync and G-Sync on your product and "let them draw their own conclusions"
5. Everyone decides FreeSync sucks compared to G-Sync after trying both on the same card
6. ????
7. Profit

This is OpenCL vs CUDA all over again.
 

etayorius

Honorable
Jan 17, 2013
331
1
10,780


I think the tech behind both G-Sync and FreeSync is amazing. I have seen G-Sync in action, and it gets rid of about 95% of the microstutter in games. There is a lot of microstutter in Skyrim and other games even if you maintain 40+ fps; it only goes away above 55 fps. With G-Sync, games look smoother and tearing is completely gone (though I've never been bothered by tearing at all). I don't see myself buying a new monitor just for DP 1.3, since I'd rather spend my money on a new GPU.

But nothing new, I guess; nVidia will always be a fart.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
I find it amusing when some guys in forums pretend that development of the Denver ARM core took an eternity, when every one of us knows that the original project was an x86 core, and that due to licensing issues with Intel, Nvidia had to switch to ARM, which required a redesign, and then had to wait for ARM to launch the ARMv8 ISA. Those guys mention Denver and Bulldozer in the same phrase for evident reasons.

I also find it amusing when someone pretends that the Denver K1 is a quad-core, just because he cannot digest that a dual-core ARM is matching the performance of a dual-core Haswell that consumes twice the power.

And the misinformation about TDPs and power consumption is also lovely. In the first place, TDP != power consumption. This has been explained plenty of times in this thread, but some people cannot learn. The 11W that some mention is not the TDP of the SoC used in the tablet but the total power consumption of the Jetson development board. Moreover, we know that of the total 11W consumed by the whole board, the SoC+DRAM consumes only 7W. This means that the SoC alone has to carry a sub-6W TDP rating, which is about half of the 15W TDP Haswell. I understand that those guys who claimed that ARM couldn't scale up, and those who claimed that ARM could never match x86 performance, now need to write silly claims such as "you simply don't understand how 8 ARM cores can beat 2 high-end x86 cores". LOL
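
For what it's worth, here is that power-budget arithmetic spelled out as a quick Python sketch. The 11W board draw and 7W SoC+DRAM figures are the ones quoted above; the 1W DRAM share and the 15W Haswell-U comparison point are my own assumptions for illustration, and TDP and measured draw are of course not the same thing:

```python
# Back-of-the-envelope power budget using the Jetson TK1 figures quoted above.
# The DRAM share and the Haswell-U TDP are illustrative assumptions, not measurements.

board_total_w = 11.0    # quoted total draw of the Jetson development board
soc_plus_dram_w = 7.0   # quoted share attributed to SoC + DRAM
dram_share_w = 1.0      # assumed DRAM portion (hypothetical figure)

soc_only_w = soc_plus_dram_w - dram_share_w   # ~6 W left for the SoC itself
haswell_u_tdp_w = 15.0                        # dual-core Haswell-U TDP for comparison

print(f"Estimated SoC power: ~{soc_only_w:.0f} W")
print(f"Fraction of a 15 W Haswell-U TDP: {soc_only_w / haswell_u_tdp_w:.2f}")
```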
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


But TSMC announced 7nm first.

"We are in early development of 7nm and we are narrowing down the options," Sun said. Meanwhile, TSMC is researching new patterning techniques that provide dimensions "in range for 5nm logic generation."

It is really interesting to watch how Intel's traditional lead over the foundries is shrinking. With 14/16nm and beyond, AMD has an opportunity to be fully competitive again.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860
What I find funny is that no one wants to post outright what the power draw is on any ARM device. All they want to say is "it's 100000000x more efficient than x86, trust me."

If you want to compare Denver to an i5-U, then don't go off the wall saying crap like "it's faster than 140W Haswell CPUs" and then not expect someone to call you out when you're comparing it to a 15W CPU.
 

jdwii

Splendid
Juan, you can continue spitting nonsense, but that doesn't change anything. Again, Project Denver took forever; it has been in the works since 2006! That's crazy.
No one here said ARM couldn't scale up at all in per-core performance. You are creating lies and using one fallacious example after another.
Again, it's still being compared to Intel's lower-end CPUs; I couldn't care less about its power consumption. Back in 2008 the same thing was happening: ARM was competing with Intel's lower-end CPUs while consuming less power.

The existence of Project Denver was revealed at the 2011 Consumer Electronics Show.[7] In a March 4, 2011 Q&A article, CEO Jen-Hsun Huang revealed that Project Denver is a five-year 64-bit ARMv8-A architecture CPU development on which hundreds of engineers had already worked for three and a half years, and which also has 32-bit ARM instruction set (ARMv7) backward compatibility.[8] Project Denver was started at the Stexar company (Colorado) as an x86-compatible processor using binary translation, like Transmeta's projects. Stexar was acquired by Nvidia in 2006.

But still, I already said Denver looks interesting, and I'm sure it's going to be the thing that brings Nvidia back into the smartphone/tablet market. The Nexus 7 was amazing.
 

jdwii

Splendid
So much for trolls stating Intel will kill off gaming GPUs soon; this game makes my 770 look like crap. I'm starting to wonder if I should sell it for $200 and get a 970.
http://www.pcgamer.com/2014/09/25/the-evil-within-system-requirements-demand-some-serious-hardware/
Too many games are starting to use more VRAM now.
 

8350rocks

Distinguished
ARM scales upward poorly.

Not because the core cannot do the work, but because by the time you get ARM to do what x86 can on the high end, you now need all kinds of additional hardware for things like OoO, and other stuff.

What is highly misplaced is comparing cherry-picked benchmarks for an in-order execution chip designed for phones and tablets against an OoO-capable chip designed for laptops.

Consider this: AnTuTu is not aimed at OoO execution, because no ARM chips to this point do OoO, with the possible exception of 80W server chips, and even then you are not in much better territory, which is what I said all along. If you want ARM to compete on that end, it will not be at 20W. If their cores excel at microserver workloads, that is fine; they are designed for it. However, do not hold your breath waiting for a Wall Street firm to sink big money into ARM servers...

As for HPC, everyone knows that you need just enough cores to babysit a massive bank of GPUs and keep them well fed... let's not kid ourselves, shall we?
 
Status
Not open for further replies.