AMD CPU speculation... and expert conjecture


UnrelatedTopic

Honorable
Nov 4, 2013
22
0
10,510
Regarding the empty fabs, can't Intel lease some of its fabs to AMD? Seems like a win-win to me: Intel gets its fabs utilized and AMD gets the latest fab process.
 

amd will get cheaper deals from glofo and tsmc. intel does rent its fabs out, but the buyers seem to use them for their premium products only, i.e. only the ones that need intel's leading-edge manufacturing tech; mainstream stuff goes to tsmc. amd would have a case for intel's fabs if they had something at haswell-e level. otoh, i'd like to see intel make amd's hawaii-class gpus on 14nm.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Like none of the 105,000 Intel employees are aware of the challenges they face.

Did you not see the TSMC ad at the bottom of the page, and the other promotional piece, "Fabless: The Transformation of the Semiconductor Industry"? Oh, how great the fabless industry is. Only $15. TSMC is also listed as a paid subscriber, so I'm sure they're totally unbiased. :sarcastic:

It has its benefits, but it also has its downsides. Whenever you consolidate power, it leads to corruption. Having a few very powerful fabs will come back to bite us in the end. TSMC can essentially control the profits of its customers by adjusting wafer allotments. It's already starting to happen: Nvidia got vocal about it once and got bitch-slapped for it. As soon as the alternatives are slowly squeezed out, they'll start cranking up the die costs. When their profits start leveling off, they'll have to shift more into services, meaning taking jobs from the companies that used to hire them just to fab, eventually killing off scores of fabless semiconductor companies. Not to mention the corporate espionage that can easily go on, since TSMC employees will have access to lots of sensitive information. TSMC will need tighter security measures than a diamond/ruby mine.


Looks like Intel's Bohr was right about TSMC's 20nm FinFET getting scrapped.
http://www.semiwiki.com/forum/content/1239-intel-says-fabless-model-collapsing-really.html
 

colinp

Honorable
Jun 27, 2012
217
0
10,680
So the R9 295X2 is out and seems to be rather good, provided you have $1500 to burn, a phat power supply and a spare 120mm case vent. Sadly, none of these conditions apply to me, but then I never was the target market for a halo product.

 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Intel wants to kill AMD, not help them.



Why do you believe Intel has opened its fabs to Altera and others? :sarcastic:



I like the replies that Bohr receives:

This is complete nonsense.

Not true of course.
 

colinp

Honorable
Jun 27, 2012
217
0
10,680


I must have missed those links. Would you be able to dig them up for us, please?
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Well, it sure beats (price-wise) the prior A6-5200 boards that were out there for $150. The Egg has a combo with case/MB/CPU for $99. I'm sure more will follow.

The interesting thing will be if Microsoft follows through and actually reduces the cost of Windows for these "under $300" systems.
 


Yeah, and it also costs way too much in my eyes. The R9 290 should be $550 each, say $650 for the Asus version; that is $1300 at most. $200 for what is pretty much the equivalent of an H80 with one fan, and I bet the fan is not nearly as good as what comes with the H80i, seems a bit costly.

I would like to see THG take the AMD fan off, throw on two of the static-pressure fans from Corsair's H80i in push-pull, and see if they can help drop the temps a bit more.



Ahhh gloom and doom for Intel, huh? I am sure they are aware, like any good company would be, and are looking for solutions.



I would think that 14nm wouldn't do all that well for this, as it was designed for CPUs. Then again, the FinFETs and other extra tech goodies Intel has wouldn't hurt with Hawaii's power leakage...



I wish AMD would stop using Radeon for everything. It kind of lowers the brand's name in my eyes. It used to be a high-end GPU brand; now it is meh RAM and might be a meh SSD.

I am not keen on them using OCZ for the controller. I had a lot of problems with OCZ SSDs. They should just stick to the tried-and-tested SandForce controller that is on everything (well, most everything) and has been proven reliable.

 
Well, I just read Intel closed the Costa Rica factory (part of it, I think)... 1,500 people without a job. The stated reason is the "current state of the PC business".

You don't go around closing shops because everything is fine and dandy. It might not be gloom and doom, but it's not rose-colored either.

Cheers!
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


That's, like, year-old news. Achronix 22nm FPGAs started shipping in February 2013. Work started in 2010.

Achronix, Tabula, Netronome, Altera, and Microsemi: four FPGA companies and one packet-processor company. None of them are Intel competitors so far.

TSMC was already giving preferential treatment to Xilinx, so it makes sense for Xilinx's biggest competitor, Altera, to partner with Intel. These are mid-size $10-15B companies.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780
Linux is a joke and will never go anywhere

All we need is DX11

We will never need more than 4 cores

Windows will rule for eternity

Who am I???????

On another topic: you guys saying Linux sucks or whatever, I can't hear you over my A4-5000 that's using AVX and tons of instructions across the entire operating system, or my FX-8350 that's using FMA and AVX and more across the entire operating system.

Maybe someday you will learn how to compile your own OSes and start to use all those features on your CPUs that lie dormant, haha.
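To make the "compile your own OS" point concrete, here's a rough sketch of what that buys you (my own example, not anything from blackkstar's actual setup): a plain C loop that GCC is free to compile into AVX/FMA code when you target the FX-8350's architecture, which is essentially what a source-based distro like Gentoo does for every package via CFLAGS.

```c
/* saxpy.c - hypothetical example: y[i] = a*x[i] + y[i]
 *
 * Generic build:  gcc -O3 -c saxpy.c                 -> baseline x86-64 (SSE2) code
 * Tuned build:    gcc -O3 -march=bdver2 -c saxpy.c   -> bdver2 = Piledriver (FX-8350)
 *
 * With the tuned build the compiler may use 256-bit AVX loads/stores and
 * fused multiply-add (FMA) instructions for the a*x + y expression; a stock
 * binary distro's generic x86-64 packages never touch those instructions.
 */
void saxpy(float a, const float *x, float *y, int n)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}
```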
 

jdwii

Splendid


Or we can have a life, that's another option...
 

colinp

Honorable
Jun 27, 2012
217
0
10,680


Even a basic bit of research would tell you that DisplayPort is intended to replace DVI and VGA, not HDMI. Furthermore, DisplayPort is royalty-free, whereas HDMI is not.
 
Real geeks build their own drivers in Linux!

I have modified the kernel code to fix some stupid bugs with the Radeon driver, so yeah... I cannot do that with Windows when there's an issue.

Desktop Linux is for die-hard fans, people with lots of time and knowledge, or MASOCHISTS like myself, haha. Man, those Gentoo emerges that broke dependencies... My God...

Anyway, please drop the Linux market share discussion. It's not going anywhere and won't affect how AMD CPUs perform in Windows at all. Windows for show, Linux for pros (?).

----

Regarding DDR4... how long till Intel puts out their engineering samples? I really want to know how DDR4 will change the landscape for Intel first.

Cheers!
 

drinvis

Honorable
Oct 3, 2012
65
0
10,660


Well, check out the adoption rate, mate. DisplayPort unfortunately has not been that popular among manufacturers. Most laptops have HDMI, same with televisions.
On monitors, most mid-range FHD monitors have HDMI, and most of the monitors that do have DisplayPort also have HDMI. AMD did try to popularize it with Eyefinity, but because of the low adoption of DisplayPort and the extra cost and mess of active adapters, they have made suitable changes to Eyefinity with the newer cards, which is welcome. But the low adoption of DisplayPort is really very unfortunate.

 

http://www.tomsitpro.com/articles/samsung-ddr4-memory-modules,1-1855.html
samsung accidentally confirmed the new haswell-e based xeons' series number. the wccftech rumor said june launch, so reviewers might get the chips in the last couple of weeks of may.
i don't know how the chart in the link claims 25 GB/s for ddr3 1600 (assuming dual channel), i get around 20 GB/s. :s the chart tops out at 51 GB/s, so quad channel might get close to twice as much.
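fwiw, the 25 GB/s in the chart looks like the theoretical peak for dual-channel ddr3-1600 rather than a measured number; ~20 GB/s measured is about what you'd expect. quick back-of-the-envelope calculator (my own sketch, not from the article) that also lines up with the 51 GB/s ceiling:

```c
/* ddr_bw.c - theoretical peak DRAM bandwidth:
 * transfer rate (MT/s) x 8 bytes per 64-bit channel x number of channels */
#include <stdio.h>

static double peak_gb_per_s(double mtps, int channels)
{
    return mtps * 1e6 * 8 * channels / 1e9;   /* decimal GB/s */
}

int main(void)
{
    printf("ddr3-1600, dual channel: %.1f GB/s\n", peak_gb_per_s(1600, 2)); /* 25.6 */
    printf("ddr3-1600, quad channel: %.1f GB/s\n", peak_gb_per_s(1600, 4)); /* 51.2 */
    printf("ddr4-2133, quad channel: %.1f GB/s\n", peak_gb_per_s(2133, 4)); /* 68.3 */
    return 0;
}
```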
 


But that's my point; HDMI was the replacement for DVI/VGA. It was out first. It does everything DisplayPort does (minus the rarely used Ethernet layer). It has market acceptance. You aren't going to see DisplayPort make its way into TVs or receivers; it's limited to computer monitors, unlike HDMI. HDMI is NOT going away, so the only cost issue is the added cost of including the DisplayPort connector and related circuitry.
 
I have modified the kernel code to fix some stupid bugs with the Radeon driver, so yeah... I cannot do that with Windows when there's an issue.

So... the driver is broken, so you fix the issue by putting a hack into the kernel? Which someone will eventually need to remove again?

((And if you REALLY want to be technical, you could always disassemble any driver-related .dll files, make any necessary changes, recompile them, and turn off driver signing in Windows if you wanted to make a driver-side change to a non-open driver. Not pretty, but it works.))
 

Lessthannil

Honorable
Oct 14, 2013
468
0
10,860
FreeSync is still going to need an ASIC to actually work. And, since the proposal is optional, not mandatory, it allows monitor OEMs to close off VTT for "gaming" monitors and such.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
Ouch!

[image: 62518.png]


Do you recall those Nvidia slides shared here by our favorite 'mantel' guru? They were used to predict the end of 'mantel' once again. We also read fantastic conspiracy theories about AMD secretly crippling their DX11 driver to make 'mantel' look better...

Well, it seems now confirmed that those Thief and BF4 slides were a pure lie.

Yesterday, Nvidia released the GeForce 337.50 beta drivers to much fanfare. Nvidia itself said that the drivers could improve single GPU performance by up to 64%, and SLI configurations by up to 71%. When the Nvidia-provided benchmarks then failed to provide evidence of these massive performance boosts, we were a little skeptical. Then, when we looked closer and saw that Nvidia’s numbers specifically targeted on Battlefield 4 and Thief — two games that AMD has been pushing as the vanguard of Mantle support — we started to wonder if Nvidia was up to something slightly more sinister than simple number massaging. Yesterday, we also hadn’t had a chance to run our own GeForce 337.50 driver tests. Now we’ve run some tests, and looked at the benchmarks published by other sites, and it’s clear that Nvidia has overpromised and underdelivered.

http://www.extremetech.com/gaming/180088-nvidias-questionable-geforce-337-50-driver-or-why-you-shouldnt-trust-manufacturer-provided-numbers

The corrected slide:

[image: nvidia-geforce-337.50-driver-spoof-640x353.jpg]
 

R9 290x performance: 1473
R9 290x performance doubled: 1473 * 2 = 2946
R9 295X2 performance: 2933

How's that result unexpected?

Well, it seems now confirmed that those Thief and BF4 slides were a pure lie.

"Up to". Which is exactly what AMD did not that long ago, yet you still hold on to those numbers as the absolute gospel. And yes, I noted the "up to" when i referenced the original article.
 