AMD CPU speculation... and expert conjecture




Haha, I remember that with the Ti 4200 a friend gave me. The card had been bathed in liquor, which busted the RAM. I cleaned it up and it worked fine after that.

Fun times.

Cheers!
 

amdfangirl

Expert
Ambassador
The HD 4890 I'm using right now is one a friend threw away (at me). I picked it up and saved myself the $229 it would have cost to get a GTX 660 :). It had "RGB sand" artifacts. One application of the fridge trick and it was fine.
 


Thunderpants couldn't find his trousers with both hands, let alone a courthouse :kaola:

Besides, unless he sproinged a new alias in the last 2-3 years, he hasn't been on THG with one of his drive-by flamethreads :D.
 


Heh, I wouldn't base US anti-trust law on what the EC does - they are pretty much in the 'fine the big US corporations' business to max out their funding, plus they encourage Euro companies to compete by discouraging US ones.
 


As far as AMD CPUs go, to start with: back in 2005, SOI wafers cost around 3X what strained-silicon wafers went for, at least at 300mm size: http://www.forbes.com/home/free_forbes/2005/0919/072.html

Of course, that is just the starting material, but my understanding is that the processing steps are somewhat different for SOI than for strained silicon, not to mention the design. And AMD/GloFo was unable to find any other customers for SOI, which meant AMD had to bear the fab costs all alone.

Second, lower yields as a result of a bigger die size mean that an already more expensive fab cost gets magnified even further. So Intel has a lower production cost than AMD does; also, GF has to make some sort of profit, which means yet another expense for AMD.
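
To put rough numbers on how a bigger die and lower yield compound the wafer-cost gap, here's a quick back-of-the-envelope sketch in Python. All the figures in it are made-up placeholders, not numbers from the Forbes article; it just shows the arithmetic:

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard rough estimate: wafer area divided by die area,
    with a correction term for partial dies lost at the wafer edge."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost, die_area_mm2, yield_rate):
    """Wafer cost spread over only the dies that actually work."""
    return wafer_cost / (gross_dies_per_wafer(die_area_mm2) * yield_rate)

# Hypothetical numbers: a 3x pricier SOI wafer plus a bigger die at a
# lower yield compounds into a several-fold gap in cost per sellable chip.
print(cost_per_good_die(wafer_cost=2000, die_area_mm2=220, yield_rate=0.6))  # strained-Si-like: ~$12
print(cost_per_good_die(wafer_cost=6000, die_area_mm2=320, yield_rate=0.5))  # SOI-like: ~$66
```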
 


It really depends on how Intel does the accounting for each department of the company, but if they use the "internal partners" model (one department sells a product or service to another), then yes. Experian uses this model, but I don't know the exact name for this way of doing it. Maybe it's just "business", hahaha.

Each dept has its own budget and income, etc.; but in the big picture, I'd say they don't need to "make a profit", just cover costs when selling internally.
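
As a toy illustration (hypothetical numbers and departments, just to show where the profit lands when one side sells to the other at cost):

```python
# Hypothetical numbers only: the fab "sells" chips to the product group
# at cost, so the fab's books just break even and all the profit shows
# up where the external sale happens. Company-wide profit is unchanged.
fab_cost = 40.0                  # what the chip costs the fab to make
internal_price = fab_cost        # internal sale at cost, no markup
external_price = 120.0           # what the product group charges the market

fab_margin = internal_price - fab_cost            # 0.0 -> fab just covers costs
product_margin = external_price - internal_price  # 80.0 -> profit lands here
print(fab_margin, product_margin, fab_margin + product_margin)
```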

Cheers!
 


Only if they INTENTIONALLY abused their market position. Monopolies based on TECHNICAL advantages (e.g., simply being better) are still legal under the Sherman/Clayton Antitrust Acts anyway.

Now, if Intel actually took a loss to undercut AMD, THEN AMD would have a case.

Secondly, Intel makes CPUs. Microsoft makes OSes. If the fact that Windows/OS X has ~98% share benefits the X86 architecture, well, good for X86 as far as the law is concerned. Feel free to drag MS/Apple into a courtroom to explain their love affair with X86, but legally, that's not Intel's problem.
 

hansmoleman1981

Honorable


DING DING DING DING DING

BINGO

Compute performance that no one cares about or asked for in a mainstream consumer GPU.

ROFL. :pt1cable:

Someone will undoubtedly say, "DURR, I am a super crazy developer/animator who does a lot of rendering, and needs compute."

OK, you have a big ****; no one else cares. You alone are not responsible for driving GPU revenues. It's the kid who wants 90 fps in BF3 who drives GPU revenues.
 

hansmoleman1981

Honorable
AMD fanboys and those with a myopic viewpoint are propagating a myth that AMD's survival is necessary to keep Intel/NVidia in check. This myth is at odds with reality.

1. The HPC era is winding down, from a mainstream consumer perspective. Nobody cares about gigahertz and cores anymore; it's not 2005 (coincidentally, the year AMD stopped being relevant). Dwindling PC sales and virtually every market forecast indicate that the desktop PC will be highly irrelevant to the average consumer.

2. The future is mobile. The future is SoC. "IF AMD NOT THERE, WHO WILL STOP INTEL/NVIDIA 111!!" AMD is just the first casualty. NVidia will also go the way of the dodo. No one needs to "stop" NVidia; if they don't adapt, they will die too.

3. You don't NEED a GTX 670 to game at playable framerates in most games. http://store.steampowered.com/hwsurvey
Only a small minority of users have the latest and greatest in their machines; even the most common single GPU in the survey, Intel HD Graphics 3000, tops out at 4.25%. The only reason GTX 670s/Radeon 7990s sell is that nerds want e-peen and frames they'll never see. Almost every PC game is optimized to run on cards from two generations ago.

The industry is changing in a big way. The future is SoC. AMD is just the first casualty, and their dying does not mean you'll be forced to buy NVidia cards for $700. NVidia will also go the way of the dodo, because dGPUs are legacy, not the future.

4. DX9 is still the standard. Windows 8 only brings DX11.1. Nothing significant.

Basically, I'm making the point that you don't need the latest and greatest *** to play a video game. The bulk of the market spends less than $200 on dGPUs. http://www.xbitlabs.com/articles/graphics/display/nvidia-geforce-gtx-460.html Just because AMD doesn't exist doesn't mean NVidia can magically start charging $$$ for GPUs... people will just stop buying them if the price is too much to bear.

So...AMD can burn in hell and nobody will notice.
 

griptwister

Distinguished


You're forgetting about the enthusiasts... If FPS and maxing out games weren't an issue, then why the hell is Intel leading the CPU market?

Nvidia should buy out AMD to compete with Intel... Intel will be making GPUs soon; it should be interesting to see if they get into the gamers' market.

Okay: to max games out, with ALL the eye candy, and still get a good framerate, you need a decent GPU. Hence, the GTX 670 is a great value.

DX11.1 has a noticeable graphics difference compared to DX11. DX11.1 WILL be used in the next-gen consoles. And Windows 7 will be (or already is) DX11.1-compatible... Look it up.

Games are getting more demanding... they're not reaching their full potential on consoles; however, PCs are a different scenario.
 

noob2222

Distinguished

It cracks me up when people think AMD overpaid by $2.5B for ATI, when it was AMD that put the $2.5B loan against ATI's assets and not AMD's. One company or the other was going to drop in value by $2.5B, and only one did.

A $5B company with a $2.5B loan against it has a net value of $2.5B. ATI DIDN'T TAKE OUT THE LOAN TO BUY ITSELF; AMD DID.
 


You really think AMD or nVidia design their GPUs just for consumers? You do have a point in that regard, though; the number of people who actually want/need compute on a video card is small, to say the least, but it is a market and you have a choice. OTOH, render farms (or HPC in general) do use these GPUs, and nVidia has been very successful there; that's why AMD had to come up with a solution to compete in that market, since it pays really well. Not volume, but pure profit.



AMD is relevant to the market not because it keeps prices in check by itself, but because it makes nVidia and Intel try harder each generation. You think Intel would have come up with Conroe without AMD's pressure? We'd be stuck with the Pentium 4's design to this day, I'm sure.

This is economic theory, not fanboyism.



Ugh, "gamers" don't drive markets; or at least, not "PC gamers". Big publishers just target profitable platforms that are widespread and under control (consoles), so they turn a profit in the short/medium term (quarter-by-quarter mentality). I do agree that PCs now drive innovation, but I don't really see developers profiting from big innovations in the PC sector; just incremental refinements of proven mechanics. I've talked about this with a friend of mine, and I always tell him that Nintendo is the only innovative company left for gamers. They try new ways of interacting with games, sticking their necks out when they can. MS and Sony just use proven tech and do "leaking" innovation (not big improvements, but increments or copies).

Anyway, big and expensive video cards are somewhat justifiable because of bad porting or bad coding from consoles. People with regular computers can't max out most games not because of hardware limitations, but because of problems with some settings across the board. Too much hardware variety makes it a coding hell for devs. Even more so when publishers don't give money to polish games down the road (bug fixing, basically) and just want to turn a quick buck.

Cheers!
 


?? When you buy a company like ATI, you buy its assets and its outstanding liabilities. If you don't have enough cash to pay for it, you take out a loan against the merged assets (which is what the senior notes are; IIRC they are "senior" in that they take precedence over other AMD obligations, such as the outstanding stock sold to stockholders). It's immaterial whether it was "ATI assets" used to float the loan, as those became AMD assets once the buyout occurred.

If you read the article, or are at all knowledgeable about the deal, you should remember that AMD wrote down the book value of ATI (i.e., the so-called "goodwill" tax writeoff) by over half of the $5.4BN, which is basically an admission by AMD that the current value of ATI is lower than what AMD paid for it.

Think of it as buying a house for, say, $500K using both cash and a loan, and then finding out 2 years later that the house is only worth $200K. Your net worth decreased by $300K, and it doesn't matter whether you took out a mortgage against the house or hocked the family jewels at a pawn shop :kaola:.
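
Spelling the arithmetic out (the house figures are from the analogy above; the ATI line assumes the deal's $5.4B price with the post's "over half" writedown as a rough placeholder):

```python
def net_worth_hit(purchase_price, value_now):
    """The loss is the drop in value, regardless of how the purchase
    was financed (cash, mortgage, or the family jewels)."""
    return value_now - purchase_price

print(net_worth_hit(500_000, 200_000))  # -300000: the house example
print(net_worth_hit(5.4, 5.4 - 2.7))    # about -2.7 ($B): ATI after the writedown
```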
 
On a positive note for AMD: http://seekingalpha.com/article/1027391-amd-s-sleeper-agent-temash-for-tablets?source=msn

What Is Temash?

Simply put, "Temash" is the codename for AMD's next-generation low-power X86-64 APU, based on the next-generation low-power "Jaguar" core. A slide from the company's presentation at "Hot Chips" revealed a number of key details about the product.

Basically, this new chip is a very serious improvement over AMD's already successful "Bobcat" APU. It will run at higher clock speeds, do more work per clock, come packed with more cores, have an enhanced instruction set, and sport a wider floating-point unit. Part of this is due to a new design, and part is due to a shrink from TSMC's (TSM) 40nm process to its 28nm process.

In terms of performance and performance-per-watt, the chip should be a significant improvement over the last-gen "Hondo" tablet part, which has seen very few design wins because it is power hungry and not a fully integrated SoC. "Temash" changes that by being a full SoC (meaning lower platform power).

Simply put, this chip could be what AMD needs to try to revive itself. With a ground-up, redesigned low-power X86 core, AMD has a real chance of threatening Intel (INTC) and the various ARM (ARMH) licensees such as Qualcomm (QCOM) and Nvidia (NVDA) at the low end of the market (tablets, netbooks, cheap notebooks). A power-efficient, low-power, cheap-to-make X86 SoC that combines AMD's graphics expertise and X86 license could work out well for the company, especially as the firm restructures itself.

This could be the kind of focused product that AMD needs to swing back to profitability.
 
Ah, sounds like good news indeed. Now... the thing is, "when"?

According to that article, it should be out by Q4 2013, and that means a long wait. Intel will have the next-gen Atom on the market by then. It had better be one hell of a product, because by that time ARM will also have a lot of competing designs out the door. Samsung should be playing with the next Exynos at 28nm as well; maybe a quad-core or something, since the Exynos 5 is 32nm with dual A15s.

Cheers!
 
Here's the problem with X86 in mobile: how many apps are written for Android/iOS on the ARM architecture? It's going to be REALLY hard for X86 to break in now. X86 in mobile is basically reliant on Windows 8 for its success.

EDIT

Basically, it's the exact same situation as ARM on the desktop: with so many apps written against X86, what are the chances it can make inroads? Not very good.
 

mayankleoboy1

Distinguished
^

IIRC, Medfield had a sort of compatibility layer that made running Android ARM apps easy. At the time of launch, per the AT review, quite a large percentage of the apps they tested worked.
 

hansmoleman1981

Honorable


You have no idea what you're talking about. Either provide proof of these phantom "rendering farms" and the millions and millions of HPC clients that need all this GPU compute power (the Titan supercomputer doesn't count; provide proof of mainstream GPU compute needs), or shut up. Also, if you do use Titan as an example, you'd be wrong, because Titan uses Tesla GPUs, not GTX 670s. Specialized products for specialized markets.

AMD is relevant to the market not because it keeps prices in check by itself, but because it makes nVidia and Intel try harder each generation. You think Intel would have come up with Conroe without AMD's pressure? We'd be stuck with the Pentium 4's design to this day, I'm sure.

This is economic theory, not fanboyism.

--------------------------------------------

This is like saying Fiat keeps Ferrari in check because both of them make cars based on internal combustion engines.

You're ignorant of market forces and of AMD's relevance to the new market.

Get lost
 

amdfangirl

Expert
Ambassador


There will be people like me who will buy into AMD's Jaguar-based APU on the simple premise that I can get a $400 ultra-portable that runs Windows 7 for Photoshop, Illustrator and LibreOffice (I hate change, and as a consequence tablets, because I'm an old grump :p).

Here's a guaranteed sale for AMD :3.

All they need to do is make the Lenovo x120e (Brazos) I have right now about 300g lighter, and I will buy a new one next year with this chip in it :).
 


The broader HPC market is nearly $19B.

http://www.hpcwire.com/hpcwire/2011-06-21/idc_shares_hpc_market_figures,_trends,_predictions_at_isc.html

If you really think Intel is getting in there for pocket change, you need to think it through a little harder.



Good thing you brought up that particular example, because it is interesting to see how FIAT actually manages Ferrari as a brand. Although Ferrari doesn't compete with FIAT directly, they do share technology. So, in that sense, it would be like saying Atom is fighting for i7 market share... which it actually is; that's why Intel is moving to lower power envelopes going forward, haha.

You should have used Lambo and Ferrari, or Porsche and Ferrari. Now, AMD is not in the extreme or enthusiast markets, so you could say FIAT and Volkswagen, or FIAT and Toyota. If you take a look, whenever Toyota puts something out, everyone notices and then starts doing something similar. Audi, Toyota, Volvo and BMW are developing really interesting engine tech that makes all other vehicles look old.

Well, long story short, the "market" is driven by the variety of companies serving its different needs. If you remove companies until only 1 or 2 remain, innovation and competition deteriorate as a whole. Unless you want to try a communist POV in that regard, I'd say the industry doesn't work with only a few players driving it.

Also, why so much rage? Did something bad happen to you today?

Cheers!
 

hansmoleman1981

Honorable
Nov 21, 2012
12
0
10,510
 
 
Status
Not open for further replies.