How Will AMD stay alive?


First of all, you're still not getting it. I said that with an SoC setup whose on-die IGP is as powerful as a midrange card is today, say a 4850, and with further advancements in OpenCL, compute shaders, etc., it will have an impact, and I didn't mention gaming at all. That's where people are completely missing the point, and when they do mention it, it's about ray tracing, which isn't going to happen.
I'm just trying to get people to understand where things are going. If you've only seen the CPU side of things and haven't seen the strides nVidia, at least, has been making to bring a lot of apps to the GPU and widen its reach, they're coming: the software is here or coming, and the hardware is capable and getting better at a pace CPUs can't match.
A $10 GPU plus a $20 CPU stomping that $20 CPU by itself is what I'm talking about, and people don't seem to get it yet.
The gaming part: take Batman: AA. If you'd followed it, you'd see that where the CPU can't keep up with the physics, the game suffers, but adding a small IGP alongside a graphics card lets it work, whereas even with Intel's best cranked up, the fps sucked.
That's just games; there are other uses for this as well, as LRB, Stream and CUDA will show.
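For the sake of illustration, here's roughly what that kind of offload looks like. This is a minimal CUDA sketch of my own (not something from the thread): a simple y = a*x + y over a million floats, the sort of embarrassingly parallel job where even a cheap GPU, running one lightweight thread per element, pulls away from a CPU core grinding through the same loop serially.

```
#include <cstdio>
#include <cuda_runtime.h>

// One lightweight GPU thread per array element.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                        // ~1 million elements
    const size_t bytes = n * sizeof(float);

    // Host-side data the CPU prepares.
    float *hx = new float[n], *hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copy it to the GPU's memory.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch thousands of threads at once; the GPU schedules them in bulk.
    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 2.0f, dx, dy);

    // Bring the result back and spot-check it.
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);                 // expect 4.0

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
```

The point isn't this trivial kernel itself; it's that physics, video and stream-style workloads decompose the same way, which is why the small-GPU-plus-CPU combination can beat the CPU alone.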
 



Just out of curiosity...

Imagine this: Intel and nVidia work together and make special Intel-only optimizations that make GeForce cards run faster, but they only do so on Intel processors.

Would you not be calling for an anti-trust lawsuit in intergalactic court? Would you not say, "INTEL ARE CHEATING!!?!"?




I'm not saying that I disagree with AMD making such an optimization (in fact, I hope they do); I'm just wondering how you would react if the shoe were on the other foot.
 


How am I wrong? I have never posted anything that can be proven wrong, only hypotheses.

Lay off the vodka and get a life.
 
No, and unless LRB is a huge success, even more of a no.
CPUs have about reached the end of their growth; they're hitting middle age, if you will, and the disparity between them won't be a lot. Just as Phenom II caught up in huge ways to Intel's offerings, so too will AMD's next iteration close that gap further, simply because, as I've been trying to point out here, CPU growth is somewhat stifled, and AMD still has most of that growth ahead of it.
That's why there's LRB and Fusion.
 


I see what you're saying here. Core 2 squeezed out the IPC. Then they added more cores, then they ramped up the speed.

Should be interesting, as ~3.0GHz seems to be the clock-speed ceiling, which I would have NEVER predicted years ago. I thought clock speeds would keep going up just like everything else.

So we can't really go further with clock speed... We can add more cores, which sounds nifty, but without software being written for more than 4 cores it really doesn't matter...
 
They both, AMD and Intel, need GPGPU to work, as MT has its limitations, as does the number of cores one can use at once.
It's when you go to many cores, as per my earlier link, that things change: more things can actually get done, faster and more efficiently, and better software can be written with fewer restrictions, etc.
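To put a rough number on those MT limits, here's a quick Amdahl's law back-of-the-envelope of my own (an illustration, not something from the post above): if 80% of a program's work can run in parallel, the best speedup on n cores is 1 / ((1 - 0.8) + 0.8/n). Four cores give about 2.5x, sixteen cores about 4x, and even unlimited cores never pass 5x. That's the wall conventional core counts hit, and it's why the many-thread GPGPU route only really pays off on work that is almost entirely parallel.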
 
It would help, but it would still have all the current restrictions CPUs are running into now; after the 64-bit boost, the IPC, core-count and software MT limitations would still be here.
Adding "shader" units in a well-thought-out, high-bandwidth communication design is the future.
It'll do what CPUs are slow at; it'll handle many threads. It's basically what we'd like to see in our CPUs but can't be done, at least not to the extent we'll see five years from now. As CPUs slow down in growth, they need something to retain that growth.
Then it'll be time for new materials, etc.
 
OK, I've been a bit harsh on ol' LRB, and here's why:
"Dr. Who, I'm sure you realize why so many nay-sayers...

You see, the graphics war, to many of us, is far more interesting than the CPU war has ever been. The CPU war is pretty cut and dried for most of our needs: we either go with one company for budget or the other for performance, and if we're going to game on the system we generally don't have to worry much about which brand of CPU we buy right now, which is why Intel wants into the GPU war in the first place. With GPUs, on the other hand, things change.

1 card can absolutely MURDER in every game, then comes a new game and the tide completely changes, or a driver comes out and everything completely changes. As such, when it comes to gpu's everyone is ALWAYS on a "put up or shut up" kick. Reason being? Every time in the graphics market that we heard huge things about a design for an extended period of time before the release, it's failed. Remember the R600? Remember the FX5800? Yeah...

Now you see, we've been hearing this and that about LRB for quite a while... I seem to recall hearing it's coming soon when I bought my old 8800GTX. Here we are, three years later, and the only thing we've seen is a limited ray-trace demo of poor quality that ran pretty slow and the camera never moved a notch. You really can't blame people for being skeptical at this point, because frankly this isn't a situation like other companies'. People are saying NVidia has nothing because the 5870 launched less than a week ago, and Intel has had us waiting for years...

Yes, Intel has proven themselves, everywhere but one place... The gpu market, which is exactly where they're trying to go. Intel doesn't exactly have anything at all to show for themselves when it comes to said market, as their IGP's are royally under-powered and actually cited by major game developers as part of the reason for the collapse of the PC gaming industry. Not a good start, especially for the enthusiast market. Just notice these guys want numbers, and they want them yesterday, most don't understand that telling performance early is SUICIDE as your competition knows what they're preparing for in that case. Like I pointed out earlier, they're grilling NVidia (and NVidia has had a VERY solid track record as of late when it comes to their new architectures for gpus) just because ATi launched early and caught them off guard. Don't think they're going to give intel any special treatment just because of conroe... They'll instead remind you how long it took for intel to go to conroe.

TL; DR version - Don't announce a GPU over 3 years in advance, delay it, and show a demo that does nothing for anyone watching and on top of that runs slow and the camera doesn't move. You are liable to get grilled by the enthusiast community for doing so."
http://www.xtremesystems.org/forums/showthread.php?t=234780&page=7#post4036476
So, for those who've been really keen on CPUs, that's a GPU perspective; it isn't the same animal.
 


Yes, I read that when it was originally posted. Complete speculation. Intel broadcasts its roadmaps at least a year in advance. Where's the PowerPoint slide from an IDF saying "Larrabee: Coming in 2009!"?

Intel's been very careful not to give a timeline, IMO, because they recognize this is not going to be an easy business to get into and they don't want to have market anticipation without a product to back it up. That spells lack of confidence for software vendors, and that's the last thing anyone wants.
 
Intel once made crappy chipsets ... after the BX era.

They were crappy for a few years ... laughable in fact.

Now Intel's chipsets are considered the best ...

I imagine they will do the same wrt graphics ... the quiet we hear now is the quiet before the storm.

Soon it will force NVidia to the wall ... possibly after AMD or before ... who knows?

Then there will be one major cpu / chipset / graphics manufacturer for X86.

That is obviously their ultimate game.

What do we do then?

Without a number of players in the hardware and software market the industry will grind to a halt.

No more innovation due to competition ...

Prices fixed at whatever they want them to be ...

You can bet they will start with more hardware tricks to prevent overclocking ...

Then we will each have a lowly Atom cpu because that's all we will be allowed.

Maybe that's why Apple is looking at their current Intel CPU's and thinking laterally ... might be time to look at the Power chips again.

Who wants to be stuck with one supplier?






 
And thus my post above quoting another person's summation of LRB.
I agree with that summation, and I think it needed to be pointed out here.
I've tried to explain the difference between the comparatively simple CPU conflicts and what happens in the volatile graphics industry, and that's where Intel is heading with LRB. It doesn't matter whether it's Intel or not; it isn't a "what have you done for me lately" market, it's a "what are you doing for me NOW, and hurry it up while you answer" market.
I understand this; I guess others don't. I also understand real-time ray tracing, and the costs of running it.
Sure, those quotes aren't PowerPoint slides, etc., but they were all from Intel employees here and there, and like I said, there's more if you look.
It's Intel's fault: they want to keep the LRB flame going, to be seen in the distance, without giving anything away, and they have to keep saying it's close, etc.
 


I hate to keep harping on this, but show me a quote from an Intel employee. This is all second or third hand stuff from vendors, magazines, etc.

Until it's on an external roadmap or at the very least the product is placed by a named Intel source, consistently saying Larrabee's "late" is profoundly misleading.

It may in fact be "late" based on an internal, best-case schedule never published to the world. But I'm having a hard time agreeing with you considering it "late" when there's been no announcement of schedule.
 
Those PowerPC CPUs were very good.

The Xenon and Cell survive today in the Xbox 360 and PS3.

They might end up the only future alternative to Intel the way things are going.

A much better CPU, too.

x86 is the problem ... the whole backward-compatibility burden that plagues x86 isn't there in the Xenon and Cell CPUs.
 


Otellini mentioned 2009 here, and I'm sure there are other times as well:

http://arstechnica.com/hardware/news/2008/01/larrabee-becomes-laterbee.ars

"I still think we are on track for a product in late 2009, 2010 timeframe."

 


Sure, I know, but it's not logical to reverse a transition process. They'd piss off so many software devs. It would be more of a backwards move than a forwards one.
 
Not for LRB they don't, and they've screwed the pooch, going by all their people saying when it'll be out, and it still isn't out. And it won't be until who knows when; they can't even say. But they keep it alive in people's minds anyway, and they can't deny they've done this either.
 
HP's Core i7 PCs show blue screens of death

HP APPEARS to be having major problems with its entire line of Core i7 PCs.

According to HP's support forums, the range is fast developing a reputation for locking up, freezing and throwing BSODs.

http://www.theinquirer.net/inquirer/news/1556515/hp-core-i7-pcs-blue-screens-death

Maybe AMD doesn't have to worry too much when a vendor this size is having such problems with i7 PCs.

Must be a resistor, eh?

Damned resistors!!
 