How will AMD stay alive?


I would never condone the use of violence by humans on rabbits or baby seals, but the use of violence by rabbits and baby seals on humans is something I would pay to see. :lol:
 


It's the reason we have VIA.
 


Actually, that's exactly the approach VIA took, so they pretty much just thrive in their own market, far from Intel and AMD. IIRC, Intel engineers actually have more respect for VIA engineers than AMD engineers. :lol:
 
One thing about Intel, and its arrogance towards GPUs: where's LRB? They said it'd be out in two years, vying with the best.
Problem is, the best just got better, again, and will do so yet again before LRB comes out.
Here's a field Intel leads in where they can't even touch the high end, after however long of trying to make a truly competing product, boasts and all.
The GPU market isn't the slow, plodding CPU market; it's fast paced, and just having CPU cores and die shrinks isn't enough. You have to double perf, not just add 5-10%.

Honestly, Intel put their foot in it when they said it'd be out and competing. The last perf claim on LRB was the 285, which is now but a mid-range card, and it's still not here.
If Intel's the graphics leader, as a lot of people have claimed, pointing to sales time and again, then they'd best get off their butts and show it, and the sooner the better, as the target they keep aiming at is moving fast.
 

I'm sorry, but get your facts straight. VIA own the market.

[image: via_domination.png]
 




VIA chipsets are a disgrace, up there with nothing but useless USB. Their chipsets were the slowest on the block next to SiS.

This is why VIA are no longer making chipsets. The EPIA boards had their niche, but always came out as the cheapest option.
Onboard graphics were poor, and so was the audio codec.


To me, VIA is the Wacky Races of chipsets: to be forgotten until they do something cool again.
 



Oh, and the S3 Savage - what was that about? Surely the worst graphics card since CGA. Awful - absolutely awful.

I'm talking myself into becoming an Intel fanboy... nah, I just like Intel's chipsets. (Well, apart from the G31 on an ASRock motherboard - put some in the other day and only the backplate has 2W of USB power. Useless.)


Despaired over the 810 chipset too.

Long live the BX, X45 and X58 - not tried a P55 yet, so I'll let you know - anything with an X in it, really (well, apart from AX and FX - eww).
 


Now, now - you're already forum-married to TC! 😀 No cheating on the husband 2 days after your nuptials - I think it's against BOM's TOS maybe :).

 


Yeah, although I'd pay actual money to see all 17,468 ENIAC vacuum tubes integrated on a 200mm die! :)

According to Wiki, the thing took up the floor space of a small gymnasium and required something like 150KW to power. And we complain about the 200+ watts that a GPU uses :).
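For scale, here's a quick back-of-the-envelope sketch in Python (the wattages are just the rough figures cited above, not measurements):

# Rough comparison of ENIAC's power draw vs. a modern GPU.
# Figures are the approximate ones quoted above: ~150 kW for ENIAC
# (per Wikipedia) and 200+ W for a power-hungry GPU.
eniac_watts = 150_000
gpu_watts = 200
print(f"ENIAC drew roughly {eniac_watts / gpu_watts:.0f}x a 200 W GPU")
# -> ENIAC drew roughly 750x a 200 W GPU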
 


Well we could say the same about Bulldozer 😀. Supposed to be out LAST YEAR according to the original roadmaps in 2005, IIRC. Then it silently sank beneath the waves as AMD took it off the roadmaps, but resurfaced with a belch and a fart after bottom-feeding a while to give AMD marketing time to think up the term "fusion". :kaola:

Larrabee is still A0 silicon and currently appears to be at the midrange level. By the time it comes out next year, hopefully it'll have improved enough to be competitive. I'm willing to wait and see what develops - after all, as you have said many MANY times 😀, 'competition are good'!

 
Well, to be fair, most of these Intel nuts will still be using an 8800 GTS, believing that was when progress died on GPUs. Larrabee might seem pretty good in comparison. 😀
 



OK, so LRB will be outdated in its current form; Intel have said this in a previous statement, but then again, at least Intel have been upfront about it. They still have more funds than anyone else to get this right and out at the right spec the moment it is released.

Intel must have had LRB in tests at different performance ratings. It's based on a modified P54C. What if they use Atom cores instead in their next version? That would be even quicker, as an Atom is about on par with a 1.6GHz P4.
 


OK, first of all you tell me I'm out of my depth on a 'joke' point I made, and now you come out with this?

You *really* don't understand the concept of miniaturisation, do you? Atom = 45nm, P54C = 0.35 micrometres? You realise that in order to get those 40 cores or whatever on a die, they need to be reeeeeeally small? Maybe in 5-6 years Atom cores would do, but that's no good now.
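To put rough numbers on that (a sketch only: it uses the node figures quoted above and assumes core area scales with the square of the feature size, which is a crude approximation):

# Why the process node matters for packing many cores on one die.
# Assumption: core area scales roughly with feature size squared.
p54c_node_nm = 350   # P54C-class process, 0.35 micrometres
atom_node_nm = 45    # Atom's 45nm process
linear_shrink = p54c_node_nm / atom_node_nm   # ~7.8x per side
area_density = linear_shrink ** 2             # ~60x more cores per area
print(f"~{linear_shrink:.1f}x linear shrink -> ~{area_density:.0f}x density gain")

So under that crude assumption, a P54C-class core laid out at 45nm takes roughly 1/60th the area, which is what makes a 40-core die plausible at all.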
 


You really think a company's R&D division would be stupid enough not to account for the fact that technology is constantly improving?
 
They're spending billions on this, if there's any truth to the costs of previous releases.
So, as they shrink, so too do the GPUs. If they aren't there now, then when they do arrive, the target has moved once again.
New cores? Sure, same for GPUs.
Why would GPUs all of a sudden just stop?
This isn't a CPU scenario, as I've stated. This isn't AMD releasing every 3 to 4 years; these are GPUs with entirely new arches and huge IPC improvements per arch change, not just 20%. The 20% is seen in a dumb shrink, and more, if needed.
GPUs are much more flexible than CPUs, and as I posted way back when, even if LRB comes out at a set design, it's a small matter for the GPU makers to just adjust their GPUs to make them better, or castrate them to make them worse, depending on yields and market positioning.
LRB is aiming at a very fast-moving target, a very flexible target too, and a target that's been around a lot longer than LRB, one that was created because of CPUs' failure to do this very thing.
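To illustrate that moving target with made-up but plausible numbers (the growth rates here are assumptions for the sake of argument, not measured data):

# The "moving target" argument: if GPUs roughly double performance each
# generation while a latecomer is tuned to match today's flagship, the
# gap reopens with every generation of delay.
gpu_growth_per_gen = 2.0   # "you have to double perf"
shrink_gain = 1.2          # the ~20% from a "dumb shrink"
target = 1.0               # today's flagship, normalised to 1x
for gen in range(1, 4):
    target *= gpu_growth_per_gen
    print(f"after {gen} GPU generation(s): target = {target:.1f}x")
print(f"a late part at {shrink_gain:.1f}x today's card trails by {target / shrink_gain:.1f}x")
# A challenger that matches today's card and then banks a 20% shrink
# (1.2x) is already well behind after one GPU generation (2.0x).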
 




Have you had kids, JennyH? Because all rationalisation has gone out the window now... and when women give birth, this happens.


I will rephrase for the hard of reading: Intel will produce a processor as fast as an Atom at a similar die size to what the P54C is.

When the P54C came out, it was much larger than an Atom is now. This is called progress.

There will be a quad-core Atom at some point, and so on.

Intel will progress this technology, unlike AMD, which recycles technology from the old Athlon 64.

Intel will improve on the P54C by uprating the processor at some point. The next level, if you like.

You're slating something that hasn't even been released, tested or benchmarked. I'm basing my argument on the advancements of technology I have seen over the last 30 years.

I just don't think Intel will repeat its i740 video card scenario from 10 years ago (ish).

Oh yeah, and one thing you forgot is that AMD bought ATI because they were crap at video cards - I bet AMD designed one and gave up, like they did with their chipsets.

The more I think about it, AMD's stance looks worse every time I consider what Intel are doing and what they're planning behind closed doors.

Intel have made video cards from pioneering technology, built from scratch, which will come up trumps in the end.

Intel have enough money to fail until they get it right - they already bought Havok, if I remember correctly.

Now that the Havok engine is used in a lot of games, this just adds another arrow to Intel's quiver against AMD and Nvidia.

I'm not saying it's panic stations, but the AMD camp's rosy outlook seems a bit sparse. AMD know it, and so does Intel.
 


Yes, it was really miserable for AMD; at the very least, they gave Intel a good fight. Maybe what they should do is sell AMD to Intel and be one big happy family. =) But on second thought, this isn't a good idea either, since Intel would have a complete monopoly of the CPU industry and we would end up buying expensive CPUs.
 


There is no way at all that Atom will be small enough 'next year', Hellboy. I do believe this is something people just aren't getting. The current Larrabee was supposed to be as fast as a GTX 285... you could have fooled me, looking at that demo at IDF.

You have to assume that is at 45nm, with 40 of these P54C cores. So Intel are close to a shrink to 32nm... that's great and all, but it is not going to be enough. At 32nm it might even be as powerful as a GTX 285... but the GTX 285 has already been left behind. Neither ATI nor Nvidia is standing still (well, OK, Nvidia might be, lol).
 
What some of you are basically saying here is 'Larrabee will progress', but you seem to believe that ATI and Nvidia won't? At the stuff they've been doing for 10+ years?
 