How will AMD stay alive?


Sweet, I like breasts.

Anywho, just wanted to add: even as an Intel user, I'm not holding much faith in this LRB project. I've never used Intel-powered graphics that even compare to the competition, even on-board vs. on-board. I know things can change, but graphics is not an area where I'm confident in Intel.

Also, to the above posts: a core isn't a set size. If the core does less, or uses fewer transistors, then it's smaller. A "core" really just means an independent circuit, similar to threads in software. GPUs are designed to do the same thing hundreds or thousands of times, so they have lots of smaller cores rather than one big core that does everything. So 240 shader cores on a GPU isn't necessarily ahead of even a dual-core CPU, especially since most GPU cards I've seen barely reach 1 GHz.
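To picture that "same thing hundreds or thousands of times" model, here's a minimal CUDA sketch (the kernel, sizes, and names are made up purely for illustration): each thread is one tiny, simple job, and the card schedules thousands of them across its small cores at once.

```
#include <cstdio>
#include <cuda_runtime.h>

// Every GPU thread runs this same tiny function on one element --
// many small, simple cores doing identical work in parallel.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;                 // ~1 million elements
    float *h = new float[n];
    for (int i = 0; i < n; ++i) h[i] = 1.0f;

    float *d;
    cudaMalloc(&d, n * sizeof(float));
    cudaMemcpy(d, h, n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch thousands of lightweight threads at once -- the GPU model,
    // versus a CPU's few big cores each doing everything.
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    cudaMemcpy(h, d, n * sizeof(float), cudaMemcpyDeviceToHost);

    printf("h[0] = %f\n", h[0]);           // expect 2.0
    cudaFree(d);
    delete[] h;
    return 0;
}
```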
 
It does matter. I never said it was the only thing that matters, or that it was more important than good architecture.

It was a simple example to illustrate that GPU tech isn't grossly ahead of CPU tech, which was implied by a post above. Probably not the best example, but it certainly wasn't some 'zomg mhz is everything dawg' comment.

By your comment I could say "by that rationale we should underclock our computers and video cards as far as we can, because clock speed does not matter" - but we know that's not right either.
 
Clocks do matter, and having a shader clock of, say, 1.7 GHz vs. just 1 GHz lets nVidia get by with fewer shaders.
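As a rough back-of-the-envelope (ballpark figures, not exact specs for any particular card), shader throughput scales roughly with unit count times clock, which is why a higher shader clock offsets a lower shader count:

```
#include <cstdio>

// Hypothetical numbers: throughput scales roughly with shaders * clock.
int main() {
    double hi_clk = 240 * 1.7;   // 240 shaders at 1.7 GHz -> 408 "shader-GHz"
    double lo_clk = 240 * 1.0;   // same 240 shaders at 1.0 GHz -> only 240

    // To match the 1.7 GHz part while running at 1.0 GHz,
    // you'd need roughly hi_clk / 1.0 = ~408 shaders instead of 240.
    printf("1.7 GHz part: %.0f, 1.0 GHz part: %.0f shader-GHz\n", hi_clk, lo_clk);
    printf("shaders needed at 1.0 GHz to match: ~%.0f\n", hi_clk / 1.0);
    return 0;
}
```

Of course, the two vendors' shaders don't do the same amount of work per clock, so this only shows why clock partially offsets count, not which card wins.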
I wouldn't worry about G300 at this stage if I were AMD or ATI, as its market isn't the same, it'll be priced higher, and its functions are aimed at CPUs and LRB for HPC usage, etc.
 
It's sort of like saying the i7 975 competes with the P2; it doesn't, as its pricing prevents this.
What will be telling this go-round is whether nVidia's claims about scaling down the new chip actually happen, how well it does it, what the costs are, etc.
I'd be more worried right now if I were on the LRB team, as the ante has been upped tremendously for games, and the G300 has just come roaring into the GPGPU market.
 

Warning! Theo Alert! Theo Alert!
 
Theo Valich, former Inq "journalist" turned renegade "insider." Not quite on the level of Fuad (i.e., he doesn't post contradictory articles within 10 minutes of each other), but not exactly a concrete source either.
 


I think he meant more than chipset/CPU/GPU. I think he meant EVERYTHING.

And yes, ATI is a great asset. They still overpaid, but still a great asset. Without them I am pretty sure AMD would have hit its lowest point a few months ago.



Unless they can up the shader clocks, I don't see much there. And still, it's taking them 1.5 years to adopt GDDR5. ATI is always ahead in that respect, always pushing newer tech first. They went with SPs first. It's just funny to watch nVidia take it and find a way to outdo them with lower-end or fewer parts.



Ok so ATI currently holds the crown for both single card and dual card solutions in gaming. Why would nVidia even worry about Intel right now when LRB is not even near release?

I think G300 is just nVidia's next step to try to compete with ATI. Sure, LRB will be there when it's there, but nVidia is such an egotistical bunch that they probably view LRB as an ATI Rage Pro. Just not anything big. Just like they view ATI right now.

The only time they worried about ATI was with the R600. When it turned out to be meh, they stopped worrying, because every new-gen GPU they put out (even the G80 rebranded as a G90) would push ATI right back down. Mainly because nVidia channels a lot of money into software development with game companies, while ATI, which used to do the same, is now like AMD.

They sit in the back of the bus when there are free seats available in the front, and instead they just bitch, moan, and complain that it's not fair, when they have the ability to do the same. And by that I mean work with the software devs to optimize their software for their hardware.
 
We already have a whole platform made of AMD.

What I mean by a whole platform is everything. AMD has motherboards, CPUs, and GPUs, but no HDDs/SSDs or RAM. Intel, on the other hand, has motherboards, CPUs, and SSDs, and now they're going for GPUs, so in a few years it won't be a surprise if Intel has a whole platform and competes with every company on the market.


That is why they purchased ATi in the first place. A single unit good at both sequential and parallel processing, with the necessary technology and licenses to process 3D effectively, would be a hugely valuable asset, not to mention a marketing coup. It won't be designed to be better than current discrete graphics cards, just hugely more powerful and efficient than current integrated graphics.

Yeah, it's a good idea, but there's a long wait till 2012.

# Transistor count of over 3 billion
# Built on the 40 nm TSMC process
# 512 shader processors (which NVIDIA may refer to as "CUDA cores")
# 32 cores per core cluster
# 384-bit GDDR5 memory interface
# 1 MB L1 cache memory, 768 KB L2 unified cache memory
# Up to 6 GB of total memory, 1.5 GB can be expected for the consumer graphics variant
# Half Speed IEEE 754 Double Precision floating point (see the quick peak-FLOPS math after this list)
# Native support for execution of C (CUDA), C++, Fortran, support for DirectCompute 11, DirectX 11, OpenGL 3.1, and OpenCL
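If those numbers pan out, a quick peak-FLOPS estimate follows from the list. This is a hedged sketch: the 1.5 GHz shader clock is made up (the list gives no clock), and it assumes the usual 2 ops per shader per clock from a fused multiply-add:

```
#include <cstdio>

int main() {
    int    shaders   = 512;   // from the rumored spec list above
    double ghz       = 1.5;   // hypothetical shader clock -- not in the list
    double ops_cycle = 2.0;   // assuming one fused multiply-add per shader per clock

    double sp_gflops = shaders * ops_cycle * ghz;  // single-precision peak: 1536
    double dp_gflops = sp_gflops / 2.0;            // "half speed" double precision: 768

    printf("SP peak: %.0f GFLOPS, DP peak: %.0f GFLOPS\n", sp_gflops, dp_gflops);
    return 0;
}
```

Half-speed DP would be a big deal for the GPGPU/HPC market the posts above mention, since earlier GPUs did double precision at a far smaller fraction of single-precision speed.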

Well, I hope this is true; this one is gonna max out every game, including Crysis, with a minimum of 60 fps.
 


You mean like getting Crytek on board with CryEngine 3 Eyefinity, or Codemasters holding back DiRT 2 for DX11, that kind of thing?
 
Hmm, I go on travel for a few days & come back to 3 more pages here?? :)

Anyway, HEXUS has an interesting take on AMD's 2010 prospects:

From being one of the principal spokespeople for AMD, Moorhead seems now to have been moved to a much more inward-facing role. It says to us: "let's put handcuffs and shackles on you and duct tape your mouth".

We also have it on good authority that there will be further deep personnel cuts at AMD in the first half of 2010.

 
Maybe the last of the thinning out, let's hope.
I mean, everyone realises they've done more lately with less than they did prior to the reorganizing, so this doesn't surprise me, as they said they're trying to lean it out.
 
^ I take it as evidence that more losses are on the way. The only way to turn this around in the short term is to cut, cut, cut.

Eventually they'll be doing less with less, if they start cutting engineers instead of marketing.
 
My thoughts exactly, though losing Pat doesn't do a lot for the brass at Intel either, as he's the last at the top as far as engineers go.
Who knows? Maybe they all should just switch companies, heheh, then it'd all work out.
 


(image: Fry, "I see what you did there")
 