Talking Heads: VGA Manager Edition, September 2010

Status
Not open for further replies.

danieth

Distinguished
Apr 3, 2010
19
0
18,510
Thanks for the article, it was really good. It was interesting to see their opinion from the inside instead of your opinion, as you explained. But as porksmuggler said, Andrew Ku made this article what it is. I would really like to see an article like this again, or maybe even ask the community what questions should be asked.
 

tkgclimb

Distinguished
May 9, 2009
524
0
19,010
[citation][nom]Scort[/nom]The problem I see is while AMD, Intel, and Nvidia are all releasing impressive hardware, no company is making impressive software to take advantage of it. In the simplest example, we all have gigs of video RAM sitting around now, so why has no one written a program which allows it to be used when not doing 3d work, such as a RAM drive or pooling it in with system RAM? Similarly with GPUs, we were promised years ago that Physx would lead to amazing advances in AI and game realism, yet it simply hasn't appeared. The anger that people showed towards Vista and its horrible bloat should be directed to all major software companies. None of them have achieved anything worthwhile in a very long time.[/citation]

Yeah, it seems all software companies are focusing more on user interfaces and applications than on optimization to make current (and next-gen) software run smoother, faster, and more powerfully.
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
GPUs in some form or another will always be needed, and discrete ones will always be a consumer item, regardless of whether they come in the form of PCIe 2.0, 3.0, 4.0, or even USB 3.0, eSATA, or some even more unusual way of upgrading integrated graphics. In other words, the industry will always sell them as an add-on, no matter what the evolution of the IGP or GPU/CPU hybrid turns out to be, because it's just good business. This also means we will see better integrated graphics, but somehow I don't believe we will always get the best of what ATI and Nvidia can do, just so they can still sell GPUs in one form or another.
 

cmartin011

Distinguished
Jan 15, 2010
319
0
18,780
Good article. I would like to grill all the CEOs and engineers myself, even just in a private session, just to get some more insight and offer my own ideas and reasoning.
 

jecastej

Distinguished
Apr 6, 2006
365
0
18,780
The "radical" change will be at the low-end GPU level, but I don't know for how long. This is very important, as it will raise the entire industry's minimum requirements for everything, even on laptops. High-performance GPUs cannot do this.

But in two years GPUs could be so powerful that integrated graphics won't keep up, and the industry will end up with the same price/performance ratio as today. I don't think AMD or Nvidia will stop offering compelling $60-70 discrete graphics cards. They need to adjust their production lines, but how long will that take?

To sustain this over time, Intel needs to be very aggressive and push integrated graphics with every generation. I think this is a move in the right direction, but never a be-all and end-all solution.
 

luke904

Distinguished
Jun 15, 2009
142
0
18,690
I'm excited about Fusion because of the idea that it will use the GPU more, even when we are not doing 3D work...

I have this all-powerful GPU (OK, it's only a 4850, but still) and it does nothing 80% of the time.

 

experimentxx

Distinguished
Sep 4, 2010
9
0
18,510
Suddenly, Nvidia is cornered... Whatever happens in 2011, Nvidia will be under a lot of pressure to make nothing but the "best" in order to survive. Assuming the hybrids can really perform as well as the companies brag, Nvidia just lost its low-end and some of its mainstream graphics market.

Those GF106 cards should perform significantly better than the "older" 57xx series and prove to be on par with AMD's counter a little later, or be priced significantly lower. Otherwise, only Nvidia loyalists will buy them.

I just hope Nvidia won't be sold to Intel, lol.
 

bobdozer

Distinguished
Aug 25, 2010
214
0
18,690
[citation][nom]LORD_ORION[/nom]If Nvidia can bring out the best $150 card by Black Friday, I think they will be fine. Basically, they need something that blows the 5770 out of the water and costs the same or less... because AMD will lower the 5770 prices when that happens, so it can't perform on par.[/citation]

They will be releasing the 450, but it can only beat the 5770 when it is very heavily overclocked (and of course the 5770 is a highly overclockable card in itself).

Nvidia has no answer to ATI for now, and possibly never will.
 

fteoOpty64

Distinguished
Apr 15, 2006
22
0
18,510
While there is much speculation about the value of and demand for the low-end market, AnandTech seems to indicate that Sandy Bridge, with its IGP performance, will kill this market outright. That would seem like a blow to AMD, and a move that makes sense for Intel. But the volume shipment of Sandy Bridge chips and the P67 chipset will take time and only gradually dilute the market.

I am just appalled by the relatively slow and backward development of the chipset market. The key culprit here is Intel, because they just look after their own selfish interests. Nvidia has been ousted from this market by Intel due to licensing restrictions that one would technically deem monopolistic, but legally they seem to have gotten away with it.
As for the GPU market, it is great that AMD and Nvidia are still at it and will be for some time. I am curious why Nvidia does not seem interested in the non-PC market, where PowerVR is doing very well.
 

wishmaster12

Distinguished
Jul 28, 2006
197
0
18,710
Maybe in the future AMD will make a CPU/GPU that runs at 4.0 GHz on both the CPU and the GPU, with 12 cores each and 20 MB of cache, and it will run Crysis at 10,000 fps.
 

deathofzero

Distinguished
Aug 17, 2010
41
0
18,540
All I can say is wow. No one has even mentioned the heat involved in integrating a mid-level GPU into a new processor. I thought that would be the first thing on the list, considering that performance and even mid-level CPUs have trouble with stock heatsinks, and sometimes even with aftermarket heatsinks. Eventually, water cooling will be the standard. Think about it like this: as tech advances, heat increases. Look at 10 years ago. I did not need an aftermarket cooler, a case with 4 fans, and/or a liquid cooling system to keep a processor under 100 degrees. Seriously, next thing you know computers will be floating in mid-air, following us around, and we'll have to keep replacing the ice cubes to keep them cool. But this next generation seriously has me excited, especially considering I'm a fanboy of both Intel and AMD.
 

acku

Distinguished
Sep 6, 2010
559
0
18,980
Bloody hell!!! Someone took the "aku" screen name. This totally messes up the norm; I don't have a first-initial-last-name screen name like the rest of the guys! :)

So I am "Angelini's gopher," and we have a lot on our plate moving forward. It isn't just about business articles, so keep your eyes peeled.

[citation][nom]Kelavarus[/nom]That's kind of interesting. The guy talked about Nvidia taking chunks out of AMD's entrenched position this holiday with new Fermi offerings, but seemed to miss on the fact that most likely, by the holiday, AMD is going to already be starting to roll out their new line. Won't that have any effect on Nvidia?[/citation]

Yes and no. It depends a lot on what the pricing strategy is going to be, and maybe tangentially on performance, given the range we are in. I didn't want to turn this into a performance war between Nvidia and AMD; that isn't the aim of this article. Chris and Thomas will handle that later.

[citation][nom]Scort[/nom]The problem I see is while AMD, Intel, and Nvidia are all releasing impressive hardware, no company is making impressive software to take advantage of it. In the simplest example, we all have gigs of video RAM sitting around now, so why has no one written a program which allows it to be used when not doing 3d work, such as a RAM drive or pooling it in with system RAM? [/citation]

Chris and I both understand the frustration. I am a programmer, so I am also speaking from a different vein of thought once in a while. As far as pooling graphics memory goes, you actually can't do that in the way you described, for a variety of hardware and software reasons. Moving forward, the software community should really be putting more emphasis on 64-bit and GPGPU. That is where I see a lot of the performance gains being addressed on the software end.
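To give a feel for what I mean by GPGPU, here is a rough, minimal sketch (purely illustrative on my part, not code from any vendor or shipping product) of offloading a trivial computation to the graphics card with CUDA. The kernel name, array size, and launch parameters are arbitrary assumptions:

#include <cstdio>
#include <cuda_runtime.h>

// Illustrative kernel: each GPU thread scales one element of the array.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;                 // 1M floats (~4 MB), arbitrary size
    size_t bytes = n * sizeof(float);

    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i)
        host[i] = (float)i;

    float *dev = NULL;
    cudaMalloc((void **)&dev, bytes);                       // allocate on the GPU
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);   // copy input over

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale<<<blocks, threads>>>(dev, 2.0f, n);               // run on the GPU, not the CPU
    cudaDeviceSynchronize();

    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);   // copy result back
    printf("host[42] = %f\n", host[42]);                    // expect 84.0

    cudaFree(dev);
    free(host);
    return 0;
}

Nothing fancy, but it shows the basic pattern: copy data over, let thousands of GPU threads chew on it in parallel, and copy the result back. That is exactly the kind of work an idle GPU could be doing the other 80% of the time.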

[citation][nom]Aerobernardo[/nom]I have two ideas for this kind of topic: 1 - Tom's starts a poll on which questions we want to ask. 2 - I kind of hoped to hear something about what's next. We know about NDAs, but it would be nice to hear whether AMD has any intention of adopting new features like a proprietary 3D solution, or what both AMD and Nvidia think a next-generation VGA should do (or which architecture it should have) to be successful from today's perspective.[/citation]
Hey, if you have article suggestions for us, whether something like this piece, something more in the flavor of business topics, etc., feel free to send them our way. In fact, if you want to be the one to start a thread, Chris and I welcome it.
[citation][nom]kilthas_th[/nom]High phone bill? If only there were some way to have our conversations or even conferences routed effectively and cheaply over the internet[/citation]
Ugh... even with VoIP for this project, you have no idea how long it took to track down some of these people. It's not just the money; it's the time and the time zones. We had to jump through so many hoops and avoid talking to the PR teams, so that was a huge effort on its own. Throw in the fact that other languages were involved, and it becomes even more difficult. :)

[citation][nom]elt[/nom]Fusion is present, not very much the future[/citation]

Agreed, but this is an ongoing marketing campaign. Fusion, in AMD's mind, is supposed to get you all tingly by making you think of merged product lines. The culmination of this will be the APU. My fault for not clarifying...


[citation][nom]Cmartin011[/nom]Good article. I would like to grill all the CEOs and engineers myself, even just in a private session, just to get some more insight and offer my own ideas and reasoning.[/citation]
[citation][nom]danieth[/nom]Thanks for the article, it was really good. It was interesting to see their opinion from the inside instead of your opinion, as you explained. But as porksmuggler said, Andrew Ku made this article what it is. I would really like to see an article like this again, or maybe even ask the community what questions should be asked.[/citation]

Thanks, guys!
Team up with Aerobernardo; I don't mind if you want to throw ideas our way.

Cheers,
Andrew Ku
TomsHardware

 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
This is great insight into the graphics market. My opinion may seem way off the mark because of my limited knowledge of hardware details, but I think the future for the CPU/GPU combo is to make CPUs that have the same parallel architecture as GPUs and to develop software that takes advantage of parallelism. Then the CPU and GPU can be condensed into one unit.
 

chechak

Distinguished
Jun 15, 2008
156
0
18,690
[Who knows, by 2020 AMD would have purchased Nvidia]
Daaaaaa... you wish!
Nvidia's future is in the CPU market, making better CPUs than AMD/Intel.
Nvidia GPU + Nvidia CPU + Nvidia PPU + Nvidia chipset = FUTURE
:D
 