Opinion: AMD, Intel, And Nvidia In The Next Ten Years

Status
Not open for further replies.

cutterjohn

Distinguished
Apr 16, 2009
37
0
18,530
Sound cards: Creative was pretty much it by the late 1990s, and they proceeded to price themselves right out of the market, given the minimal CPU load reduction their cards offered along with the very slight increase in sound quality. They just never adapted, and OEMs stepped in with nearly-as-good integrated solutions, which have been pretty much standard equipment ever since.

I can't see dedicated GPUs going away any time soon myself. I can't really see enough of the functionality being jammed into a CPU die, and you'd then be left having to upgrade BOTH the CPU AND the GPU just to get new GPU functionality, which would be a MUCH more costly upgrade than what we have now. Unless, of course, you always upgrade to the top-of-the-line GPU, but then you're also likely to have the top-of-the-line CPU, so it'd still be cheaper for them to be separate ATM.

OTOH, I suppose that if clock speeds get bumped some more, die manufacturing processes keep shrinking, and multi-threading support improves in more apps/game engines, it's entirely possible that one day, 10-20 years from now, the dedicated GPU may be able to go away.

ATI drivers: heh, their Linux drivers are still poor (though the OSS driver is coming along nicely), and I still experience MANY more hiccups with Catalyst under Windows than I do with nVidia drivers. However, I think the hiccups are probably related to driver optimization, as they only show up in a few select games/apps; conversely, I've been able to run benchmarks and demo renders that drive the ENTIRE system, CPU and GPU, MUCH harder than the problematic programs without the same "hiccups". Trust me, when this first appeared I tried all sorts of driver settings, demos, benches, and stress tests, and even moved the PSU to its own high-amp household circuit with no other load, all of which left me pointing the finger at some sort of game-engine/driver interaction that I just live with, hoping it will eventually be cleared up in a driver release. This is what led me to using Mobility Modder and trying out a number of newer Catalyst releases beyond what my notebook manufacturer supplied (VERY old, relatively speaking; roughly 5 months and 5 driver releases behind).

CUDA: we should really be talking about OpenCL now, I think, even though it IS still almost CUDA, which leaves AMD in a bit of a bind.
 

dannyboy153

Distinguished
Oct 22, 2009
2
0
18,510
Quoted from Bloomberg: http://www.bloomberg.com/apps/news?pid=newsarchive&sid=as.aATdWdDdY

"“Avatar,” the highest-grossing film of all time, was also the most technologically demanding. Creating the effects required 35,000 computer processing cores and gobbled up as much storage as the three “Lord of the Rings” movies combined. The goal: make it look so real that viewers wouldn’t think about the technology involved."

Nvidia's GPUs were used in all of this year's Oscar nominees for best visual effects: “Avatar,” “District 9” and “Star Trek.”
 

Shin-san

Distinguished
Nov 11, 2006
618
0
18,980
We may need another software paradigm to help with development costs. OpenCL and DirectCompute are two that might just do that. AMD's integration of a GPU onto the CPU might also help. I hope OpenCL and DirectCompute can be run on the CPU as well; if so, they would be far more likely to get adopted.
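
For what it's worth, OpenCL can already target the CPU through the exact same API used for a GPU: you just ask for CL_DEVICE_TYPE_CPU when enumerating devices. A minimal C sketch, assuming an OpenCL runtime with a CPU driver (e.g. ATI's Stream SDK) is installed, with error handling mostly omitted:

[code]
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;

    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS)
        return 1;

    for (cl_uint i = 0; i < num_platforms; ++i) {
        cl_device_id dev;
        cl_uint num_devs = 0;

        /* Ask this platform for a CPU device instead of a GPU one. */
        if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_CPU,
                           1, &dev, &num_devs) != CL_SUCCESS || num_devs == 0)
            continue;

        char name[256];
        clGetDeviceInfo(dev, CL_DEVICE_NAME, sizeof(name), name, NULL);
        printf("Platform %u has a CPU OpenCL device: %s\n", i, name);

        /* The same kernels you would build and enqueue for a GPU can be
           run against this device, just more slowly. */
    }
    return 0;
}
[/code]

Whether DirectCompute ends up with an equivalent CPU path I can't say, but at least on the OpenCL side a machine without a discrete GPU isn't left out.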

And hm... I just found a DirectCompute Benchmark. http://www.ngohq.com/news/16710-first-directcompute-benchmark-released.html
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
Really really great article. I thoroughly enjoyed it.

The bit where you explain why CUDA took off while the free and almighty OpenCL got stuck at the door made perfect sense. I wasn't even thinking about the software part of the game.

And seeing as I teach 3D, and rendering is a big part of what I do, the nvidia iray discovery made it even more interesting. I knew that nvidia acquired mental images a couple of years ago, and I knew they were working on GPU code for mental ray; I just wasn't aware that they had actually done it. That's game-changing news. If iray works as well as it could (10x faster on a GTX 260 than on an i7 920), AMD and Intel are going to be replaced in the high-end Hollywood segment faster than they can say Fermi.
 
G

Guest

Guest
I wouldn't think of "the future of PC gaming", or "the future of gaming" in general, as a single escalating line in terms of graphics quality and complexity. I believe the market will split: a few "Hollywood superproduction" titles and tons of technically simple but effective games, something like what online Java or Flash games are today.

Personally, I spend more time playing those small games than the others. "Wesnoth" is one of them. Also, sometimes I get nostalgic and start searching for "that good ol' game" for DOSBox or MAME.

I don't see the market for "epic" game titles getting any bigger. Actually, I think it will get smaller.

 

vladimirorlovsky

Distinguished
Mar 8, 2010
3
0
18,510
Expect the unexpected! ... France | Germany | United Kingdom | Italy | China | Taiwan | Russia | Hungary | Turkey ... 6.5 billion people, 5 billion computers... the game has just begun!!
 

zero2dash

Distinguished
Oct 23, 2007
32
0
18,530
As long as Folding@home favors nVidia cards, they will be in business, because their cards will be the choice of folders. ATi cards still fold at a fraction of the PPD (points per day) of an nV card, even a weaker one. People were hoping that ATi's Stream computing would even things out, but it hasn't. Now we folders are hoping that the GPU2 client yields better PPD for ATi cards, but no one's holding their breath.

nV will be back, just like ATi came back after the disappointing RV670 (3850/3870).
 

john5246

Distinguished
Mar 13, 2010
132
0
18,690
I love articles like this!

But as one of the posters pointed out already, the largest segment for profit is the console market. PC gaming really is "enthusiast"-based, and thus from a financial/business perspective there is no real reason to tailor your games to that small market. Your ROI (return on investment) would be minimal to non-existent.

Unfortunately, most of us will have to wait a long time before any advancements are made that take advantage of our hardware, because Sony/MS decided to run with their current consoles a little longer. That, paired with the economic downturn, could mean that we won't see any big advancements until 2013+ (around the time I expect a new console).

On another note, I see gaming across multiple platforms becoming much more commonplace as we head into the future. For example, you'll be able to play a game on your Xbox and continue it on your smartphone, PC, or other device, and vice versa. To think that the Wii grabbed a whole segment that was left untouched by the big players! I expect that type of gamer to keep emerging, and the type that demands Pixar quality to stay about the same size.
 

john5246

Distinguished
Mar 13, 2010
132
0
18,690
Had to add another thought here as it relates to the article:

I think we'll see more money dedicated to the visual effects of films and less to paying actors exorbitant salaries. I think we'll begin to see at least a few films every year that are all CGI (think Resident Evil: Degeneration); the big success will come when they release a CGI film and no one is told it's CGI. I think this concept is still a bit far off, though. It has to do with financing projects: financiers know that putting a certain actor/actress in will draw a dedicated audience (the actor's personal fanbase) and thus help reduce the risk of their investment. The real money needs to be put into story writing instead of high-priced actors and recycled plots from movies like Dances With Wolves, Pocahontas, and The Last Samurai (should I go on?).
 

kokin

Distinguished
May 28, 2009
445
0
18,810
[citation][nom]john5246[/nom]Had to add another thought here as it relates to the article: I think we'll see more money dedicated to the visual effects of films and less to paying actors exorbitant salaries. I think we'll begin to see at least a few films every year that are all CGI (think Resident Evil: Degeneration); the big success will come when they release a CGI film and no one is told it's CGI. I think this concept is still a bit far off, though. It has to do with financing projects: financiers know that putting a certain actor/actress in will draw a dedicated audience (the actor's personal fanbase) and thus help reduce the risk of their investment. The real money needs to be put into story writing instead of high-priced actors and recycled plots from movies like Dances With Wolves, Pocahontas, and The Last Samurai (should I go on?).[/citation]
You have bold ambitions. Might as well make all food corporations stop dumping their day-old food and give it away to the poor/homeless.
 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
Quote: "So, while game development budgets will continue to grow, it’s hard to imagine games reaching the same budget levels as Hollywood films."

I disagree, as the budgets of some games (GTA IV, MW2, ...) exceed those of some Hollywood movies (Children of Men, Slumdog Millionaire, Twilight, ...), according to the table in the article.
 

kronos_cornelius

Distinguished
Nov 4, 2009
365
1
18,780
They are probably using a single server for each frame; it's probably easier to parallelize the work that way. So it may take 24 hours to do a frame, but they work on 1,000 or 10,000 frames at a time on the big server farms... I don't know how many servers they have.
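
Roughly that idea as a toy C/OpenMP sketch: render_frame() here is a made-up placeholder for hours of real work, and on an actual farm the "workers" would be whole servers handed frames by a job scheduler rather than threads in one box.

[code]
#include <stdio.h>
#include <omp.h>

/* Made-up placeholder for a full frame render; in reality this is hours
   of work on one machine, not a printf. */
static void render_frame(int frame)
{
    printf("worker %d rendering frame %d\n", omp_get_thread_num(), frame);
}

int main(void)
{
    const int num_frames = 10000;

    /* Frames are independent of each other, so the farm can hand one frame
       to each idle node and keep thousands in flight at once. Per-frame
       latency stays at ~24 hours, but throughput scales with machine count. */
    #pragma omp parallel for schedule(dynamic)
    for (int frame = 0; frame < num_frames; ++frame)
        render_frame(frame);

    return 0;
}
[/code]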
 

Rik

Distinguished
Oct 28, 2007
4
0
18,510
The same thing that happened with the separate CPU/FPU years ago is going to happen with the CPU/GPU.
Soon there will be AMD CPU/GPU combo-killers (and then there was one chip to rule them all); you know it's comin', hence why Intel is "following" suit with its own CPU/GPU, Larrabee.
This CPU/GPU meld (16 cores of GPU and CPU in ONE chip) inside a future net/lap/notebook will finally usurp our big old gas-guzzlin' "chained-to-a-desk" PC, specifically in price per performance. "Specialty" servers will otherwise still exist in one form or another.
By then, both our present CPUs and GPUs will be dead. It'll "ALL" just be parallel processors talking to "everything", virtually and simultaneously, again in many forms: electronically, bio-neurologically, ..., quantumly.
Of course, by then, we'll just be starting to populate other habitable planets within our galaxy.


 

Rik

Distinguished
Oct 28, 2007
4
0
18,510
One of the biggest changes will be the "Open" stuff, such as "Open" operating systems, OpenCL, ..., Open64/128/256...? All of these spell one thing: as long as you buy the hardware, the OS will be openly and freely built into it. The hardware manufacturer with the best drivers/glue logic wins by then. That's right, there will be no Microsoft by then either. It'll be only Linux/FreeBSD/...? or some other form of "open", free, and powerful OS, and that'll be it. You will, however, have to pay for all your extra "specialty" apps, be they games or whatever, and that $$$ will NEVER change. :)
That's how software "development" will progress.
 

majorgeek

Distinguished
Mar 16, 2010
14
0
18,510
Alan, I could not agree with you more. I have been saying for a few years now that AMD has the complete solution, with both CPU and GPU technology. If anybody can pull this off, my money is on AMD. Intel realized it is much harder to build a GPU than it once thought. Bulldozer should give us a good idea of what is to come.
 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
I have virtually no info on REYES, but from the sentences in the article it seems closer to Eyefinity, except that instead of increasing the pixel count, they decrease the pixel size (nullifying the need for AA). So I think ATI's 5000 series could do REYES rendering with some software optimizations. Am I right or wrong? (I know it may seem like a stupid comment, but I really don't know much about graphics architecture, so please forgive my ignorance.)
 

vexun11

Distinguished
Dec 17, 2009
719
0
18,990
Excellent, high-quality article, thank you. I think PC gaming will keep going for many years; consoles will have a lot more games coming out, but PC gaming will live on for quite some time.
 
G

Guest

Guest
Nice article. However, as an audio engineer, I think you're way off base with your sound card analysis. For one, Creative Labs isn't losing sales over a lack of audio quality; it lost sales because of its terrible driver support and software. I really don't agree with the notion that there are no real benefits to a sound card. For the most part, integrated audio is still pretty bad, and spending more money on headphones isn't going to cure the problem. That's like saying "I bought a GTX 480 to go with my Pentium 4."

Trust me, buy an actual quality card, e.g. an HT Omega Claro+ and Sennheiser PC 350s. I would call anyone a liar who says there isn't a night-and-day difference. Remember, the crashing of Creative Labs will bring a better company in its place.
 

bak0n

Distinguished
Dec 4, 2009
792
0
19,010
Integration is just another way to force people to buy new hardware after the on-board video burns out. Good for profits, not so good for consumers.
 
G

Guest

Guest
All we can hope is that all three are going to survive and their strategies will succeed, because with only one left (or even two), we don't have a market any more, and progress would slow down considerably. And since there's so much money (for actually building a factory), and even more know-how, required to build a CPU/GPU, there wouldn't be any possibility of entering the market with a startup or the like...
 