Opinion: AMD, Intel, And Nvidia In The Next Ten Years

Nothing like a good read to get the morning started. As for Nvidia, they at least got their software out the door, even if it's still in the early stages, unlike ATI, which had an early lead back in 2006 but never exploited it. All Nvidia needs to do is polish the code and keep building better GPUs. If ATI can at least ship its own physics solution, namely Havok FX, things won't be so lopsided: ATI for everyday use, Nvidia for gaming and encoding.
 

haplo602

Distinguished
Dec 4, 2007
202
0
18,680
Blah blah blah ... I mean, there's nothing really usable in the article. You could just reduce it to the conclusion and it would be the same.

OK, the REYES/RenderMan part was interesting, but that's it. And you missed the point by not considering game consoles (as already pointed out).
 

KidHorn

Distinguished
Oct 8, 2009
269
0
18,790
How can you talk about nVidia and not talk about Tegra? nVidia's future, at least over the next few years, is almost completely dependent on Tegra. Not Fermi. Fermi won't sell much until a few versions from now, if at all.

Simply put, Fermi, as it stands now, is not a competitor for anything. Tegra has the potential for use in next-gen gaming platforms and anything else that needs low power consumption and low heat generation, something Fermi is a long way from.
 
G

Guest

Guest
Best article I've read at Tom's in quite a while.

Just great! Please publish more articles like this!
 

theubersmurf

Distinguished
Jul 29, 2008
221
0
18,680
There are some asides that have an impact on some of your claims: notably the foible that was Vista, and the fact that game makers in the PC market have to retain compatibility with older APIs and often face a lack of hardware upgrades when people see little value in them (back in "Have 3D Graphics Reached the Point of Diminishing Returns?").

Though I think the limited interest in PC gaming has a somewhat similar effect, once broader adoption of Windows 7 occurs, you'll see more demanding titles for modern GPUs. Right now, developers are still operating under the assumption that people are running an X1650 or GeForce 7600.
 

theubersmurf

Distinguished
Jul 29, 2008
221
0
18,680
As well, I think a lot of software developers have gotten the message that the PC gamers' e-peen competition is over. The need for highly demanding titles is dwindling, in part, I believe, because developers realize that keeping the requirements financially accessible to gamers sells more products.

Titles like Crysis do come out, but for my part, I wasn't too thrilled with Crysis as a game. I thought the controls were badly designed (suit functionality chosen by pushing down the mouse wheel? Bad choice), and it alienated a large percentage of players by not running on a significant number of computers (remember it getting single-digit frame rates on 3870 X2s because they had done no optimization for ATI hardware). Those kinds of things really erode other developers' drive to produce titles like that. Alienating part of the market? Weird, bad control schemes? Yes, it looked pretty, and some parts were noteworthy (I liked the sunrises and the relative freedom to move around the map anywhere you wanted to go), but the gaming experience really wasn't that stellar from my point of view.

Producing games for such a narrow audience is a poor decision, imo. Only the highest-end gaming machines, with only a single brand of GPU? Not really the best plan in terms of future sales. And the issue with the controls is endemic to games like that: when a studio's focus gets that narrow, and its efforts are dedicated solely to producing part of a game, other parts get neglected.
 
G

Guest

Guest
Good read. I've come across an article that looks pretty interesting too. If they can pull it off, it should be great progress.

http://software.intel.com/en-us/articles/unified-computing-framework/
 

whiz

Distinguished
Jan 15, 2009
53
0
18,630
Well, it's articles like this that keep me pinned to TH. Very good work, Alan! Hope to see more insightful material from you and the TH crew!
 
Very interesting read. The points it missed were not as important as the points it raised.
Can it be considered a three-horse race if the horses seem to be going in slightly different directions?
 
Creative didn't do itself any favors with all the original X-Fi issues, and then there was the Daniel K thing. Even though the decline of the dedicated sound card may have been inevitable, Creative drove itself off a cliff rather than having a slow descent.
 

homice

Distinguished
Sep 29, 2009
11
0
18,510
How about junctionless transistors in microchips, or the new graphene-based chips? The end of Moore's law may be on the horizon for silicon, but silicon may be on its way out by the end of the decade, too :)
 

AlanDang

Distinguished
Nov 25, 2008
37
0
18,530
From Disney/Pixar PR:

"The average amount of time required to render a single frame of film for Up was between five and six hours. Some complicated frames took up to 20 hours. For every second of film, 24 frames are required."
 

hixbot

Distinguished
Oct 29, 2007
818
0
18,990
Excellent article, but I agree with the post pointing out that you forgot the console factor. Consoles are holding back advancement in game development. The lack of cutting-edge PC games is hurting the GPU market, and advancement is slowing down (why buy a new GPU if there are few games that take advantage of it?).

And since the GPU market is hurting, next-gen consoles will hurt too, because, as we all know, consoles are designed using the building blocks laid down by the GPU industry.

Everything is slowing down, and while this article is right in saying development budgets play a big role, I think the dominance of console gaming and the subsequent decline of PC gaming is the biggest factor.
 

vasilecelmare

Distinguished
Jan 26, 2010
5
0
18,510
Alan, congratulations on this article. I am sure I will still remember it in ten years, as I will put what I learned from it to good use.

Thank you!

Sincerely,
Vasile
 

AlanDang

Distinguished
Nov 25, 2008
37
0
18,530
Consoles are an interesting topic. While they hold things back on the hardware side, they allow for larger budgets: Sony can subsidize game development because it makes money by selling consoles. AMD can't do that, and NVIDIA's TWIMTBP program doesn't operate at the same level. Games like GTA4 and COD:MW2 are cross-platform, further blurring the lines.
 

g00ey

Distinguished
Aug 15, 2009
470
0
18,790
This was a very interesting article to read, and very insightful, even though I don't agree with all the points in it. I really wish we had more articles of this quality! Thank you.

You mention Itanium, which is interesting, but I don't think it is fair to mention Itanium without mentioning, for example, Sun's architecture. I hear a lot of talk about how good Sun servers are and how little performance per dollar you get from IBM's POWER6, while I hear no word about Itanium in those circles.

I would rather say that the problem game developers are actually facing today is that most people don't have the latest hardware, so requirements have to be reduced for released games to reach a broad audience rather than just a small group of gaming enthusiasts. The challenge, then, is producing games that work on weaker hardware while at the same time utilizing the additional performance of newer hardware.
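
Here's a rough sketch of how that scaling is often handled in practice: pick a quality preset from the detected hardware, so one set of content runs everywhere but the expensive features light up on fast machines. All the thresholds, names, and fields below are made up, purely to illustrate the idea:

```python
# Hypothetical quality-preset picker: map coarse hardware capabilities to
# settings, so one game runs on weak GPUs but uses headroom on fast ones.
from dataclasses import dataclass

@dataclass
class QualityPreset:
    texture_resolution: int   # max texture size in pixels
    shadow_maps: bool
    dynamic_foliage: bool
    draw_distance_m: int

PRESETS = {
    "low":    QualityPreset(512,  False, False, 300),
    "medium": QualityPreset(1024, True,  False, 600),
    "high":   QualityPreset(2048, True,  True,  1200),
}

def pick_preset(vram_mb: int, shader_model: float) -> QualityPreset:
    """Choose a preset from detected hardware (illustrative thresholds only)."""
    if vram_mb >= 1024 and shader_model >= 4.0:
        return PRESETS["high"]
    if vram_mb >= 512 and shader_model >= 3.0:
        return PRESETS["medium"]
    return PRESETS["low"]

# e.g. a GeForce 7600-class card (256 MB, SM 3.0) lands on "low"
print(pick_preset(256, 3.0))
```

The real logic is obviously far more involved, but the principle is the same: one set of content, with the expensive features gated behind capability checks.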

I've always dreamed that one day there will be computer games with highly detailed terrain; trees and foliage with excellent detail, a high variety of species, and physical responses to things like characters passing by, the wind in my face, or a blasting shell; and water that, when I jump into a lake, splashes realistically and interacts with my character and anyone else in it. See the following demo for some examples:

http://www.youtube.com/watch?v=2_RSYoCEMF0

Moreover, there are a number of rendering techniques that are more realistic but far more computationally intensive than traditional ray tracing; take, for example, Splutterfish:

http://www.splutterfish.com

which uses a technique that simulates a stream of photons in the rendering process. Effects such as light scattering, diffuse shading, and refraction through transparent objects come "naturally" out of this process, without additional effects having to be layered on, and the results are incredible.
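
To get a feel for why photon-style rendering is so computationally hungry, here's a toy Monte Carlo photon walk (not Splutterfish's actual algorithm, just the generic idea): each photon survives a bounce with probability equal to the surface albedo (Russian roulette), and every bounce costs at least one ray-scene intersection test:

```python
# Toy Monte Carlo photon walk: count how much ray-tracing work a photon
# simulation does. A generic illustration, not any real renderer's code.
import random

ALBEDO = 0.7          # probability a photon survives a bounce (assumed value)
NUM_PHOTONS = 100_000

total_bounces = 0
for _ in range(NUM_PHOTONS):
    bounces = 1                        # every photon hits at least one surface
    while random.random() < ALBEDO:    # Russian roulette: survive with p = albedo
        bounces += 1
    total_bounces += bounces

# Expected path length is 1 / (1 - albedo), about 3.3 bounces at albedo 0.7,
# and each bounce is a full ray-scene intersection test.
print(f"avg bounces per photon: {total_bounces / NUM_PHOTONS:.2f}")
print(f"total intersection tests: {total_bounces:,}")
```

Now multiply that by the millions of photons per frame needed for a clean image, and it's clear why these techniques stay offline for now.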

I also suspect that in the future, 3D models will achieve such a high level of detail that effects such as bump mapping may no longer be necessary. Objects may not be created from scratch in modelers, but rather scanned from real-world objects.

There is also much more development to be done when it comes to advanced AI: imagine a game world of hundreds, if not thousands, of intelligent characters interacting with each other, forming teams, and engaging in all sorts of intelligent behavior. That would be immensely computationally intensive.

So I would say no, we are nowhere near a ceiling when it comes to the need for additional computational power, which I would call outright insatiable. And I strongly doubt that programmers are under some kind of pressure from an "inability" to exploit the additional performance of newer GPU lines, whether due to budget constraints or knowledge-based limitations.

There is an insatiable need to be satisfied, and I wish the hardware developers all the best in their development of future hardware. I hope they will still be around, competing intensely with each other and making us gamers happy with ever better GPUs for years to come.
 
G

Guest

Guest
The issue we have now is consoles, and they are holding the gaming industry back. Every game is developed to be playable on a console, and consoles are now running five-year-old hardware. Games could be made to bring these newer graphics cards to their knees, but they would be unplayable on a console.
 

False_Dmitry_II

Distinguished
While I agree in principle with everyone talking about consoles holding things back, that isn't necessarily the only factor.

PC games have always been made to support old hardware so that more people can play them. Look at Steam's hardware surveys, for instance. Or look at Painkiller's requirements (a 1.5 GHz Pentium III); even back then I was using a computer that stomped them, a 3.0 GHz Pentium 4 with Hyper-Threading.

This trend may have been exaggerated by games like WoW and by consoles, but they aren't the only reason.
 