Intel: Integrated Graphics is Where It's At

Just write the games so that if you set everything to Low, it will be playable on the Intel stuff. Also maybe spend some time to make sure it doesn't look absolutely horrible at that setting - an example from some years ago was Oblivion - once I had adjusted all the settings to make it playable (on my low-end system at the time), it actually looked worse than Morrowind and the framerate was lower too. Same thing trying to run Crysis on an ATI X800XT - older FPS games both looked and ran better at those low settings.
This doesn't mean games still won't feature an "Ultra-High" mode in the future to really push those SLI and Crossfire setups out there to their limits.
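
For what it's worth, the kind of scaling described above is mostly a data problem: ship a handful of presets, pick one based on the detected hardware, and make sure an artist has actually tuned the bottom tier instead of just zeroing every slider. A minimal sketch in C++ (the preset names, fields and VRAM thresholds here are invented for illustration, not taken from any particular engine):

[code]
// Hypothetical quality-preset table: pick the lowest tier the detected
// adapter can handle. All names and numbers are illustrative only.
#include <iostream>
#include <string>

struct QualityPreset {
    std::string name;
    int   maxTextureSize;    // pixels
    int   shadowMapSize;     // 0 = cheap blob shadows only
    float viewDistance;      // metres
    bool  perPixelLighting;
};

const QualityPreset kPresets[] = {
    { "Low (IGP)",    512,    0,  400.0f, false },
    { "Medium",      1024,  512,  800.0f, true  },
    { "High",        2048, 1024, 1500.0f, true  },
    { "Ultra (SLI)", 4096, 2048, 3000.0f, true  },
};

// Very rough heuristic: choose a preset from an estimated VRAM budget in MB.
const QualityPreset& ChoosePreset(int vramBudgetMB) {
    if (vramBudgetMB < 256)  return kPresets[0];
    if (vramBudgetMB < 512)  return kPresets[1];
    if (vramBudgetMB < 1024) return kPresets[2];
    return kPresets[3];
}

int main() {
    const QualityPreset& p = ChoosePreset(128);  // e.g. a shared-memory IGP
    std::cout << "Selected preset: " << p.name << "\n";
    return 0;
}
[/code]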
 
Hey guys,
PC game developers have a golden opportunity here. An IGP is never going to replace a GPU for graphics-intensive games, but at the same time an IGP can bring in very good income from casual players. If a developer makes a fun game that runs on an IGP, it will sell in huge numbers, just like the Wii. That steady income can let them develop better high-end games that are more polished and run without all those problems we see all the time.
 
[citation][nom]JimmiG[/nom]Just write the games so that if you set everything to Low, it will be playable on the Intel stuff. Also maybe spend some time to make sure it doesn't look absolutely horrible at that setting - an example from some years ago was Oblivion - once I had adjusted all the settings to make it playable (on my low-end system at the time), it actually looked worse than Morrowind and the framerate was lower too. Same thing trying to run Crysis on an ATI X800XT - older FPS games both looked and ran better at those low settings.This doesn't mean games still won't feature an "Ultra-High" mode in the future to really push those SLI and Crossfire setups out there to their limits.[/citation]
There may very well be limitations on older systems that make it simply impossible to ship the same game for both a good and a bad gaming system. For instance, any strategy game would suffer on a poor CPU regardless of graphics settings, and you can't exactly lower the AI. Games where physics and the like play a major role would suffer likewise (racing games, for example). The only place where I'd think it's possible is multiplayer online shooters, since these need to calculate only limited physics and no AI. But I would imagine those playing on too low a setting might find themselves at a disadvantage even if the framerate were okay.
 
[citation][nom]scarpa[/nom]Intel wants to stop the development of computers, we should stop them before they manage that.[/citation]
That's not correct. They merely want to stop innovation by their competitors in order to preserve an amount of income that satisfies their shareholders.
 
There's nothing discrete about a graphics chipset that errors out when you try running a game!

So, Intel's indiscrete solutions. Ha. 🙁
 
[citation][nom]D2ModPlayer[/nom]There's nothing discrete about a graphics chipset that errors out when you try running a game! So, Intel's indiscrete solutions. Ha.[/citation]
Do you know what is meant by discrete?
To make it very simple: a discrete graphics card is an add-in card, and thus not an Intel IGP system.
 
Oh my goodness! Intel just broke their own record for how full of themselves they are! They are too full of themselves to see that their integrated chipset is actually real CRAP, and forcing crap on game developers and gamers is suicidal for them in that industry. To actually dream (thinking is pushing it) of taking over the graphics chip business with crap is being full of sh**! Next they'll probably start saying their integrated crap can also run physics... then they'll break their record again! They already made a similar move with the separate sockets for i5 and i7: we won't be able to run a Core i5 on an X58 board or an i7 on a P55. I wonder what will happen to i7/X58 sales once i5/P55 is out and people realize that an overclock is all that's needed to close the performance gap between the two, if there is any (and that's assuming i5 is any better than Core 2). They were too full of themselves when making these moves towards their doom.
 
Wait for it . . .
"Intel crap inside"
Downgrade your discrete graphics chip for a dumbed-down visual experience! a.k.a. Intel Visual Adrenaline. LOL!
 
If this were coming from Nvidia or AMD, who make much, much better and faster IGPs, I might buy at least a bit of it... But more likely Larrabee is looking like a bust for them right now and they are starting to hedge already...
 
(paragraph 2) 'outpacing' is a cool word, but it is better to use a word with the correct meaning: 'outselling'. In general it is better to write sentences that have the correct meaning than to insert cool jargon with the incorrect meaning.
 
The only thing Intel has done to spur game development with their IGPs is the plethora of crappy Flash and low-end games out there (just check Yahoo Games and that other one, I think it's called Big Fish or some such rubbish, where you have to install something just to download the games). Do we all really want GameSpot harping over Chocolatier 5 as one of the best PC releases of the fall? The entire argument is propaganda. IGPs from Intel are cheaper, yeah, but they don't spur any forward-thinking development, just the next version of Diner Dash.
 
Bah! Consumer gaming cards will die in a couple of years anyway. See onlive.com and quakelive.com.
 
Discrete cards are probably going away at some point in the future, just like floating-point coprocessors did. Moving the GPU to the processor die makes a lot of sense and removes a lot of performance bottlenecks, so you don't need the same powerful hardware to get the same performance. With processor lithography getting smaller and smaller, they can keep adding more cores, which becomes pointless for most apps, or they can add a graphics processor, which over time can get very powerful. It's a much more efficient design when they have the space for it, and they will have space for it.

It might come down to computers being mainly or totally IGPs (which I feel will certainly become part of the processor, not the chipset), and then the game systems. Maybe very powerful discrete cards will still exist for a while, but their inherent inefficiency will make them extinct sooner or later. It's much more likely you'll have processors with varying GPU capabilities. Maybe the gamer CPUs will have fewer cores but a killer GPU, whereas mainstream processors might have more (or the same, depending on the target) cores and a graphics processor with only relatively slow 3D support (like the G45 relative to discrete cards).

One thing is clear: when you put the GPU with the processor, you remove a lot of performance impediments, so it's going to happen. Does anyone remember going from the 386/387 to the 486? It was essentially the same floating-point unit, but just moving it on die had a dramatic effect on floating-point performance. I can't say the GPU will benefit as much (maybe it will benefit more), but I can say it will benefit materially.

 
[citation][nom]ta152h[/nom]Moving the GPU to the processor die makes a lot of sense and removes a lot of performance bottlenecks, so you don't need the same powerful hardware to get the same performance.[/citation]

/sarcasm alert

Yes, this is exactly why a GMA 950 is so much more powerful than a GTX 295.

/end sarcasm
It's hard to buy a new motherboard without integrated graphics. What's happened is that it's become so cheap to add basic video to the northbridge that it's basically free. If developing DX8 games now is such a great idea, why isn't everyone doing it?

If IGP is good enough, then sure as sh|t, less expensive AMD systems are good enough. In fact, in that case, as long as we're all just going to pander to the lowest common denominator, we shouldn't ever need to buy new hardware again unless it breaks, and then we should be able to buy the cheapest system possible at that point.

Future System Builder Marathon, according to Intel's vision of the future:

http://www.newegg.com/Product/Product.aspx?Item=N82E16856107044 - barebones, $130
http://www.newegg.com/Product/Product.aspx?Item=N82E16820159104 - RAM, $11
http://www.newegg.com/Product/Product.aspx?Item=N82E16822136195 - $34

Total: $175
 
I'm surprised at the negative response to this; my initial reaction was 'about time'.

Games should be developed for the lowest common denominator, since that means more people will buy them. More people buying games means more games, more money for the bigger titles, and more people to play multiplayer games with; and cheaper graphics hardware means we can spend less on the internals and more on games and the hardware that matters (i.e. monitors and interface devices). Personally, I'd love to spend £200 less on a graphics card and £200 more on a monitor.

It would probably mean a 2-3 year hiatus in graphical progress while integrated graphics were brought up to the current level, but I can't see any real objection to that.

Of course, getting this to happen will be the difficult bit, but hopefully the growing installed base of integrated chips will prompt developers to start developing for them, and Intel/ATI/Nvidia to start closing the gap between the top-end and bottom-end hardware.
 
Putting a GPU on the processor die is not as simple as "it will be faster". The GPU-CPU communication will be faster, yes... but how will the memory architecture be set up, and how will the devices communicate? Big shared memory pools are sub-optimal because the GPU/memory interface is typically configured differently from the CPU/memory interface. Does that mean you're going to add ANOTHER 128 or 256 pins to your processor to support a totally separate memory bus? How big will the chip need to be to support that? How much CPU capability are you giving up, and how much GPU capability, just to make the IPC link faster?

The most important interface in a GPU is the memory interface... the CPU interface carries a TINY fraction of the traffic compared to VRAM.
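
To put rough numbers on that point (ballpark, era-appropriate figures, not measurements): a PCIe 2.0 x16 link moves about 8 GB/s in each direction, while the GDDR3 interface on a high-end card runs roughly twenty times that. A back-of-the-envelope sketch:

[code]
// Back-of-the-envelope comparison of the CPU-GPU link versus on-card memory
// bandwidth. Figures are approximate, circa-2009 values.
#include <cstdio>

int main() {
    // PCIe 2.0: ~500 MB/s per lane per direction, 16 lanes.
    double pcie_gbs = 0.5 * 16;                      // ~8 GB/s

    // GDDR3 on a 512-bit bus at ~2.5 GT/s effective (GTX 285-class card).
    double vram_gbs = 2.5e9 * (512.0 / 8.0) / 1e9;   // ~160 GB/s

    std::printf("PCIe 2.0 x16 : ~%.0f GB/s\n", pcie_gbs);
    std::printf("GDDR3 512-bit: ~%.0f GB/s\n", vram_gbs);
    std::printf("Ratio        : ~%.0fx\n", vram_gbs / pcie_gbs);
    return 0;
}
[/code]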

You also give up flexibility... I upgrade graphics at about twice the rate of CPU upgrades.

I think you're likely to continue to see integrated graphics at the low end... but at the high end discrete will rule for the foreseeable future, though board interfaces may change to reflect increasing performance. However... I wouldn't be a bit surprised to see designs that put the GPU on the hypertransport bus, which would greatly increase GPU/CPU and GPU/Main Memory bandwidth.
 
"We expect to see integrated graphics chipsets outsell discrete by three to one," Davies said.
This probably comes from a report on the rise of notebook and netbook sales versus desktop PCs. So yeah, the numbers make sense, but nobody buying these netbooks and notebooks with integrated graphics is planning on playing the latest games (with the exception of those who just don't know any better, of course).
 
Right... and it's an apples to oranges comparison in any case... IGP chipsets are dirt cheap, you could probably sell 10x more and still not have as much revenue as the video card market.
 
Bounty,

You're way off. Putting it on the chipset is COMPLETELY different from putting it on the processor die. There is no advantage to putting it on the chipset; that's why I said it makes no sense that they continue to do it. The processor is different.

Dave K, you can add pins pretty easily, just like they always do. Look at LGA775: they went from 478 to 775 pins for the same processors. So that's one alternative. How about all the extra lanes on the motherboard for extra x16 slots? In some ways you could simplify things by offering fewer lanes, since they aren't needed. The communication between the processor and the GPU would naturally be a lot faster too. You'd have less traffic through the chipset, which would help, and you wouldn't need memory controllers all over the place either. You could also integrate the functions of the GPU with the processor itself, although I think in the near term this is less likely. But surely it's something designers have thought about too. There's a lot of potential with this.

It's not simple, but it's not very difficult either. Clearly it's happening, and it's happening soon. Initially, because of die-size considerations, it will start at the low end, but it will surely move up. As more die area becomes available, the GPUs will get bigger, and their inherent efficiency will give them an edge. Also, consider that if you do have two memory banks, one for the GPU and one for the CPU (it's a bit un-Von Neumann, but...), with creative use you could borrow the extra bandwidth of one while it's not busy. I personally think you'll see the same memory used, although there might be a cache, because it opens up so many opportunities for greater flexibility. On the other hand, you'd give up the advantages of memory designed specifically for graphics, but cards use plain DDR3 a lot anyway, so it's obviously not so bad.
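
To make the single-shared-pool trade-off concrete (all numbers are rough assumptions, not measurements): dual-channel DDR2-800 provides about 12.8 GB/s in total, and even plain display scanout plus a modest rendering workload claims a visible slice of that before the CPU sees any of it. A quick estimate:

[code]
// Rough estimate of shared-memory bandwidth consumed by graphics alone
// versus what a dual-channel DDR2-800 system provides. Ballpark figures.
#include <cstdio>

int main() {
    // Display scanout: 1280x1024, 32-bit colour, 60 Hz refresh.
    double scanout = 1280.0 * 1024 * 4 * 60;          // bytes per second

    // Crude 3D guess: each displayed pixel written ~3x (overdraw), plus
    // ~2 texture reads per write, at 60 frames per second.
    double render  = 1280.0 * 1024 * 4 * 60 * (3 + 3 * 2);

    double ddr2_dual = 2 * 800e6 * 8;                 // ~12.8 GB/s total

    std::printf("Scanout   : %.2f GB/s\n", scanout / 1e9);
    std::printf("Rendering : %.2f GB/s (crude guess)\n", render / 1e9);
    std::printf("Available : %.1f GB/s, shared with the CPU\n", ddr2_dual / 1e9);
    return 0;
}
[/code]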

Considering the inefficiency of discrete cards in getting data, instructions, etc., and also having to fall back to main memory when the card doesn't have enough of its own, I think it's inevitable.

You would lose flexibility, and I don't think the processor makers would offer enough CPU/GPU combinations to eliminate that downside, but it's a small negative considering how few people actually upgrade piecemeal, and considering the advantages. Most people replace whole machines.

 
The problem is that you'll be taking parallel architectures (because you can't really SIMPLIFY the architecture) and packaging them onto a single chip. That will add cost and complexity to chip development (delaying new processor releases and weighing against use for leading-edge performance), and it will limit the processing and graphics power to values significantly below what state-of-the-art standalone parts will field (again weighing against use outside of the value market). You'll also slow update schedules unless parallel development is VERY closely maintained (new CPUs and GPUs delivered simultaneously, or one will be waiting for the other - virtually impossible when you're talking about two very different development efforts), which again will slow rollout and make it hard to compete with standalone approaches at the top performance tier.

Personally, I don't see the value proposition for anything other than low-cost systems, where you've got extra fab capacity because the state of the art has moved on, and spare die capacity because the state of the art is more demanding.

There's also the underlying assumption that a faster CPU-GPU link will offset the lower processor and graphics performance, and I think that's also highly suspect... especially if you look at the potential of a socketed HyperTransport interface, which would give you much of the benefit without all the cost.

Certainly there are some valid arguments FOR integration, but there are also drawbacks... and the drawbacks increase as you get close to the top of the performance envelope. So you've got Intel and AMD saying they want to package high-end graphics with high-end processing, but market forces say it's the low end that has the best value proposition for an integrated GPU/CPU. Not sure how it'll work out in the end, but it'll be interesting.
 
I think we should not underestimate freeing the FSB and the northbridge of data meant for graphics!

I think a major bottleneck in graphics is solved right there!
Sure, Intel still sucks at graphics processors, but still, they may just have created perhaps the fastest connection for video data possible.

Not only that, this could mean the northbridge can be produced with a lower TDP, resulting in just THE PERFECT solution for current Atom-based netbooks!
Just as the Atom-based Eee PC 901 outperformed the Celeron-based Eee PC 701 not only in computation but also graphics-wise, I believe this newer technology will (or should) dramatically reduce the cost of the overall system while boosting fps at the same time.
Even if Intel just equipped the newer Atom with a 45 nm GMA 950 video core, I still believe it would outperform an Atom paired with a GMA 950 on a separate die.

I think we'll end up with a passive cooling solution, graphics fast enough for 720p (and perhaps 1080p) video streaming or at least compressed video, and perhaps the combination of integrated graphics and processor will decide the quality level a game runs at.

I use an old Intel GMA 945 in a 1.6 GHz Core 2 Duo laptop, and I'm able to play Prince of Persia at medium settings at 1024x768 pretty well (I average at least 25 fps). Yes, sure, the shadows look crappy, but at least I see shadows and keep the framerate above 25 fps!
Would it be better to see a nice shadow while my framerate drops to 18 fps?
 
Besides, if Nvidia came up with something CUDA-based for this, you'd be able to go to the store and upgrade your IGP-based computer with a real video card that works in parallel with the IGP, resulting in even faster framerates and better power savings!
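
Speculation aside, the enumerate-and-split pattern imagined above already exists for machines with more than one CUDA-capable GPU; an Intel IGP would not show up in such a list, so treat the following purely as an illustration of dividing work across whatever devices are present:

[code]
// Illustrative only: enumerate CUDA devices and split a workload in
// proportion to each device's multiprocessor count. Host-side C++ linked
// against the CUDA runtime; no kernels are actually launched here.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable devices found.\n");
        return 0;
    }

    const long totalWork = 1000000;   // e.g. particles or pixels to process
    int totalSMs = 0;

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        totalSMs += prop.multiProcessorCount;
    }

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        long share = totalWork * prop.multiProcessorCount / totalSMs;
        std::printf("Device %d (%s): %ld work items\n", i, prop.name, share);
        // cudaSetDevice(i);  ...allocate, copy, and launch this share here...
    }
    return 0;
}
[/code]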
 