Intel: Integrated Graphics is Where It's At

LOL, pathetic. Development is relative to the power of the hardware available, hence L337 3d games didn't exist for the 286. I guess Intel wants gaming systems dumbed-down to 1990s standards to make up for their shortcomings. I think they figured out that Larrabee is gonna flop...
 
I think this could trigger the beginning of the end of the need for, though not the use of, high-end discrete graphics cards. If the market for high-end graphics cards shrinks enough, I could see developers dropping support, something like what happened to the sound card market.
 
A TNT card could mop the floor with Intel integrated graphics, and they really expect developers to swallow this swill they are peddling?? Intel, you may be the biggest, but you're trying to impose your will where you have never proven yourself capable.
 
There will always be a market for fast gaming PCs and 500-dollar graphics cards, especially now with people doing more with their PCs (picture/video editing, format conversions).

I think we are going to see a spread:

people with extreme low end (integrated)

and people with $200+ graphics cards
 
This is sad. Intel might as well say, "Please make your games look like crap so they run on our integrated cards." So much for high-end graphics if normal users don't appreciate them...
 
I personally read this as "why not to bother with Larrabee" myself... If developers seriously considered Intel chipset graphics in their games, then Quake 3 would still be considered "high end" graphics.
 
[citation][nom]Hatecrime69[/nom]I personally read this as "why not to bother with Larrabee" myself... If developers seriously considered Intel chipset graphics in their games, then Quake 3 would still be considered "high end" graphics.[/citation]

Quake 3 is fun, and look at Quake Live. If developers stopped trying to pump more graphics and started to pump more fun, we might have better games.
 
Woot, Quake 3 rocked my world... back in 1999... I am not too concerned; having only integrated graphics is a setback in computing. Yes, making things smaller and smaller seems to be the way to go, but sacrificing so much power and ability isn't going to float well. They will see this soon enough.
 
Intel really shouldn't be bragging about a graphics solution that is bought exclusively by those who don't care about performance. Sure, they sell the most chips, but it's only because most people don't care at all about their 3d acceleration. Anyone who cares gets something better, even if that means an integrated Nvidia or ATI chip.

Nothing is killing the PC gaming market more than the fact that one of the most common types of PC sold is the cheap Intel-based laptop - a computer that not only can't play games, it can't ever be upgraded to do so. Intel is making a dire mistake by pushing these things on consumers at all. If the entry level computer cannot play games or ever be upgraded to do so, the number of people entering PC gaming will dwindle, and they won't progress to buying high-end 'gaming' processors later.
 
Yes, developers should make more 2D isometric games, because that is the standard Intel holds its integrated graphics to. If they really wanted to go integrated, then they should target a real integrated chip from AMD or nVidia. Integrated from Intel is a joke.
 
Which would you rather be selling?

A) 10 million low-end IGP chipsets at an average price of around $10.00 (if not less), or

B) 2 million high-end boards at an average price of $200.00 each?

"Lies, Damn Lies, and Statistics"
 
The only reason there is a "LOWEST COMMON DENOMINATOR" is Intel's CRAPPY IGP video. They are the lowest common denominator, the lead (pronounced LED, Pb) in the video industry's ass.
 
Intel is talking rubbish. Most games will run fine on good laptops with low-power discrete graphics chips. My Lenovo runs FEAR happily on low settings thanks to its nVidia 7300. And as for the Atom/netbook, nVidia's Ion has been demonstrated to play CoD4 and others.

Also, all the new chipsets from AMD and nVidia feature integrated DX10 GPUs.

Intel just can't admit nVidia and AMD are stealing all their low-power gaming market, since anyone who casually or seriously plays the newest games will make sure they are running an AMD or nVidia GPU of some form.

If Intel wants to stay in the race, they need to spend some serious money and bring out a decent floating-point pipelined GPU (not some Pentium 1 multicore crap, i.e., Larrabee).
 
"What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?"
NO
 
What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?

No, and I am not sure that is the point Intel is stressing here.

The majority of the computing world doesn't need discrete GPU cards, so obviously integrated will sell in higher volumes. This also means there is a huge target audience for games that aren't too demanding to run on integrated hardware.

Intel is trying to convince developers to make games that can run on lower-end hardware, since that hardware base exists.

I think this is a good idea for developers. I have installed some old-school games on my laptop that I was never able to play before, and they run well on my integrated hardware. Unreal is still fun many years later, no matter how dated the graphics look.
 
It's sad when desktop cards running at x1 PCI-e speed are faster than Intel video.

It's sad that AMD and nVidia boards at about the same price have much better onboard video.

Why can't Intel have 64-128 MB of SidePort RAM like AMD?
 
This is similar to Apple offering integrated solutions. Certainly, with mobile computing currently on the upswing and requiring smaller, integrated hardware, the short-term future portends a substantial marketing (translated: "revenue stream") opportunity. Perhaps discrete graphics chipsets will become a boutique market, perhaps not. Who's to say that holographic requirements won't demand a paradigm shift in graphics performance that only the discrete suppliers can address, even ramping up to supplant CPU subsystems? The future offers opportunities for those prepared and willing to change.
 
Why doesn't Intel just ask developers to start including an options switch within the game that a user can click, which in turn disables everything they just paid for in their game, so Intel can say, "Oh yeah, we are compatible with that game!"?
 
What a joke: "Here's your answer: Mercury Research showed that in 2008, for the first time, integrated graphics chipsets outsold discrete (graphics chips), and in 2013, we expect to see integrated graphics chipsets outsell discrete by three to one." You would need to play a lot of Chinese whispers with drunk people before this translated to "PC gamers are choosing IGPs over discrete cards." This is like when they were counting PS3s toward total Blu-ray player sales, except a lot more laughable.

Firstly, gamers make up a tiny part of all computer sales; secondly, the above statement is just as likely caused by more motherboards having IGPs on them (for HTPCs etc.) as by a decrease in graphics card sales.

And another thing: if the EU is going after MS for bundling IE with Windows, what are they going to do when Intel starts bundling GPUs with their CPUs? At least there is physical money involved in the GPU market.
(BTW, I think the EU/MS/IE thing is BS.)
 