Intel: Integrated Graphics is Where It's At

Status
Not open for further replies.
Guest
LOL, pathetic. Development scales with the power of the hardware available; hence, L337 3D games didn't exist for the 286. I guess Intel wants gaming systems dumbed down to 1990s standards to make up for their shortcomings. I think they've figured out that Larrabee is gonna flop...
 

zerapio

Distinguished
Nov 4, 2002
I think this could trigger the beginning of the end of the need for high-end discrete graphics cards, though not of their use. If the market for high-end graphics cards shrinks enough, I could see developers dropping support, something like what happened to the sound card market.
 

roofus

Distinguished
Jul 4, 2008
A TNT card could mop the floor with Intel integrated graphics, and they really expect developers to swallow this swill they're peddling? Intel, you may be the biggest, but you're trying to impose your will where you have never proven yourself capable.
 

rantarave

Distinguished
Apr 14, 2007
There will always be a market for fast gaming PCs and $500 graphics cards, especially now with people doing more with their PCs (picture/video editing, format conversions).

I think we are going to see a spread:

people with extreme low end (integrated)

and people with $200+ graphics cards
 

brendano257

Distinguished
Apr 18, 2008
This is sad. Intel might as well say, "Please make your games look like crap so they run on our integrated cards." So much for high-end graphics if normal users don't appreciate them...
 

Hatecrime69

Distinguished
Oct 17, 2008
I personally read this as 'why not to bother with Larrabee' myself. If developers seriously targeted Intel chipset graphics in their games, then Quake 3 would still be considered 'high end' graphics.
 

engrpiman

Distinguished
Mar 16, 2006
[citation][nom]Hatecrime69[/nom]I personally read this as 'why not to bother with Larrabee' myself. If developers seriously targeted Intel chipset graphics in their games, then Quake 3 would still be considered 'high end' graphics.[/citation]

Quake 3 is fun, and look at Quake Live. If developers stopped trying to pump in more graphics and started to pump in more fun, we might have better games.
 

hercules

Distinguished
Jul 23, 2008
Woot, Quake 3 rocked my world... back in 1999. I am not too concerned. Having integrated graphics is a setback in computing; yes, making things smaller and smaller seems to be the way to go, but sacrificing so much power and ability isn't going to float well. They will see this soon enough.
 

Mitrovarr

Distinguished
Jan 24, 2009
Intel really shouldn't be bragging about a graphics solution that is bought exclusively by those who don't care about performance. Sure, they sell the most chips, but it's only because most people don't care at all about their 3d acceleration. Anyone who cares gets something better, even if that means an integrated Nvidia or ATI chip.

Nothing is killing the PC gaming market more than the fact that one of the most common types of PC sold is the cheap Intel-based laptop - a computer that not only can't play games, it can't ever be upgraded to do so. Intel is making a dire mistake by pushing these things on consumers at all. If the entry level computer cannot play games or ever be upgraded to do so, the number of people entering PC gaming will dwindle, and they won't progress to buying high-end 'gaming' processors later.
 

falchard

Distinguished
Jun 13, 2008
Yes, developers should make more 2D isometric games, because that is the standard Intel holds integrated graphics to. If they really wanted to go integrated, they should target a real integrated chip from AMD or nVidia. Integrated from Intel is a joke.
 

Dave K

Distinguished
Jan 13, 2009
Which would you rather be selling?

A) 10 million low-end IGP chipsets at an average price of around $10.00 (if not less)

B) 2 million high-end boards at an average price of $200.00 each?

"Lies, Damn Lies, and Statistics"
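Running those two scenarios through a quick back-of-the-envelope calculation (a sketch only; the unit counts and prices are the rough figures from the post above, not published numbers) shows why unit share and revenue tell opposite stories:

```python
# Scenario A: high-volume, low-margin integrated chipsets (poster's estimate).
igp_units = 10_000_000
igp_price = 10.00

# Scenario B: low-volume, high-price discrete boards (poster's estimate).
discrete_units = 2_000_000
discrete_price = 200.00

igp_revenue = igp_units * igp_price            # $100,000,000
discrete_revenue = discrete_units * discrete_price  # $400,000,000

print(f"IGP revenue:      ${igp_revenue:,.0f}")
print(f"Discrete revenue: ${discrete_revenue:,.0f}")
print(f"Discrete earns {discrete_revenue / igp_revenue:.0f}x more")
```

So Intel can truthfully claim a 5-to-1 lead in units shipped while the discrete vendors take in four times the revenue; which statistic you quote depends on which story you want to tell.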
 

warezme

Distinguished
Dec 18, 2006
The only reason there is a "LOWEST COMMON DENOMINATOR" is Intel's CRAPPY IGP video. They are the lowest common denominator. The lead (pronounced LED, Pb) in the video industry's ass.
 
Guest
Intel is talking rubbish. Most games will run fine on good laptops with low-power discrete graphics chips. My Lenovo runs FEAR happily on low settings thanks to its nVidia 7300. And as for the Atom/netbook, nVidia's Ion has been demonstrated playing CoD4 and others.

Also all the new chipsets from AMD and nVidia feature integrated DX10 GPUs.

Intel just can't admit that nVidia and AMD are stealing all of their low-power gaming market, since anyone who casually or seriously plays the newest games will make sure they are running an AMD or nVidia GPU of some form.

If Intel wants to stay in the race, they need to spend some serious money and bring out a decent floating-point pipelined GPU (not some Pentium 1 multicore crap, i.e. Larrabee).
 

radguy

Distinguished
Jan 25, 2008
"What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?"
NO
 

ravenware

Distinguished
May 17, 2005
"What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?"

No, and I am not sure that is the point Intel is stressing here.

The majority of the computing world doesn't need discrete GPU cards, so obviously integrated will sell more. This also means there is a huge target audience for games that aren't too demanding to run on integrated hardware.

Intel is trying to convince developers to make lower-end-capable games, since the hardware base exists to support them.

I think this is a good idea for developers. I have installed some old-school games on my laptop that I was never able to play before, and they run well on my integrated hardware. Unreal is still fun many years later, no matter how dated the graphics look.
 

Joe_The_Dragon

Distinguished
Sep 19, 2006
It's sad when desktop cards running at x1 PCI-e speed are faster than Intel video.

It's sad that AMD and nVidia boards at about the same price have much better onboard video.

Why can't Intel have 64-128MB of SidePort RAM like AMD?
 

Robert17

Distinguished
Feb 17, 2009
Similar to Apple offering integrated solutions. Certainly with mobile computing currently on the upswing, requiring smaller, integrated hardware, the short-term future portends a substantial marketing (translated: "revenue stream") opportunity. Perhaps discrete graphics chipsets will become a boutique market, perhaps not. Who's to say holographic requirements won't force a paradigm shift in graphics performance that only the discrete suppliers can address, even ramping up to supplant CPU subsystems? The future offers opportunities for those prepared and willing to change.
 

skit75

Splendid
Why doesn't Intel just ask developers to include an options switch within the game that a user can click, which in turn disables everything they just paid for in their game, so Intel can say, "Oh yeah, we are compatible with that game!"
 

matt87_50

Distinguished
Mar 23, 2009
What a joke: "Here's your answer: Mercury Research showed that in 2008, for the first time, integrated graphics chipsets outsold discrete (graphics chips), and in 2013, we expect to see integrated graphics chipsets outsell discrete by three to one." You would need to play a lot of Chinese whispers with drunk people before this translated to "PC gamers are choosing IGPs over discrete cards." This is like when they were counting PS3s towards total Blu-ray player sales, except a lot more laughable.

Firstly, gamers make up a tiny part of all computer sales; secondly, the above statement is just as likely caused by more mobos having IGPs on them (for HTPCs etc.) as by a decrease in graphics card sales.

And another thing: if the EU is going after MS for bundling IE with Windows, what are they going to do when Intel starts bundling GPUs with their CPUs? At least there is physical money involved in the GPU market.
(BTW, I think the EU/MS/IE thing is BS.)
 