Intel: Integrated Graphics is Where It's At

Page 4 of a Tom's Hardware community forum thread.
Status
Not open for further replies.

Dave K

Distinguished
Jan 13, 2009
115
0
18,680
I always try to keep in mind that it's all relative: today's high-end workstations will be slower than low-end integrated systems in a year or two. Even at the current trajectory, integrated graphics will be more than capable of rendering 1080p streams in the near future.

I'd also expect significant cost savings in low- to mid-range systems from higher integration. You'll find systems that are very good performers and still fit nicely in the $500-$1000 price range.

As for freeing up the peripheral chipset: putting the GPU on the HyperTransport bus would certainly do that as well (though it would likely limit you to a single graphics card).
 

tipmen

Distinguished
Jun 30, 2008
244
0
18,680
Don't pick on integrated graphics! It saved my ass once. My GPU fried and there was no way I could get a replacement for a few days, which wouldn't have bugged me except that I had to fine-tune a paper for class. I was able to proudly do it on my integrated ATI Radeon HD 3300.
 

marraco

Distinguished
Jan 6, 2007
671
0
18,990
[citation][nom]ta152h[/nom]Discrete cards are probably going away at some point in the future, just like floating point processors did.
...[/citation]But floating-point processors failed. Discrete graphics cards are nothing other than math coprocessors.
 
Intel may score low numbers in SuperPi benchmarks, but low scores in games with integrated "extreme" graphics aren't the same thing, Intel ;)

On the other hand, I've always wanted options in the video card control panels to disable textures and complex rendering schemes, so that any system could run any game, sort of...
 

anamaniac

Distinguished
Jan 7, 2009
2,447
0
19,790
I'm personally undecided about this.
I realize that I won't be a gamer until I die (most likely), but silicon is still in my blood. I personally prefer a PC over a laptop almost any day; my old PC ($700 20 months ago, plus a $90 4670, all in CAD) is a beast compared to my father's new and expensive laptop (4 months old, $1,200).
However, when I finally calm down from gaming, a laptop will likely work just fine. If I'm not gaming, I don't need this much juice; a $300 netbook would do just fine.
So integrated graphics are an option, if only because hardcore gaming graphics aren't a must for everyone. I know I'll likely end up with a work laptop and a powerhouse PC.
 

Sentinel

Distinguished
Oct 3, 2008
5
0
18,510
Intel, like many other large companies these days, seems more interested in squashing any competition for profit's sake (i.e., suing AMD, Nvidia, etc.), and when that doesn't work, they even resort to STUPIDITY such as this.
 

Cuddles

Distinguished
Mar 13, 2008
266
0
18,790
And how many of those people turned off their IGP when they put in the card? You can buy a card for $50 that will play ten times better than any Intel IGP. Intel is the very last company that should be talking about graphics.
 
G

Guest

Guest
It makes little or no sense to put the CPU and GPU on one die if the CPU is already maxing out its ~130 W power envelope; how could you possibly add anything to it without making sacrifices on both sides? AFAIK, AMD is aiming at the value demographic with its Fusion chip; it's not meant to replace their ATI cards. If you only had a 65 W CPU and added a 65 W GPU (way more power than a northbridge could dissipate), you could stay within that envelope and still have quite a bit of power (and some very nice integrated graphics), but neither would be at the high end of performance.
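The power-budget argument above can be sketched in a few lines. This is only an illustration of the arithmetic; the 130 W figure and the `fits_envelope` helper are assumptions for the example, not a real thermal model.

```python
# Illustrative sketch of the shared-TDP argument: a single-die CPU+GPU
# must split one thermal envelope. Numbers are era-typical assumptions.
SOCKET_TDP_W = 130.0  # assumed high-end desktop envelope


def fits_envelope(cpu_tdp_w: float, gpu_tdp_w: float,
                  budget_w: float = SOCKET_TDP_W) -> bool:
    """Return True if the combined parts stay inside the TDP budget."""
    return cpu_tdp_w + gpu_tdp_w <= budget_w


# A CPU already using the full 130 W leaves no headroom for an on-die GPU,
# while a 65 W CPU plus a 65 W GPU fits exactly inside the same envelope.
print(fits_envelope(130.0, 65.0))  # False
print(fits_envelope(65.0, 65.0))   # True
```

The point of the sketch is the poster's tradeoff: to fit a GPU on-die, the CPU portion must give up roughly the same wattage the GPU consumes.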
 

JonnyDough

Distinguished
Feb 24, 2007
2,235
3
19,865
This is bad. Very bad. Why am I saying this? Because less choice for the consumer = bad. Integrated means you have to upgrade your whole system, which means they get to sell you more stuff, which ultimately means it becomes MORE COSTLY for the consumer and they get a bigger piece of the pie. Company A can produce motherboards with chipset/CPU/graphics all rolled into one for X cost. Company B has a great graphics chip design but no platform to sell into. Who do you think doesn't have a chance to compete? If you can't "do it all," you can't compete. It's monopolization. It's the same reason Wal-Mart is EVIL: push smaller companies out of business until you have such magnificent capital and power that you rule the world. Everyone knows the world government is one giant corporation. That government is probably already here; it just hasn't gone public yet.
 

D2ModPlayer

Distinguished
Jan 13, 2007
36
0
18,530
That's funny. He was pointing out my error and spelled "discrete" incorrectly. Plus, I was just making a joke. :)

Anyway, Intel finally succeeded in making DirectX 8 work a whole decade after everyone had moved to DX9. Who knows if they'll ever get DX10 working on their solutions. Some games run on Intel integrated solutions because they claim to use DX10 but are actually only using DX8 or DX9 features; anything else simply won't run. I haven't tested the latest and greatest (ha) from Intel since I don't own a recent notebook, but considering their history I doubt much has changed. I remember years ago they had a horribly slow solution (or is it a problem?) for DX8 that wouldn't even run DX8. It shipped on PCs for years and was already dirt slow. Then they came out with one they said was 20 percent faster. I didn't know whether to laugh or cry. I can't remember its name; perhaps that's a mental block to prevent insanity! The chipset was already only 20 percent of the speed of a real video card. I guess 5 frames a second is playable to them.
 
G

Guest

Guest
[citation][nom]UhHuhSure[/nom]It makes little or no sense to put the CPU and GPU on one die if the CPU is already maxing out its ~130 W power envelope; how could you possibly add anything to it without making sacrifices on both sides? AFAIK, AMD is aiming at the value demographic with its Fusion chip; it's not meant to replace their ATI cards. If you only had a 65 W CPU and added a 65 W GPU (way more power than a northbridge could dissipate), you could stay within that envelope and still have quite a bit of power (and some very nice integrated graphics), but neither would be at the high end of performance.[/citation]
Like I said, this solution is only for netbooks and laptops, and perhaps for low-cost desktops as well. We're talking about a sub-$500-600 system.
 

zoolcomputers

Distinguished
Mar 13, 2009
23
0
18,510
I love integrated graphics: cheap, nice, and suitable for kids. BUT... not for games, and not for me.
If I can play CRYSIS with everything on high plus full AA on integrated graphics by 2012, I will become a fan... and will drop my $200+ card.
 

canadian87

Distinguished
Apr 19, 2009
11
0
18,520
/sarcasm on

I think all game developers should be looking into making video games playable at the lowest graphics settings and 640x480, and they should also make sure that platforms like the iPhone can play them smoothly. I'm sick and tired of having to log on to my supercomputer loaded with a crazy-expensive ($240) GTX 260. I really want to play WoW and Crysis on my iPod, but NOOOO, these game developers never think about the massive base of gamers who want to play at these well-populated resolutions.

/sarcasm off

Seriously, Intel, nobody is using your integrated "graphics" processor for anything but Descent 2 and Doom. So sure, you have the largest consumer graphics base, but that's because colleges want their students to have pretty 3D pie charts. Don't count that as a big win, because if they didn't go with an Intel motherboard, they'd go with AMD, and even colleges know that the price/performance of your processors is best. Just be happy you've got the #1 processor, and stop calling IGPs anything more than what they are: IPCMs, Integrated Pie Chart Makers.
 