Intel's Next IGP Slated to Run Sims 3

Status
Not open for further replies.

jincongz

Distinguished
Aug 21, 2008
12
0
18,510
0
Even if you're not dripping with sarcasm, you should be.
The Mac version of Sims 3's minimum requirements:
ATI X1600 or NVIDIA 7300 GT with 128 MB RAM or Intel Integrated GMA X3100
WoW on PC:
32 MB 3D video card with Hardware T&L or better
Battlefield Heroes:
DirectX compatible 64 MB graphics card with Pixel Shader 2.0 or better

As far as I'm concerned, their X3100 already hits all of their targets. Congratz.
 

makotech222

Distinguished
Mar 30, 2006
272
0
18,780
0
My GMA 4500MHD can't even run RuneScape, Europa Universalis 3, or most basic games without serious lag. I hate it and regret going integrated.
 
G

Guest

Guest
What does Intel mean by "run"? So far their definition of "run" seems to encompass a slideshow with minimal graphics quality. Sure, they can "run" a game, but who wants to play at 10 fps and watch their character bounce randomly around the screen, or watch a Blu-ray movie and lose half the frames to drops?
 

kravmaga

Distinguished
Dec 10, 2009
74
0
18,630
0
Sounds like an invitation for jeers, but Intel is actually doing good work here.
Integrating better minimalist GPUs into mainstream machines will establish a level entry-level standard for PC gaming. Comparing these chips with Nvidia or AMD GPUs is like comparing a stock Camry engine with a custom-ordered aftermarket BMW engine.

There's a huge population of people who buy the cheap computer with an IGP and would never think of buying separate parts just for gaming. Improving the performance of those machines would create a wider user base for the many titles with conservative requirements... PC gaming would benefit far more from a moderate bump in GPU performance across the 90% of low-end beaters out there than from a little extra performance on the 1% of very high-end GPUs. Think of upcoming titles like StarCraft 2 or Diablo 3, where the fun isn't just in the eye candy; if those titles run correctly on these chips, I'm sold.
 

JustinHD81

Distinguished
Mar 31, 2009
39
0
18,530
0
At the moment, for the most part, you can only get Intel integrated graphics with Intel CPUs, while you get more of a choice with AMD CPUs, since HyperTransport is a little more open than Intel's FSB/QPI. So what Intel is really trying to do is raise its performance so that even the low-end Nvidia and ATI GPUs no longer have a market. Still, if you're into gaming and want integrated graphics, you're better off with AMD; at least you get some choice.
 

anonymousdude

Distinguished
Jun 20, 2009
711
16
19,065
49
I wonder how Intel defines the word "run". I define "run" as being able to play it on your PC, smooth or not; I define "playable" as at least 30 FPS. Also, what resolution are they using? Even my laptop with an AMD IGP can play Crysis at 800x600 on low settings, but it's still choppy in parts.
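The 30 FPS "playable" cutoff above works out to a per-frame rendering budget; here's a quick sketch of the arithmetic (the function name is mine, not from the thread):

```python
# Illustrative only: converting a frame-rate threshold to a per-frame time budget.
def frame_time_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given FPS."""
    return 1000.0 / fps

# At the 30 FPS "playable" threshold, each frame must render in ~33.3 ms;
# the 10 FPS "slideshow" case allows 100 ms, which the eye reads as stutter.
print(round(frame_time_ms(30), 1))  # 33.3
print(round(frame_time_ms(10), 1))  # 100.0
```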
 

ta152h

Distinguished
Apr 1, 2009
1,207
2
19,285
0
You'll always have the kiddies who don't really understand games and throw around terms like "eye candy" to sound cute, but the reality is that fancy graphics don't make a fun game. If you're a complete moron, sure. If you're a simpleton, you bet. If you're completely superficial ...

That stuff is relatively easy. Actually making a mentally stimulating game is quite hard. It's not about resolution, it's about thought. There were old games that were completely text-based and still fun. Ms. Pac-Man was a hell of a lot more popular than any modern title, although I never fancied it. Defender and Gauntlet were atrociously addictive arcade games that by today's standards would be ancient. Defender would even slow down at certain points, and that became one of the game's charms.

One thing is clear, though: power-hungry, noisy ovens running inside computers are never desirable. They cost a lot to buy, are expensive to run, and are unpleasant to be around. For some people they're worth it, but for the vast majority, Intel solutions are more than they need. You can play a lot of really fun games without paying massive amounts for your computer or your electric bill, and without a noisy oven in your office.

Also keep in mind that today's Intel IGPs are just as fast as the old discrete cards that played the games you thought were really fun years ago. Did those games suddenly become less fun? They didn't change, and human nature doesn't change that fast, so they're still plenty fun.

I still like playing games from the 1980s.

So, if Intel can boost performance without boosting cost, power use, and noise, that's a really good thing for far more people than ATI producing a screaming $500 card. Both are good, of course; it's just that the Intel solution will affect more people. It's not a trivial improvement. Counter-intuitively, the barn burners from ATI and Nvidia are, since they affect relatively few people.
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
5
Larrabee is dead... another flimsy integrated GPU... Well, if Intel could work with Nvidia on Hybrid SLI, that would be great: Hybrid SLI that can actually switch between GPUs without having to reboot. If they could do that efficiently, without taxing the CPU or the GPU during the switch, it would be great. I don't think PCIe 2.0 is capable of that, though...
 

warezme

Distinguished
Dec 18, 2006
2,408
27
19,840
20
Good gawd, ta152h, you won't get any sympathy around here with your pathetic tirade of excuses for your slow, antiquated excuse for modern hardware. You don't just like playing games from the '80s... you ARE from the '80s! Put away your Flock of Seagulls CDs and DeLorean posters and step into the modern world.
 

brockh

Distinguished
Oct 5, 2007
513
0
19,010
19
[citation][nom]ta152h[/nom]You'll always have the kiddies that don't really understand games, and throw around gay terms like "eye candy" so they sound cute, but the reality is fancy graphics don't make a fun game. If you are complete moron, sure. If you're a simpleton, you bet. If you're completely superficial ...That stuff is relatively easy. But, actually making a mentally stimulating game is quite hard. It's not about resolution, it's about thought. There were old games that were completely text based that were fun. Ms. Pacman was a Hell of a lot more popular than any modern title, although I never fancied it. Defender and Gauntlet were atrociously addicting arcade games, that by today's standards would be ancient. Defender would even slow down the game at certain points, that became one of the charms of the games.One thing is clear though. Power hungry, noisy, ovens that run in computers are never desirable. They are expensive to run, and are unpleasant to be around, and cost a lot of money. For some people, they're worth it, but for the vast majority of people, Intel solutions are more than they need. You can play a lot of really fun games, without paying massive amounts of money for your computer, or electrical bills, and not having a noisy oven in your office.Also keep in mind, Intel IGPs of today are just as fast as old discrete cards that played games you thought were really fun years ago. Did those games suddenly become less fun? They didn't change, and human nature doesn't change so fast, so they're still plenty fun. I still like playing games from the 1980s. So, if Intel can boost performance without boosting cost, power use, and noise, it's a really good thing for way more people than ATI producing a $500 card that runs like a raped ape. Both are good, of course, it's just that the Intel solution will effect more people. It's not a trivial improvement. 
Counter-intuitively, the barn burners from ATI and NVIDIA are, since they effect relatively so few people.[/citation]

I understand what you're saying, but I think it's a bit unnecessary to call everyone who likes new games a simpleton while preaching that a game's actual content matters more than its superficial qualities. There are horrible-looking, horrible-playing games as well. Everyone has their thing.
 

JonathanDeane

Distinguished
Mar 28, 2006
1,469
0
19,310
6
Yes, just because E.T. on the 2600 had horrid graphics doesn't mean it was a great game...

I agree graphics are only a small part of a game, and an old game can still be tons of fun (for example Asteroids on the 2600, or Jungle Hunt, or Mario Bros.), but without the newest $500 graphics cards, your low-end PC graphics would still be CGA... It's the upper end, and the desire to have that new pretty thing, that drives tech forward, not the low end that barely runs checkers at 20 fps.

Sure, some games are all flash and no content, but that's nothing new (the first things that come to mind are Space Ace and Dragon's Lair: almost not games at all, but they sure looked pretty in the '80s).

Now for the actual article: I pray Intel delivers something substantial this time (granted, for the usual business machine their chips were just fine). For someone who buys a computer at Walmart or Best Buy this could be a decent deal, and it might finally pull the bottom end up. It seems like integrated graphics have been stuck at the same level for quite some time.
 

WheelsOfConfusion

Distinguished
Aug 18, 2008
705
0
18,980
0
[citation][nom]ta152h[/nom]One thing is clear though. Power hungry, noisy, ovens that run in computers are never desirable. They are expensive to run, and are unpleasant to be around, and cost a lot of money.[/citation]
... this is IGPs we're talking about. Nobody expects Intel to do anything with discrete graphics, but ATI and Nvidia both have moderately powerful, cool-running integrated chips that run circles, squares, and icosahedrons around anything from Intel (well, Nvidia did before throwing in the towel, since Intel locked them out of the IGP market for Intel boards). There's no need to compare modern IGPs to the discrete cards of eight years ago; the aging 780G chipset is nearly as powerful as a budget gaming machine from TWO years ago. That's Intel's competition here, and AMD hasn't stopped developing integrated chips. By the time Intel's product is out, I'm willing to bet we'll see some ATI Radeon HD 4400-something graphics that will once again spank Intel's offerings up and down the charts, from gaming to HD video. Hell, I bet a 780G would still do it. You can get motherboards with these chips for fifty or sixty bucks, new. I know someone who plays Sims 3 on one today with an Athlon X2 5050e, a total power sipper based on an old architecture! That's the kind of step backwards Intel is reaching for with graphics, if this slide is real. There's nothing stopping you from playing old games with this hardware, but now you have the choice to actually play more recent games if you want.

That's games, though. It's not just gravy for casual gamers: HD video is better, and you get a better eye-candy (whoops, there's that term again!) experience in Windows' Aero interface with them than with Intel's graphics.
 

pender21

Distinguished
Nov 18, 2008
125
0
18,690
1
Every in-house GPU Intel develops (or announces) is crap: just a minor hardware update to support newer APIs that still can't actually play any games requiring the more modern shader models.

I wish AMD and Nvidia would make their IGP chipsets cheaper for notebooks, just to keep GMA 900/950/3100/4500 garbage out of them.

Also, why hasn't AMD integrated a GPU onto their mobile CPUs yet? They certainly have the ATI expertise to do so, and you can't upgrade notebook GPUs anyway, so who cares if the GPU is on-die?
 

WheelsOfConfusion

Distinguished
Aug 18, 2008
705
0
18,980
0
[citation][nom]pender21[/nom] Also, why has AMD not integrated a GPU onto their mobile CPUs yet? They certainly have the ATI expertise to do so and you can't upgrade notebook GPUs anyway so who cares if the GPU is on die.[/citation]
There wouldn't be nearly enough room for even an Intel-level GPU on the CPU die, especially since AMD's mobile CPUs are all still at 65nm as far as I know.
 

BartG

Distinguished
Jun 6, 2008
60
0
18,640
2
Makes perfect sense: why not have an IGP that can run the run-of-the-mill games with no issue? It can't be that hard to do, and even though it won't please the enthusiast or hardcore gamer, it might just serve the other 75% of the computer gaming community.

 

hemelskonijn

Distinguished
Oct 8, 2008
412
0
18,780
0
WoW already ran fine on my Extensa 5230M.
Though I don't have the laptop any more, I'm pretty sure it ran on an Intel GMA X4500HD.

Sure, it didn't run with every setting on high, but it was nowhere near low either.
I don't get what the big deal is, since even my NetVista 8309-15G was able to run WoW on its IGP, and that ran on Intel Extreme graphics
(albeit on low settings, with a frame rate of about 26 in raids).

Since WoW runs on nearly everything, I don't think it's really a selling point.
 
G

Guest

Guest
WoW played on a 7300 GT? This is a joke; I have a bad time with a 7900 GT and a 2.6 GHz quad core. WoW needs plenty of resources, especially in town where hundreds of players are around.
 

trinix

Distinguished
Oct 11, 2007
197
0
18,680
0
[citation][nom]jaxwins[/nom]wow played on 7300gt? this is a joke, i have bad times with 7900gt and quad core 2,6ghz. wow needs plenty of resources specially in town where hundred of players are around[/citation]

They never said where you can run it. I bet in the middle of nowhere, where no other players are around, you could run it with no problems at all.

Of course, if you want to run around the towns, you might need a beefier system. And I wonder whether it's really your computer that can't take it, or just the server.
 

jonpaul37

Distinguished
May 29, 2008
2,481
0
19,960
81
[citation][nom]ta152h[/nom]You'll always have the kiddies that don't really understand games, and throw around gay terms like "eye candy" so they sound cute, but the reality is fancy graphics don't make a fun game. If you are complete moron, sure. If you're a simpleton, you bet. If you're completely superficial ...That stuff is relatively easy. But, actually making a mentally stimulating game is quite hard. It's not about resolution, it's about thought. There were old games that were completely text based that were fun. Ms. Pacman was a Hell of a lot more popular than any modern title, although I never fancied it. Defender and Gauntlet were atrociously addicting arcade games, that by today's standards would be ancient. Defender would even slow down the game at certain points, that became one of the charms of the games.One thing is clear though. Power hungry, noisy, ovens that run in computers are never desirable. They are expensive to run, and are unpleasant to be around, and cost a lot of money. For some people, they're worth it, but for the vast majority of people, Intel solutions are more than they need. You can play a lot of really fun games, without paying massive amounts of money for your computer, or electrical bills, and not having a noisy oven in your office.Also keep in mind, Intel IGPs of today are just as fast as old discrete cards that played games you thought were really fun years ago. Did those games suddenly become less fun? They didn't change, and human nature doesn't change so fast, so they're still plenty fun. I still like playing games from the 1980s. So, if Intel can boost performance without boosting cost, power use, and noise, it's a really good thing for way more people than ATI producing a $500 card that runs like a raped ape. Both are good, of course, it's just that the Intel solution will effect more people. It's not a trivial improvement. 
Counter-intuitively, the barn burners from ATI and NVIDIA are, since they effect relatively so few people.[/citation]

Am I missing something here? You really think Ms. Pac-Man can hold a candle to WoW's 12+ million subscribers?
 