Windows 7: Play Crysis Without a GPU


tipoo

It's pretty sad that Intel's CPU beat their graphics chips at what a GPU should do. And what's sadder is that a GeForce 8400 is 5x faster than the CPU that beat their IGPs at graphics. What does that say about their IGPs?
(basing my reply on the numbers posted in the TechReport article)
 

dechy

"Systems built to take advantage of WARP from a hardware standpoint will be able to display graphics even when the video card is missing—or toasted. So if you’ve nuked your graphics card from a bad BIOS flash, fear not on a WARP-capable system. At least you will be able to boot back up until the video card is replaced."

That, from a massive enterprise point of view (28,000 users / 46,000 desktops & laptops), is a very nice feature. We have to deal with blown video cards monthly, and when it happens we're either waiting for the warranty vendor to come and replace the card (a few hours) or scrambling to pull an older one from an old clunker... either scenario means the user isn't able to work.

Definitely looking forward to WARP
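
On the app side, the fallback would presumably look something like this — a rough sketch against the Direct3D 11 API that ships with Windows 7 (the helper name is illustrative): try the hardware driver first, and drop to WARP only if device creation fails.

[code]
// Sketch: try a hardware device first, then fall back to the WARP
// software rasterizer if no usable GPU is present.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

HRESULT CreateDeviceWithFallback(ID3D11Device** device,
                                 ID3D11DeviceContext** context)
{
    const D3D_DRIVER_TYPE types[] = { D3D_DRIVER_TYPE_HARDWARE,  // real GPU
                                      D3D_DRIVER_TYPE_WARP };    // CPU rasterizer
    HRESULT hr = E_FAIL;
    for (UINT i = 0; i < 2 && FAILED(hr); ++i)
    {
        hr = D3D11CreateDevice(NULL, types[i], NULL, 0,
                               NULL, 0,              // default feature levels
                               D3D11_SDK_VERSION,
                               device, NULL, context);
    }
    return hr;
}
[/code]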
 

DXRick

DX has always had a hardware emulation mode (the reference rasterizer, which runs in software on the CPU). It lets a programmer test without worrying about which DX capabilities the GPU in the test machine actually supports.

So, MS is going to tout this as a feature of Windows 7, eh?
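
For anyone who hasn't used it: in Direct3D 9 you select that mode just by asking for the reference device instead of the HAL device at creation time (note the reference rasterizer ships with the DirectX SDK, not the retail runtime). A minimal sketch, with window setup omitted and an illustrative helper name:

[code]
// Create a Direct3D 9 device on the reference (software) rasterizer
// instead of the hardware (HAL) device.
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

IDirect3DDevice9* CreateRefDevice(HWND hWnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (d3d == NULL)
        return NULL;

    D3DPRESENT_PARAMETERS pp;
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;   // match the current desktop format

    IDirect3DDevice9* device = NULL;
    // D3DDEVTYPE_REF runs the whole pipeline on the CPU, so it works
    // (slowly) no matter what the installed GPU supports.
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_REF, hWnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &device);
    d3d->Release();   // the device holds its own reference
    return device;    // NULL if creation failed
}
[/code]

The reference device is far too slow to game on, which is exactly why WARP claiming playable-ish frame rates is the newsworthy part.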
 

sparky2010

Wait a sec. Basically, from what I can see on that chart, getting a single, not-so-great (anymore) GPU like the 8800 GTS and playing Crysis using this WARP thing gets you a higher average FPS than three GTX 280s or two 4870 X2s?

I don't know or care about running games on your CPU alone, but cranking up 3D performance this much? Hmmm... First, I hope it works, because high-end gaming could become so much cheaper. Second, it'll be interesting to see where games can go from here: since consumers will be able to buy mid-to-low-end graphics cards and still play games, game creators can start cranking up the graphics a lot. Also, the biggest problem with PC gaming is the price/performance ratio; it's MUCH cheaper to buy a PS3 to play games, so the PC game industry could get back on its feet again! AND, in 2-3 years, we'll start having photorealistic games!!! Yummy!

But that's if the crap Microsoft just spouted up there is true... lol
 

radguy

I would be excited if we could run this as secondary graphics. My notebook's battery life would double if this handled graphics for surfing the net, and my dedicated GPU would only turn on when I play games. It was about the only thing I really did like on the new MacBooks, but it needs a more seamless design. I'm hoping we get there; Larrabee might be the first step. Might being the key word.
 
Guest
@sparky2010: that's the FPS at 800x600 and the lowest settings. Don't confuse yourself.
 

dechy

DXRick, I've heard of such things in DX, but I've never seen it work. I've had to deal with two blown video cards at home (X800 XT & X1900 XTX), and there was no such thing as booting Windows with nothing in the AGP slot (and no IGP).

DX is theirs, so it doesn't matter how they spin it: whether they say "we're finally building an OS that can use the DX10/10.1 feature to boot Windows without any graphics card" or "we're introducing a new feature in Windows 7", it's basically the same thing.
 

JonnyDough

Awesome. Now if only my GPU could act as a partial processor when I do video encoding. Oh wait, we have that now too. I'd like to see GPUs and CPUs integrate more and be able to do anything. It would be awesome if Windows 7 let me play more GPU-heavy games on my older Athlon systems. It might actually be an OS worth spending my cash to "upgrade" to.
 

Niva

mtyermom,
You joke, but there is a possibility we might switch video to USB in the future, or an even smaller connector.

However, for the time being there will be a regular DVI output on the motherboard close to where your USB ports are, so calm down.
 

rmicro1

If the video card is toast, what will give the display to the monitor so you can continue working?

Perhaps just logging in remotely... but this means using another computer anyway... hmm.
 

that_aznpride101

[citation][nom]radguy[/nom]I would be excited if we could run this as secondary graphics. My notebook's battery life would double if this handled graphics for surfing the net, and my dedicated GPU would only turn on when I play games. It was about the only thing I really did like on the new MacBooks, but it needs a more seamless design. I'm hoping we get there; Larrabee might be the first step. Might being the key word.[/citation]

Good point, radguy. Laptop users will definitely get a huge benefit if this can be brought over to laptops to increase the lifespan of the battery, or even of the laptop itself.

On another note, there are two things I want to ask:

1) How are Nvidia and AMD going to react when they find out their integrated graphics chipsets won't be needed once Windows 7 comes out? This would seriously affect their revenue and their business model.

2) If graphics are computed on the CPU instead of the GPU, will this cause overheating issues and reduce the lifetime of the CPU?

Just some thoughts...

 

juvealert

Who is the jackass running a game at 800x600 resolution?!!

Is this another F...... from MS? The same thing they did with the Vista Capable logos?
 

bvickers

Here's a key application: web interface enhancement. IE9 and Silverlight 3 would likely have WARP integration that could allow an entirely new online experience, widely available without a client-side application running (other than the OS, obviously).
 

dreamer77dd

Well, considering the CPU can run graphics on top of everything else it's doing in the background, it's OK.

Now, if we could get the copper out and put fiber optics in, or something that sends information by light... I'm thinking we could send the bandwidth of a hundred graphics cards over a special hookup, like a USB 9.2 or something. CPUs are already small; no one has a reason to make graphics chips that powerful unless a Windows 10 comes out. I'm thinking it could be an amazing 3D IMAX kind of experience: gaming, movies, and everything else you could think of. Memories of loved ones from pictures that feel like you're there with them. I'm sure we could take things far beyond this if we had the need to.
 

hellwig

[citation][nom]rmicro1[/nom]If the video card is toast, what will give the display to the monitor so you can continue working?

Perhaps just logging in remotely... but this means using another computer anyway... hmm.[/citation]
I don't know of any remote-login utilities that allow the display of anything drawn in DirectX (including video); I always see the "No Capable DirectX video output found" message. Maybe this is a limitation of the current DirectX that will be altered with WARP.

As for getting local display output, maybe it can bypass the defective GPU and interface directly with the RAMDAC, in a similar fashion to when you have no graphics drivers installed?
 

techguy911

Wait, I'll get 15 FPS? OMG, call the press. This is sort of useless; most people already have a GPU. The question IS: what will the FPS be on my tri-SLI GTX 280 setup on Windows 7?
 

jhansonxi

"The following are benchmarks from Microsoft’s own test of Crysis, running at 800x600 with the lowest quality settings"

Those look like the numbers you see inside sales departments. Outside of that environment the numbers will be lower, because physics applies.
 
Guest
After DirectX 8.1, many games asked for specific cards, and DirectX would refuse to emulate instructions that the graphics card didn't support. So you had a Ti4200, and CoD4 refused to run on it while asking for an... FX5200!
I hope we'll go back to the time when software filled the hardware gaps...
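
For context, that gatekeeping is usually just a capability check in the game's startup code. Here's a minimal D3D9 sketch (the helper name is illustrative) of the kind of test that rejects a Ti4200, which reports pixel shader 1.3, while accepting an FX5200, which reports pixel shader 2.0:

[code]
// How a game typically decides a card is "too old": query the caps
// and compare the reported pixel shader version. A hard check like
// this rejects the (often faster) GeForce4 Ti4200.
#include <d3d9.h>

bool SupportsPS20(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                  D3DDEVTYPE_HAL, &caps)))
        return false;
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}
[/code]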
 