Windows 7: Play Crysis Without a GPU

Someone with an 8800 GTS, post what it does at 800x600, low quality. I'm guessing those MS numbers are 50%-100% better. If it scales, it would be amazing.
 
Systems built to take advantage of WARP from a hardware standpoint will be able to display graphics even when the video card is missing—or toasted. So if you’ve nuked your graphics card from a bad BIOS flash, fear not on a WARP-capable system. At least you will be able to boot back up until the video card is replaced.

What a ridiculous thing to say. WARP, or any other type of software rendering (which is exactly what WARP is), cannot possibly allow you to boot your computer if your video card is toast. It's just not possible.

WARP is just Microsoft marketing bullshit to make it sound like they developed something that hasn't already existed since the dawn of the personal computer.

Before we had add-in graphics cards, all software ran on the CPU, including graphics. When 3D accelerators (as they were known at the time) came along (I believe the Voodoo was the first one), they allowed, via the graphics API (application programming interface), most of the 3D work to be offloaded from the CPU to the card.

As GPUs became more complex and supported more features, graphics APIs evolved to let programmers exploit said features (oftentimes Microsoft worked hand in hand with the GPU manufacturers).

Now, with Windows 7, Microsoft plans to include a software rasterizer built into the DirectX API, but so what? I can just as easily install a software rasterizer on my current Windows build, and in fact I believe all DirectX versions already include one for testing purposes.
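
For reference, picking that existing test rasterizer in Direct3D 9 looks roughly like this (a minimal sketch; hwnd is assumed to be an existing window handle, and the REF device is only present on machines with the DirectX SDK installed):

[code]
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// Create a Direct3D 9 device on the reference rasterizer (D3DDEVTYPE_REF),
// the slow software implementation the SDK ships for correctness testing.
IDirect3DDevice9* CreateRefDevice(HWND hwnd)   // hwnd: an existing window (assumed)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return NULL;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed      = TRUE;
    pp.SwapEffect    = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = hwnd;

    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT,
                      D3DDEVTYPE_REF,   // reference (software) rasterizer instead of D3DDEVTYPE_HAL
                      hwnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                      &pp, &dev);
    d3d->Release();
    return dev;
}
[/code]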

If your graphics hardware stops working, there is no way for the computer to boot (the motherboard's BIOS will still beep, and the monitor still won't turn on; all you'll get is that little orange light that lets you know no signal is reaching the monitor). WARP or not, you still need to be able to get a signal to the monitor, either from a discrete graphics card or from the onboard chip via the connector on the motherboard.

The only thing WARP could allow you to do is keep running a 3D game if the graphics driver crashed, but that assumes Windows wasn't designed to blue screen upon the crash of a video driver. And even if Windows didn't blue screen (let's assume Windows used the same driver model Linux does, and all drivers were loadable modules that you could dynamically load and unload while the OS was running), why not just reload the driver or reboot?

Lastly, if anyone wants to get really excited about something, ray tracing is what you should keep an eye on. Intel recently demonstrated a setup with 16 logical cores (8 physical cores with Hyper-Threading) running a complex 3D demo in real time at 1920x1080. Ray tracing lets programmers create super-realistic environments without writing any complicated shaders and without dedicated graphics hardware.
 
Software rendering (a.k.a. the CPU) is not new -- WTF is Microsoft talking about? All they've done is let the CPU do the shader calcs and the like; it's just mathematics.

800 x 600 -- now that IS funny. I'm trying to remember when I last played any game at 800 x 600; I think it was 1994. Turn the resolution up to what most folks run today and turn the quality details up to at least medium, and you'll be measuring in seconds per frame instead of frames per second.

Just more evidence that Microsoft has gone off the deep end and is scrambling for something "new" to help sell Windows 7.
 
I'm not sure I like the idea of having my operating system (especially Windows) distributing my load between the graphics card and the CPU... It sounds like a setup for disaster in a lot of cases =/
 
I don't see how it would help laptop battery life; sure, you could turn off your GPU, but the CPU would draw extra power to render the graphics anyway.

Deadrats, I would assume WARP would need to be supported by the mainboard, i.e. the board would have a graphics connector on it and you could boot from that instead of your graphics card. Kind of like integrated graphics, but with no GPU required.
 
I think it's sad that this WARP stuff already beats Intel's integrated graphics. Of course, the only reason they are making this in the first place is the Vista/Aero mess with Intel IGPs.
 
The purpose of this is NOT to run Crysis. They showed those benchmarks just to prove that they were able to get a lot of processing power out of the CPU. Of course, it won't be as fast as nVidia's latest GPU, but that is not the point. Knowing how power-hungry Crysis is, what they achieved is impressive.

What they wanted people to realise, I think, is that it will be more than fast enough to display the Windows 7 desktop at high resolutions on multiple monitors. So people who bought their DX9 graphics card last year won't be pissed off, because they will be able to run Windows 7 on it. Don't forget that only a few people actually play games on their PC.

And to those who said that the DX SDK already includes a software rasteriser: WARP is something completely different. The reference rasteriser exists as a correctness reference for video card developers; its speed does not matter. The purpose of WARP is to run fast. From what I understand, the shaders are converted to x86 instructions, and it supports multi-threading; that's why it will be much faster than the reference rasteriser.
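
From what has been published about the Direct3D 11 API, asking for WARP instead of a hardware adapter is just a different driver type at device creation. A minimal sketch, assuming the Windows 7 / DX11 headers:

[code]
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device*        device  = NULL;
    ID3D11DeviceContext* context = NULL;
    D3D_FEATURE_LEVEL    got;

    // D3D_DRIVER_TYPE_WARP = the fast software rasterizer;
    // D3D_DRIVER_TYPE_REFERENCE would give the slow SDK reference device instead.
    HRESULT hr = D3D11CreateDevice(
        NULL,                      // default adapter (must be NULL when a driver type is given)
        D3D_DRIVER_TYPE_WARP,
        NULL,                      // no external software module
        0,                         // no creation flags
        NULL, 0,                   // accept the default feature-level list
        D3D11_SDK_VERSION,
        &device, &got, &context);

    if (SUCCEEDED(hr)) {
        printf("WARP device created, feature level 0x%X\n", (unsigned)got);
        context->Release();
        device->Release();
    }
    return 0;
}
[/code]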

People also need to understand that in a few years, CPUs with 32, 64, or more cores will be in the field. Why not use that processing power to handle graphics? Intel and Microsoft are pushing in that direction; nVidia is going in the opposite direction (running general-purpose applications on the GPU). Who's going to win? Nobody knows, but I would bet on Microsoft/Intel.
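
Nobody outside Microsoft knows exactly how WARP splits its work internally, but purely as a toy illustration of the idea (fanning per-pixel work out across however many cores the CPU reports), something like this is all it takes:

[code]
// Toy illustration only -- not WARP's actual code.
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

int main()
{
    const int width = 1280, height = 720;
    std::vector<uint32_t> framebuffer(width * height);

    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 4;   // fall back if the core count isn't reported

    // Interleave scanlines across the worker threads.
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < cores; ++t) {
        workers.emplace_back([&, t] {
            for (int y = (int)t; y < height; y += (int)cores) {
                for (int x = 0; x < width; ++x) {
                    // Stand-in for real per-pixel shading work.
                    framebuffer[y * width + x] = (uint32_t)((x ^ y) & 0xFF);
                }
            }
        });
    }
    for (auto& w : workers) w.join();

    printf("Shaded %d pixels on %u threads\n", width * height, cores);
    return 0;
}
[/code]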
 
"Systems built to take advantage of WARP from a hardware standpoint will be able to display graphics even when the video card is missing—or toasted. So if you’ve nuked your graphics card from a bad BIOS flash, fear not on a WARP-capable system. At least you will be able to boot back up until the video card is replaced."

That's the line I found funny. I mean, it would be pretty hard to view graphics on a blown-out video card, but if there is NO graphics card, what are you going to plug the monitor into... your butt????

The thing I found kind of cool about it is that if you have a DX9.1-DX10.1 graphics card, you can still play DX11 games. So with my DX10 GTX 260 I could play the newer DX11 games when they come out. Also, I wonder: say you're playing Crysis at the highest res and it's choppy, will this make it smoother using the processing power you have? Maybe it won't be as fast as SLI, but it may raise it a few FPS?
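
For what it's worth, the published D3D11 material describes this working through "feature levels": the application asks for the highest level it would like and the runtime hands back whatever the installed card actually supports. A minimal sketch, assuming the DX11 headers (the helper name is just for illustration):

[code]
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Ask for DX11 first, then fall back to what a DX10-class card
// (or even a DX9-class card) can actually do.
static const D3D_FEATURE_LEVEL levels[] = {
    D3D_FEATURE_LEVEL_11_0,
    D3D_FEATURE_LEVEL_10_1,
    D3D_FEATURE_LEVEL_10_0,   // roughly where a GTX 260 lands
    D3D_FEATURE_LEVEL_9_3,
    D3D_FEATURE_LEVEL_9_1,
};

bool CreateBestDevice(ID3D11Device** device, ID3D11DeviceContext** context,
                      D3D_FEATURE_LEVEL* got)
{
    HRESULT hr = D3D11CreateDevice(
        NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
        levels, sizeof(levels) / sizeof(levels[0]),
        D3D11_SDK_VERSION,
        device, got, context);
    return SUCCEEDED(hr);   // *got tells you which level the card gave you
}
[/code]

Whether a given DX11 game actually enables its rendering path on a 10_0-level card is up to that game, of course.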
 
What this might be able to do is bring computer gaming to the mainstream, letting people who don't have video cards play games that require them... or it might alienate them :O
 
[citation][nom]turk_1000[/nom]Come on. Someone with an 8800GTS post some numbers.[/citation]

OK... I have an 8800 GTS (512 MB), slightly OC'd (not much), and I used Crysis to run multiple benchmarks when I built the system. Rest of the specs as follows:

Dual-core E8400 3 GHz overclocked to 4 GHz
4 GB DDR RAM
Raptor HDD
Vista Ultimate 64-bit

DX10
1,680 x 1,050 (Very High / no AA): 18-25 FPS
1,680 x 1,050 (High / no AA): 28-30 FPS
1,680 x 1,050 (Medium / no AA): 35-42 FPS

800 x 600 (Very High / no AA): 30-45 FPS (not sure why it varies so much)
800 x 600 (High / no AA): 40-55 FPS
800 x 600 (Medium / no AA): 45-60 FPS
800 x 600 (Low / no AA): 50-60 FPS

Sooo... if the numbers MS is giving are true... I'm impressed. :)
 
Lol,

This is pretty sad. "OMG, the i7 outperforms the integrated Intel chip. By 2 FPS." 5 FPS and 7 FPS are UNPLAYABLE in Crysis. This is POINTLESS.

WARP = Fail.

Cheers,
 
I'd just like to add to my above comment that my benchmarks under Windows XP SP3 running DX9 averaged about 10-15 FPS higher than with Vista and DX10.
 
Oh,

And trying to say my DX10 card will now take on DX11? No, it won't. Your CPU will. Your GTX 260 will do nothing in DX11 if WARP in Windows 7 is doing the work.

I see the point of WARP. But it's stupid to think of it as anything but a way to get Windows 7 on OLD machines to increase sales. For a user with any graphics card WHATSOEVER, WARP is totally stupid and pointless.

Cheers,
 
So if the video card is completely toasted (whatever that means...), where does the video signal come out? If you remove the graphics card, where do you connect the monitor?
 
So that's how Intel plans to get widespread support for Larrabee...
 
This is great from a corporate perspective, I think, because if a graphics card fails, the system can still run. It will be slow, but it will at least work. From a gaming perspective it is pretty useless, in my opinion, though. I mean, sure, you can now run Crysis without a graphics card, but it will still be unplayable at 5 frames a second. If you like watching slide shows and getting your ass kicked, it might be okay for you. I do like the idea, assuming I can totally ignore the exploded video card and still manage to run the system until my new video card shows up in the mail.
 
Lots of stupid comments about running windoze with (or without) a blown video card and how great it is for corporate dumbasses...
Wake up: windblows DOES NOT run without a video card, as there is no means to attach the PC to a monitor.
All the m$ bullshit is really about how it could display their lousy "3D" desktop with blown DRIVERS, when the HW-accelerated 3D card would be forced to run in generic 2D VGA mode.
On the contrary, penguins run fine with no video card (or mouse and KB) at all, as you can remotely log in. X Windows runs too, if you really need a graphical UI.
 
My view on this is pretty simple. If it does something useful (e.g., standing in during a driver crash), more power to them. If it doesn't help anything, so what? I'm not inclined to believe it's going to ruin anything.

And of course it's not going to work when the card physically dies; you have to get the signal out somehow. I think the idea is more about helping people with lower-end computers avoid trouble when it comes to running the OS. If the IGP can't take it, why not use some cycles on a spare core to give it a boost?
 