Windows 7: Play Crysis Without a GPU

For the Record:

I just ran the Crysis 64-bit benchmark in Vista 64-bit, on a Q6600 (3.2 GHz) and an 8800 GTS (G92), at 800x600 resolution with "Low" settings on all graphics options.

I ran through the benchmark/demo at least three times, and each time it reported 100 FPS as my average at the very end.

I'm sure I'm not running the same setup Microsoft did, and my 8800 GTS does have a slight overclock on the GPU. But if I can hit a 100 FPS average where they're getting an 84 FPS average... something tells me their little WARP thing ain't all it's cracked up to be.
 
"Systems built to take advantage of WARP from a hardware standpoint will be able to display graphics even when the video card is missing—or toasted. So if you’ve nuked your graphics card from a bad BIOS flash, fear not on a WARP-capable system. At least you will be able to boot back up until the video card is replaced."

And what planet do you live on where the OS lets you use a fried video card? You know nothing about electronics, do you?
 
I find it quite odd to plug the monitor cable into my @ss if I don't have a video card. I wonder how good this new technology that lets me see things on my screen would be if I have no video card.
 
So let me get this straight: a not-yet-released, $1000+ CPU beat an old GMA X3100 (which costs virtually nothing) and I'm supposed to be excited? I say GMA X3100 because I think the 4500 can probably do 10 FPS at 800x600 on lowest settings... all while burning more power, even.
 
So a $1000+, latest and greatest CPU beat Intel's poorly received, old (virtually free) GMA X3100... (probably on a laptop, downclocked).

WOW AMAZING!!!!!!!!!!

All while burning more electricity. I say GMA X3100 because that's the only Intel DX10 part that could possibly score that low, and they probably had to use old drivers. Anyone want to bench their GMA 4500 in Crysis for us?
 
Graphics cards have been emulated for years (check out VMware, AMD's integrated virtualization, or Sun's VirtualBox). The really cool aspect of WARP is that it allows 3D acceleration, something I never thought was possible from a virtual chip. Obviously, unless the technology gets 1000% better within the next 3 years, this is totally useless for most gamers. But if you're like me and running a virtual machine, this technology could be a huge boon. I'm currently running a virtual copy of Windows 2000 inside a Linux installation, mostly for compatibility, with a virtualized video card with 128 MB of VRAM courtesy of my AMD CPU's virtualization support. That means my virtual copy of Win2k is 100% functional, except for 3D acceleration, which I thought was impossible. This article changed that, but we'll have to see how long it takes this technology to reach areas where it will actually be useful.
 
A high-end CPU has ALWAYS been able to run a game at 5 FPS.
It's called software rendering. What's new here?
The only thing new here is Microsoft yet again attempting to gain dominance in another sector.
I hope it fails, not only because I love my graphics cards, but because if Microsoft ever gains control of the PC gaming industry, the PC will be turned into a console.
I'd rather pay more $$$ for my expensive graphics cards and keep my freedom, thank you very much.
 
I don't mean SLI here, but rather a single board with multiple sockets, sold with (say) 1 or 2 sockets filled, where one then adds further modules to scale performance. Ditto for the RAM: multiple sockets, buy a card with 1 filled to give 2 GB (assuming modern parts, etc.), with 3 or 4 more slots left empty for future expansion. Just example numbers, of course.

I had an old Diamond Stealth 64, way back in the day, that had upgradeable video RAM. I could go from 2 MB to 8 MB by buying individual VRAM chips. I had other cards where you could swap out the on-board VRAM as well, I just don't remember which ones. It was expensive as hell and surely not available on any cards today, but it was possible, and it's still possible for them to do it. Just not worth it, really.

It would be even nicer if you had, say, a Radeon HD 4850 and could add another R700-class GPU to it via an empty socket, like you said. :) Without needing an SLI board or having to buy an extra card or the X2 version right off the bat.
 
[citation][nom]tipoo[/nom]It's pretty sad that their CPU beat their graphics chips at what a GPU should do. And what's sadder is that a GeForce 8400 is 5x faster than their CPU, which beat their IGPs at graphics. What does that say about their IGPs? (Basing my reply on the numbers posted in the TechReport article.)[/citation]

Microsoft doesn't manufacture IGPs! LOL. Intel, Nvidia, and AMD are the players in that market.

Much of the confusion in these responses stems from the fact that Microsoft was misquoted by Tom's. Microsoft never suggested that you could reasonably play Crysis on the CPU.

RTFA 😛
 
This is so obviously relevant to Larrabee; why not focus your attention on that aspect? Don't you see what this tells us? It gives us a first cut at the kind of performance we can expect from Larrabee. Assume each Larrabee core has performance similar to a Core 2 core (reasonable? Larrabee cores are smaller and generally less capable, but perhaps as capable when it comes to graphics), say 1.4 average FPS per core, and that the software rasterizer scales well with the number of cores (no reason it shouldn't; graphics rendering is a very parallel problem). Then the 32-core Larrabee would be equivalent to ~45 FPS average, or slightly better than the GeForce 8500 GT. The 48-core Larrabee may be closer to 70 FPS, but still not on par with the 80 FPS of the GeForce 8800 GTS. I'd say this is close to early expectations of Larrabee.
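Just to make that back-of-the-envelope math explicit, here's a throwaway sketch of the extrapolation. The 1.4 FPS-per-core figure and the perfectly linear scaling are assumptions from the post above, not measured Larrabee numbers:

[code]
// Back-of-the-envelope extrapolation only: assumes ~1.4 average FPS per
// Core 2-class core and perfectly linear scaling of the software
// rasterizer with core count (both are guesses, not measurements).
#include <cstdio>

int main()
{
    const double fpsPerCore = 1.4;                // assumed, not measured
    const int coreCounts[] = { 8, 16, 32, 48 };

    for (int i = 0; i < 4; ++i) {
        std::printf("%2d cores -> ~%.0f FPS (projected)\n",
                    coreCounts[i], fpsPerCore * coreCounts[i]);
    }
    return 0;
}
[/code]

That prints roughly 11, 22, 45, and 67 FPS for 8, 16, 32, and 48 cores, which is where the ~45 and "closer to 70" figures above come from.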
 
[citation][nom]turk_1000[/nom]Come on. Someone with an 8800GTS post some numbers.[/citation]
Uhhm, I have a GTX 260 Core 216 and a Pentium D 925 Presler (yeah, I know it's not great).
I got about 85-120 frames per second.
 
This is just Microsoft giving Intel a little help with its Larrabee intro.
 
I have an 8800 GTS 320 MB and an Athlon X2 6000+ overclocked to 3.15 GHz, with 3.5 GB of RAM, in Windows XP...

The framerates seem to be accurate... it hovers well over 100 but varies between 70-130 depending on the scene... but I don't know how to test for an average... all settings on low at 800x600.

 
I do not see why it is important to engineer a WARP system if it does not give people what we need: high FPS. I could understand the advantage if it used both the GPU and CPU simultaneously to improve framerate...

I do not see the marketplace for such a system at the moment.
 
[citation][nom]riff_1-1[/nom]I do not see why it is important to engineer a WARP system if it does not give people what we need: high FPS. I could understand the advantage if it used both the GPU and CPU simultaneously to improve framerate... I do not see the marketplace for such a system at the moment.[/citation]

Most people don't give a crap about getting 100 FPS in Crysis.
 
This has been a feature since 1999.

All Microsoft did was come up with a way to trick games into thinking you have the hardware required to run them (kind of like the DX10 trick for XP, which lets you use some DX10 features in Windows XP by tricking games into unlocking those higher options).
 
Also, you still need a GPU; software rendering is too slow, and everything will be laggy.

A CPU just doesn't have the same physical abilities as a GPU.

And a CPU can use more power than a video card if you max out the load.

P.S. The CPU can pretty much handle any kind of rendering, and can offer higher quality than the GPU; that's why render farms use thousands of CPU cores to handle a task instead of a few video cards (Nvidia has a render engine for programs like Maya that can turn a 1-week render job into a 2-3 hour job, but it can't offer the same quality as Mental Ray's software rendering).

That being said, the CPU has always had the ability to do any type of rendering; the problem is just getting games to think you have a video card when you really don't, and that's all Microsoft did.
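For what it's worth, there isn't really a "trick" at the API level: WARP is just another Direct3D driver type that an application (or the runtime, as a fallback) can select when creating a device. As a rough illustration only, and assuming the Direct3D 11 headers from the Windows 7 SDK rather than whatever setup the article's Crysis run actually used, requesting the software rasterizer explicitly looks roughly like this:

[code]
// Minimal sketch: create a Direct3D 11 device on the WARP software
// rasterizer instead of the GPU, so all rendering runs on the CPU.
// Link against d3d11.lib.
#include <d3d11.h>
#include <cstdio>

int main()
{
    ID3D11Device*        device  = NULL;
    ID3D11DeviceContext* context = NULL;
    D3D_FEATURE_LEVEL    level;

    HRESULT hr = D3D11CreateDevice(
        NULL,                   // default adapter
        D3D_DRIVER_TYPE_WARP,   // CPU rasterizer instead of D3D_DRIVER_TYPE_HARDWARE
        NULL,                   // no external software driver DLL
        0,                      // no creation flags
        NULL, 0,                // default feature levels
        D3D11_SDK_VERSION,
        &device, &level, &context);

    if (SUCCEEDED(hr)) {
        std::printf("WARP device created, feature level 0x%x\n", level);
        // From here on, the app's normal D3D code runs unchanged on the CPU.
        context->Release();
        device->Release();
    }
    return 0;
}
[/code]

Games written directly against older APIs don't get this for free, though; the application or the runtime has to pick the WARP device type.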

 
On the "bright" side, one could reasonably expect to play something like Half-Life (not 2) with an 8 core processor at around 1024x768 on whatever old graphics card they have lying around.

Wait though, now that I think about it, I did that with my PII-400 and a rage pro 128. Hmmmm, doesn't seem very cost effective to me.
 
This is there to run Aero, so Windows 7 looks better on old computers.
 
Maybe WARP was Intel's idea rather than Microsoft's. Perhaps this is the beginning of Larrabee support. If WARP scales across multiple x86 cores, maybe this is Intel's way of getting broad-based support for its new brainchild...
 