The Witcher 2 playable without dedicated graphics?


Zodeak

Hi all,

I have been in a discussion with someone who claims that The Witcher 2 will be playable on a certain laptop. The laptop has an i5 core (not sure what generation) but no dedicated graphics. It was bought about a month ago, so it's very recent (I presume) and may have the Intel HD 3000 graphics. I'm pretty sure that at native resolution (or any reasonable resolution) the game will be getting literally 5 fps on the absolute rock-bottom settings, which is not playable...

I know that if the computer didn't have the HD 3000 graphics it wouldn't be able to run the game at all, or at least not in a format that we would consider a game...


Any back-up here?
 
So, in other words, he's not right about the performance. That article would lead one to believe that on the "absolute rock bottom settings" and low resolution his friend might be able to get a decently playable framerate (20+).

I would say your friend could probably get it to work IF he has Intel HD Graphics 3000 on his laptop. If memory serves, the only desktop i5 that has that is the 2500K, though most of the mobile Sandy Bridge i5s do. If he only has the Intel HD Graphics 2000, it's going to be less likely.

I've gotten The Witcher 2 to run on my desktop quite smoothly. It's an i7-2600K (which has Intel HD Graphics 3000), but the integrated graphics are "overclocked" and I have 16GB of OC'ed (2100 MHz) RAM. 1.7GB of that RAM is shared with the integrated graphics.

This leads me to the next point: I think the other big factor is how much RAM his laptop has, since part of that is essentially going to serve as his VRAM. How much/what kind of RAM does the laptop have?
 
He has 4GB of DDR2, I think.

The fact that low resolutions and low settings would have to be used just to get a low fps means that performance IS low. That is the whole point about performance... If you're playing a game, you want the game to run, never mind run smoothly, just run at all. It gets to the point where it no longer resembles a game.

I also question what your idea of "quite smoothly" is. You say decently playable is 20+? I say that is pretty unplayable if you're averaging that.

Genuinely, I would be interested to know if the HD 3000 graphics can handle it at native resolution (1080p), but I have scoured the internet and found notebookcheck saying that it gets 13 fps on "rock-bottom" settings and actually doesn't play at all on any higher settings.

His CPU is an i5-2430M or something like that... which is quoted as having HD 3000 on the Intel website, I think...
 
I just looked up the laptop used to test The Witcher 2 on notebookcheck, and they failed to mention that there was a GT 540M hidden away, which is probably why it scores above single-digit fps...
 
At 1080p, no way. Most laptops would be a lower res like 1366x768, and in that case it should be at least playable with the HD 3000, which is roughly equivalent to a Radeon HD 5550.

Regardless, gaming laptops are grossly overpriced and underperform compared to their desktop counterparts.
 
My old XPS had a 1080p res... maybe even 1920x1200...

Anyway, I agree that laptops are very, very overpriced and that it seems a bit unfair, but then again, they do have to squeeze everything into a tiny casing and keep it all cool.
 
Considering most movies are shot at 24 fps and I don't consider them unviewable, yes, I believe a 20+ framerate is playable. In fact, that's what most people think. It's not the best, and it had better be pretty stable (because if it dips much it will suck), but your friend is considering trying to play on rock-bottom settings... clearly he isn't concerned with the best (or even just good) performance. I think most people view 30fps as the soft minimum, with 20fps being the hard minimum.
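
For what it's worth, here's the per-frame time those thresholds work out to (just arithmetic I'm adding for illustration, not benchmarks):

# Frame-time budget for a few framerates: 1000 ms divided by frames per second.
for fps in (20, 24, 30, 60):
    print(f"{fps:>2} fps -> {1000 / fps:.1f} ms per frame")
# 20 fps -> 50.0 ms per frame
# 24 fps -> 41.7 ms per frame
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame

So the gap between a 20fps hard minimum and a 30fps soft minimum is roughly 17 ms of extra time available for every single frame.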

Regardless, that CPU is 2.4GHz with a 3MB cache, so even with the HD 3000 graphics I doubt he'll be able to play with only 4GB of DDR2 RAM. Once again, though, you sound uncertain about these details, and without them, you're basically asking "Can my friend play The Witcher 2 on a laptop w/o dedicated graphics?" That's simply not enough detail, as The Witcher 2 at 40fps on my computer w/o dedicated graphics is "running smoothly." Once again, this is on rock-bottom settings - my graphics card arrives sometime this week.

The last thing I'll say is that several different reviews have found the HD 3000 to be equivalent to an entry-level graphics card in terms of performance. If any of the stuff you said two posts ago ends up being correct, though, he probably won't be able to play it.
 
That's just some commentary on how frames per second determine how sucked into the story of a movie you are (which is completely bogus, by the way). As if Twilight at a lower framerate would get me more involved in the movie. His whole argument also begs the question by assuming that the brain fills in the gaps - this is a philosophically open question.

I mean, how does that article even show that there's this sort of apples-and-oranges difference you profess? It doesn't even compare video games to movies. It just says that movies/TV shows shouldn't be shown at higher framerates because it impacts how absorbed in the story you become, which doesn't bear on this discussion.

Furthermore, let's go ahead and assume these game animations you show me are completely different in a way that does not allow for comparison to movies. Does the 15fps demonstration look completely unplayable to you? Sure, it's not great, it might even be a bit choppy like the guy says... but unplayable? Really? And that's 15fps... the threshold I was talking about is in the 20s, which would be anywhere from a 33-93% increase in framerate.

I lol'ed at that whole "By not showing enough visual information, we force the brain into filling in the gaps... it draws you in even more" line, where he stated No Country for Old Men would be like Days of Our Lives if you increased the framerate. Man, that was a hilarious article. I mean, that is some web designer's personal homepage you are citing there, with a quote by a cinematographer who is unqualified to answer such perceptual and neuroscientific questions. Just because he's a cinematographer doesn't mean he's an expert in how the visual process and the human mind work.
 
I don't really understand what you're wowing about. I pointed out that the article had nothing to do with what we were saying and was primarily concerned with making some inference from framerate to how involved in a movie you are. Again, LOL.

The 30fps looks choppy to you? Not to me. Sure, not as fluid as the 60fps, but that's not the same as saying it's choppy. Maybe I just have lower standards than everyone else, I dunno. I'm sure 60fps doesn't look as good as 120fps, but that doesn't make 60fps choppy.

I suppose the principal difference here is blurring. With movies, things don't get choppy or stutter even when the object is moving fast, because the object blurs in each frame. This means the object has more continuity, because there is a more fluid transition from one frame to the next. In the case of video games, this clearly does not happen - there are no blurred images, just sequences of highly detailed, everything-in-focus frames. This means that fast-moving things jump a greater distance from frame to frame and appear clear but choppy.

So yeah, that article does point out a good difference between movies and video games - motion blur. Motion blur makes 24fps look a lot more fluid at normal speeds, since clear and distinct objects aren't hopping large distances between frames (like they are in choppy video games). Does motion blur mean we're comparing apples to oranges, though? I think all this means is that you need more fps in a game to make sure it's not choppy, so just think of the fps as being on a different scale, which is pretty much what the article says - they claim 120fps would still not be enough. That might be true if the goal is to exceed the human eye's ability to notice choppiness in a side-by-side comparison (e.g. 3000fps vs. 5000fps), but that seems to be an extreme standard for choppiness, generally construed. I mean, if we take them seriously, we'd have to call a game running at 120fps choppy!
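
To make the blur point concrete, here's a rough toy sketch of the idea (my own made-up example, not anything from the article): a film-style frame averages the object's position over the time the shutter is open, while a game-style frame freezes it at a single point.

# Toy 1-D example: a fast-moving dot rendered sharp (game-style) vs.
# smeared over its exposure-time travel (film-style motion blur).
# All names and numbers here are made up for illustration.
import numpy as np

WIDTH = 100          # 1-D "screen" in pixels
SPEED = 300.0        # object speed, pixels per second
FPS = 24             # film framerate
SUBSAMPLES = 16      # positions sampled while the shutter is open

def game_frame(t):
    # No motion blur: the dot lights exactly one pixel.
    frame = np.zeros(WIDTH)
    frame[int(SPEED * t) % WIDTH] = 1.0
    return frame

def film_frame(t):
    # Motion blur: average the dot's position over the frame's exposure time.
    frame = np.zeros(WIDTH)
    exposure = 1.0 / FPS
    for i in range(SUBSAMPLES):
        x = int(SPEED * (t + exposure * i / SUBSAMPLES)) % WIDTH
        frame[x] += 1.0 / SUBSAMPLES
    return frame

t = 0.5
print("sharp frame lights", np.count_nonzero(game_frame(t)), "pixel(s)")        # 1
print("blurred frame smears over", np.count_nonzero(film_frame(t)), "pixel(s)") # ~12

The blurred version spreads the dot over the whole distance it travelled during that 1/24 of a second, which is why the jump to the next frame doesn't read as a discrete hop.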

I stated my original point badly. The main thing I was trying to point out is that a framerate in the 20s would not be unplayable. I slipped up and stated that point a little unclearly - that because 24fps in a movie isn't unviewable, a framerate in the 20s shouldn't be unplayable for a game (which implies I think they are the same). However, I stand by that statement, because they are not sufficiently different to call 24fps in a movie perfect and 20-29fps in a video game completely unplayable. So long as that's not the case, my point still stands. I don't think they'd be the exact same experience, as you correctly pointed out they would not be, but given that 18fps is sufficient for no choppiness whatsoever (and I'd still consider a movie "watchable" even if it was a bit choppy), I think that means a framerate in the 20s for a video game is "playable."

When I go and look at those little boxes bouncing in the first link, playing a game at 30fps seems very doable. Also take into consideration that those boxes are not actually displaying all of their frames within a second - they get desynchronized after a little bit. If the 30fps box were really running 30 frames per second, it would look a bit smoother and would move just as far as the 15fps box.

Sorry if I offended you or anything, I just thought that first link was terrible, and some of the claims the dude made were ridiculous.
 
I guess all we've proved in this discussion is that frames per second seem subjective to the viewer and that we all derive pleasure from different qualities, i.e. some have higher "standards" than others.

Also, to address your third paragraph, zabuzaxsta: the fact that TV or films aren't choppy is because they are very, very consistent and set to a fixed frame rate. They cannot increase or decrease the framerate; it stays the same. Human eyes are very precise and can pick up the slightest difference, so it is probably the contrast between frame rates in games - where some areas are easier to generate/process (resulting in a higher frame rate) - that makes it feel choppy. If a game were stuck at one frame rate somehow (which would be very hard...), then a "low" frame rate would seem smoother. But 10 fps is still 10 fps. So, ten refreshes every second. A lot slower than what I would consider playable. Others could probably stand it, but I personally couldn't.
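
That consistency point is easy to see with a quick toy comparison (my own made-up numbers, nothing measured): two runs with the same average fps, one paced evenly like film, one swinging between easy and hard scenes the way a game does.

# Same average framerate, very different worst case.
steady_ms = [40.0] * 25                  # every frame takes 40 ms (a locked 25 fps)
swinging_ms = ([20.0, 60.0] * 13)[:25]   # alternating fast/slow frames, ~25 fps average

def describe(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    worst = max(frame_times_ms)
    return f"avg {avg_fps:.1f} fps, worst frame {worst:.0f} ms"

print("steady  :", describe(steady_ms))     # avg 25.0 fps, worst frame 40 ms
print("swinging:", describe(swinging_ms))   # avg 25.5 fps, worst frame 60 ms

Both average around 25 fps, but the swinging run keeps throwing in 60 ms frames, and those hitches are what register as choppiness.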
 
You're right that fps is a relative matter in the way you see it. The first time I started playing games on a PC it was with an HD 4550, and that was plenty for me; in fact the games ran smoothly but were at 25-30 fps. Then I changed to an HD 5670 and the difference was notable. I got used to the 35-45 fps range, and below that everything is choppy to me. The same goes for AA: at first I didn't notice anything at all, but since I started using AA it's impossible for me to play without it now. 😛
 