CD Projekt Releases Hardware Requirements For The Witcher 3

Status
Not open for further replies.

aztec_scribe

Distinguished
Jan 22, 2010
115
0
18,710
I have a feeling this is their way of hyping our expectations. I only just bought an R9 290, so I'll be pretty pissed if it doesn't at least hit very high settings, with a few ultra settings, at 1080p.
 

Avus

Distinguished
Nov 2, 2001
355
0
18,780
Many people (especially those who sell computer hardware) still look down on the Intel i3 just because it is a "fake" quad core. And some will say sh!t like "you need an i7 to push a graphics card like a GTX 970..."
 

Justin C

Honorable
Jul 27, 2013
8
0
10,510
@Quixit

Consoles have next to zero overhead and very light APIs, which means drastically fewer wasted cycles. That has always been how lower-powered console CPUs achieve what they do.

Let's also not forget that the shared memory in the PS4, for example, is GDDR5, which is quite a bit faster than your PC's standard memory. GDDR5's bandwidth is head and shoulders above DDR3/DDR4.
 
@Quixit (and Justin C),

As Justin C said, consoles are more efficient.

I just wanted to add that the "recommended" specs probably target HIGHER than 30 FPS, whereas the consoles will be locked to 30 FPS.
 

Avus

Distinguished
Nov 2, 2001
355
0
18,780


I actually think it's the other way around. My conspiracy mind says Intel wants game developers to publish higher Intel CPU requirements to "trick" people into thinking they need to upgrade their PC (especially the CPU). Most games (if not all) are GPU-dependent nowadays, and PC CPU advancement has been SLOW since Intel's dominance began. I am still using the Intel i7-920 I bought in 2009 for my gaming system. To this day, all I have needed is one RAM upgrade (6GB to 12GB) and a couple of GPU upgrades (GTX 275 to GTX 660 to GTX 970). My "gaming" system can still play all the latest games at 1920x1200 on either high or ultra detail with good enough FPS. Of course my games would probably run even better on a brand new Intel i5 or i7 (or even i3), but to me CPU upgrades have had a "diminishing returns" vibe for the last 5 years. Even a simple and relatively cheap SSD upgrade gives a much bigger "power upgrade" feel than a new CPU. There is a reason many people are still gaming on older CPUs like the Core 2 Quad 9xxx or its AMD equivalent.

This is not the first time I have seen a game with an "inflated" Intel CPU requirement (see Assassin's Creed Unity). Either Intel is behind it, or the game developers just suck at PC hardware....
 

slyu9213

Honorable
Nov 30, 2012
1,054
0
11,660


If anything, I feel the CPU has become more important in games nowadays. If you have a 2nd-gen i5/i7 or better, that is basically all you need in the CPU department, excluding a few games where a faster chip gains a measly 1-3 FPS. On the other hand, the older AMD processors are lagging behind if you are looking for smooth FPS with a higher minimum FPS. The deal is that games are more optimized for how Intel CPUs are designed: strong single-threaded IPC and FPU performance, compared to AMD's additional cores/threads at a lower price. I honestly think the game will look great even on low settings (minus the lowest texture setting). I'm praying the high requirements aren't there for the same reasons as other developers'.
 

qubits

Reputable
Jan 6, 2015
366
0
4,810
It's not just the CPU requirement that is funky, but also the GPU: GTX 770 vs. R9 290.

How is it that these console games recommend a stronger AMD GPU than NVIDIA GPU when the consoles themselves use GCN?
 

andre_888

Distinguished
Mar 23, 2010
33
0
18,540
The real requirements are probably much lower than advertised.
I was doubtful that my A10-5800K with its 7660D GPU would be adequate for Dragon Age: Inquisition, but it was.
My monitor is 1600x900 and I play on medium settings. The game has no lag and is fully playable.
 