juanrga :
No problem with pushing higher resolutions. The problem is when you advertise new cards at resolutions used by 0.03% of gamers, or at resolutions that nobody uses, because the new cards are not competitive enough at the mainstream resolutions gamers actually play at.
There are some analysts predicting that the 300 series won't be the game changer AMD needs to turn its finances around, because 4K gaming represents a niche market inside a niche market, whereas Nvidia will continue to serve the needs of the remaining 99.97% of gamers well.
http://www.fool.com/investing/general/2015/02/02/will-advanced-micro-devices-amd-bet-on-super-fast.aspx
3D Now! is an extension to x86. You can support it or not; it is optional. But if you claim to support AMD64, then you have to support everything that was included in the ISA by AMD's engineers. You cannot implement some instructions and reject others depending on how smart your engineers are or how popular the software is. Either you build an x86-64 processor or you don't.
Every new technology has something called an "adoption rate". I'm not saying 4K is like the second coming of Jesus or anything, but you do have to understand that "yeah, this is new, but no one is using it!" is a *very* bad argument when discussing the future.
If AMD has an edge on a given future technology, give them that at least. You're just taking the cake away 'cause you're angry at AMD for something.
In any case, "3D Now!" is an extension... Well, sorry for playing Wikipedia here, but...
"Software written to use AMD's 3DNow instead of the slower x87 FPU could execute up to 4x faster, depending on the instruction-mix."
Why support older stuff when the new stuff you support offers such gains? Same went for MMX and SSE1. Point is, AMD was *forced* to remove it when they noticed how many programs kept using older x87, or SSE and MMX, instead of it. AMD never wanted to increase their x87 performance, and I bet Intel didn't either. In the context of Skyrim, there has to be something else at play. Maybe compiler flags?
jdwii :
De5, now come on, we both know that is not the argument; juan simply stated 8GB would beat 4GB at 8K, a res no one uses. It's not like either had playable performance anyway.
Let's not pretend to know what fallacious arguments are.
When 1080p was being introduced, I remember the 9800 PRO and GF 5900FX having crappy performance at it. That lasted until the 8000 gen from nVidia and the HD4k from ATI/AMD were introduced; those had acceptable performance at it.
Again, point is: it's dumb to even say that, because it has always played out the same way with technology (VGAs in particular). New resolution -> current gen barely manages -> companies start optimizing -> decent VGAs arrive and we're all happy.
Let me rephrase it: AMD tries hard to move forward, whereas nVidia is happy with 1080p because it's outselling AMD hard. You need to put each company's shenanigans into context here. You think physics was a hit and cool before nVidia bought PhysX? You think 3DFX imagined 32-bit color depth was important when the GF2 had HW support for it? AMD is just doing what everyone does when they see an edge: hype it. At least when nVidia pushed for 32 bits, the industry liked it and people accepted it quickly. If you ask me about resolutions, I really don't know, but at least it's a steady movement toward higher res all the time, and history proves me right.
In any case, you're right, 4K today is unplayable on single-card solutions unless you drop detail or trade something else away. But we're discussing trends and the future, right? That's why the 390X will be important. If it enables decent 4K gaming, you'll be ready to make the switch. Hell, I love bigger screens not only because of games, but because of the screen real estate.
Cheers!
EDIT: Typos & I forgot the link
http://en.wikipedia.org/wiki/3DNow!