Don't Call It 'Godavari'; AMD Updates Kaveri APUs With DX12, FreeSync And VSR Support

Please stop saying "The power of the integrated GPU can then be harnessed and used as an advantage when paired with another GPU."

That's a feature DX12 supports, but it must be implemented by the game developer, which takes extra time (and thus money).

We have no idea how common this will be, nor what expectations we should have for support, but suggesting it will just naturally occur once we upgrade to Windows 10 is very, very misleading.
 

Puiucs

Honorable
Jan 17, 2014
66
0
10,630
Please stop saying "The power of the integrated GPU can then be harnessed and used as an advantage when paired with another GPU."

That's a feature DX12 supports, but it must be implemented by the game developer, which takes extra time (and thus money).

We have no idea how common this will be, nor what expectations we should have for support, but suggesting it will just naturally occur once we upgrade to Windows 10 is very, very misleading.
I don't think game devs will have the time to implement this themselves, but it's almost certain that game engines will implement it sometime soon (Unreal Engine, Unity, CryEngine, etc.). I expect it to be a default feature in a year or two for the big-name engines.
Having DX12 features already implemented in the engine used for the game brings dev costs down.
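For what it's worth, the plumbing an engine needs just to *see* both GPUs is straightforward: DXGI exposes the iGPU and the dGPU as separate adapters, and each can be probed for D3D12 support. Here is a minimal sketch using standard DXGI/D3D12 calls; the actual scheduling of work across adapters, which is the hard per-game part, is deliberately not shown:

```cpp
// Sketch: enumerate every adapter (iGPU, dGPU, ...) that could host a D3D12 device.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip the WARP software adapter

        // Passing nullptr asks "could a device be created here?" without creating one.
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        __uuidof(ID3D12Device), nullptr))) {
            wprintf(L"Adapter %u: %s (%zu MB VRAM)\n", i, desc.Description,
                    desc.DedicatedVideoMemory >> 20);
        }
    }
    return 0;
}
```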
 


My beef is that we really have NO IDEA how easy or how common this will actually be, yet I'm constantly seeing comments that imply it will simply happen, like it's a given. The same goes for similar comments about mixing AMD and NVidia GPUs or stacking VRAM.

Even if it looks like a "simple" feature to implement, it still adds to both development time (to troubleshoot GPU combos) and technical support after release.

Sure, I'd like to see it used, but again, that's not my complaint.
 

silverblue

Distinguished
Jul 22, 2009
1,199
4
19,285
"It’s not that often that you see a product in the computer hardware market last as long as Kaveri has."

Piledriver certainly has that beat, though you did say "not that often".
 

Creme

Reputable
Aug 4, 2014
360
0
4,860
It's nice that they're bringing features like VSR to their lower-end models. The iGPU can run some games at 1080p or 900p, and if you own a 768p monitor, then downsampling with these chips is viable.

They advertise DX12; I wonder if the feature level is different from other Radeons.
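That question is answerable in code, at least for hardware you own: once a D3D12 device exists, the runtime reports the highest feature level the driver supports. A minimal sketch using the standard CheckFeatureSupport query on whatever the default adapter happens to be:

```cpp
// Sketch: create a D3D12 device at the 11_0 minimum, then ask the driver
// for the highest feature level it actually supports.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) return 1;

    D3D_FEATURE_LEVEL requested[] = { D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
                                      D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1 };
    D3D12_FEATURE_DATA_FEATURE_LEVELS info = {};
    info.NumFeatureLevels = _countof(requested);
    info.pFeatureLevelsRequested = requested;
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &info, sizeof(info)))) {
        printf("Max supported feature level: 0x%x\n", info.MaxSupportedFeatureLevel);
    }
    return 0;
}
```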
 

burnneck

Distinguished
Sep 15, 2011
9
0
18,510


My beef is that we really have NO IDEA how easy or how common this will actually be, yet I'm constantly seeing comments that imply it will simply happen, like it's a given. The same goes for similar comments about mixing AMD and NVidia GPUs or stacking VRAM.

Even if it looks like a "simple" feature to implement, it still adds to both development time (to troubleshoot GPU combos) and technical support after release.

Sure, I'd like to see it used, but again, that's not my complaint.

If only AMD could invest in its own game publishing company to support its innovative solutions. It would be easier for developers to invest time programming for those solutions, as there would be a test case for them to follow.
 

red77star

Honorable
Oct 16, 2013
230
0
10,680
It's all a marketing gimmick, especially the talk about DX12. At the end of the day you still need a decent video card to push frames; your games won't magically work better because of DX12, which seems to be the only selling point Microsoft has for Windows 10.
 

xenol

Distinguished
Jun 18, 2008
216
0
18,680


My beef is that we really have NO IDEA how easy or how common this will actually be, yet I'm constantly seeing comments that imply it will simply happen, like it's a given. The same goes for similar comments about mixing AMD and NVidia GPUs or stacking VRAM.

Even if it looks like a "simple" feature to implement, it still adds to both development time (to troubleshoot GPU combos) and technical support after release.

Sure, I'd like to see it used, but again, that's not my complaint.
This was something that was annoying me when people were saying AMD's FreeSync would win because it was part of the DisplayPort spec. While it is, it's an optional part of the spec, meaning most display manufacturers are probably just going to reserve it for "gaming models" of their monitors and charge a premium for it (not to mention FreeSync isn't exactly free, as in beer, to implement and test).

Also, I predict one of two things will happen when DX12 ships regarding requirements: either the CPU requirement will drop for the same amount of graphical fidelity, or it'll remain the same for improved quality. And if you're capping your GPU (i.e., the game is truly hitting 100%), DX12 won't do anything to help that.
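That last point is just arithmetic: a frame can't finish faster than its slowest stage, so cutting CPU submission cost only matters while the CPU is the bottleneck. A toy model with made-up timings:

```cpp
#include <algorithm>
#include <cstdio>

// Toy model: a frame can't finish faster than its slowest stage.
double frame_ms(double cpu_ms, double gpu_ms) { return std::max(cpu_ms, gpu_ms); }

int main() {
    // Hypothetical timings. Halving CPU cost helps only while the CPU is the gate:
    printf("CPU-bound: %.0f ms -> %.0f ms per frame\n",
           frame_ms(20, 12), frame_ms(10, 12));   // 20 -> 12: real win
    printf("GPU-bound: %.0f ms -> %.0f ms per frame\n",
           frame_ms(8, 16), frame_ms(4, 16));     // 16 -> 16: no change
    return 0;
}
```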
 

Blueberries

Reputable
Dec 3, 2014
572
0
5,060
Looks like AMD made impressive changes to their APU, which could say something about where low-wattage, affordable video-game boxes and HTPCs are headed.

I'm not going to buy one, but cool.
 

xenol

Distinguished
Jun 18, 2008
216
0
18,680
I'm going to rant a bit, and you can downvote me all you like.

The Direct3D 12 test is PR manipulation. They claim their APU pushes out 3.2 times as many draw calls in DX12, versus the Core i5-4570R pushing out 1.69 times as many. All this tells me is how much more the APU gains from DX12; it doesn't tell me the absolute performance of either part. For all I know, the APU is pushing out 160K calls on DX12 versus 50K, and the i5-4570R is pushing out 338K calls on DX12 versus 200K. Plus, if I recall correctly, 3DMark's test pushes out draw calls until the frame rate drops to a minimum target (30 FPS? 60 FPS? I forget).

The second issue is that there's a lot of "if developers develop correctly..." This worries me because the APU+GPU environment appears to be solely AMD's playing field, which basically tells me that if I don't have an AMD APU and an AMD GPU, I'm screwed. How is this any different from NVIDIA shoving GameWorks onto developers? Intel and NVIDIA aren't going to create a similar ecosystem, and good luck getting a generic standard, especially since Intel wants to do their own thing. So if a game is developed with asymmetric rendering and HSA in mind, I'm hosed if I don't have a pure AMD system. And this is something NVIDIA and Intel CANNOT fix, whereas with GameWorks, at least AMD can optimize based on the framework's outputs, however crappy a job that is.
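To make the ratio-versus-absolute point concrete, here's the arithmetic with my hypothetical baselines plugged in (the baselines are invented purely for illustration; AMD's slide publishes only the multipliers):

```cpp
#include <cstdio>

int main() {
    // Invented baselines -- AMD's slide publishes only the multipliers.
    double apu_dx11 = 50e3,  i5_dx11 = 200e3;
    double apu_dx12 = apu_dx11 * 3.20;  // 160K calls
    double i5_dx12  = i5_dx11 * 1.69;   // 338K calls
    printf("APU:      %6.0fK -> %6.0fK (3.20x)\n", apu_dx11 / 1e3, apu_dx12 / 1e3);
    printf("i5-4570R: %6.0fK -> %6.0fK (1.69x)\n", i5_dx11 / 1e3, i5_dx12 / 1e3);
    // The part with the bigger multiplier still loses in absolute throughput.
    return 0;
}
```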
 

Math Geek

Titan
Ambassador
Building a budget system with one of these, knowing the iGPU won't go to waste later on, is a big plus for the A10 in my book. Obviously, Intel CPUs still take the win for overall performance, but if the budget only allows for the lower end to start with, this is a very compelling way to go. An HTPC pretty much can't go wrong including one of these in the build.
 

none12345

Distinguished
Apr 27, 2013
431
2
18,785
"They claim their APU pushes out 3.2 times as many draw calls in DX12 versus the Core i5-4570R pushing out 1.69 times. All this tells me is how much more the APU gains in DX12. It doesn't tell me the absolute performance of either part. For all I know, the APU is pushing out 320K calls on DX12 versus 100K and the i5-4570 is pushing out 338K calls on DX12 versus 200K draw calls. Plus if I recall correctly, 3DMark's test pushes out draw calls until the frame rate lowers to a minimum target (30FPS? 60FPS? I forget)."

If you want to know absolutes, there are plenty of sites that have done those benchmarks. It's more like the AMD iGPU putting out between 2-5 times (number remembered off the top of my head) as many draw calls as the Intel iGPU in DX12. However, that doesn't mean FPS will increase by that factor. Reducing draw-call overhead will help a lot, especially going forward, but you'll probably see single-digit growth in FPS, all other things being equal. Maybe a 5-15% increase from the draw-call overhead change, though certain cases will be much higher than that.

With all the Intel iGPU chips out there, I think it's very likely all the main game engines will cross-load the iGPU as well as the GPU. We'll see it in all the major games. DX12 doesn't care what brand the hardware is; it presents a compute unit to the system for use. I'm definitely not familiar with the guts of DX12, but using an NVIDIA and an AMD graphics card together should be no harder than using an Intel iGPU + NVIDIA card, which should be no harder than using an AMD iGPU + AMD GPU + NVIDIA GPU. Or any other combo; they are all just compute units.

That doesn't mean you can just add up the performance of everything and expect linear growth. Don't expect them to be able to extract 30 FPS (GPU1) + 25 FPS (GPU2, different brand) + 10 FPS (iGPU) = 65 FPS (+116% of GPU1). I would expect to see something more like 30 FPS (GPU1) + 25 FPS (GPU2, different brand) + 10 FPS (iGPU) = 45 FPS (+50%), and as long as it doesn't add any stuttering, I'm totally fine with that.
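A back-of-the-envelope way to model that expectation: count the primary GPU in full and discount the helpers by an assumed efficiency factor covering scheduling and copy overhead. Both the factor and the fps figures below are made up to match the example above:

```cpp
#include <cstdio>

// Primary GPU counts in full; helper GPUs are discounted by an assumed
// efficiency factor covering scheduling and cross-adapter copy overhead.
double combined_fps(const double fps[], int n, double helper_efficiency) {
    double helpers = 0;
    for (int i = 1; i < n; ++i) helpers += fps[i];
    return fps[0] + helpers * helper_efficiency;
}

int main() {
    double gpus[] = { 30, 25, 10 };  // GPU1, GPU2 (other brand), iGPU
    printf("Ideal linear scaling: %.0f fps\n", combined_fps(gpus, 3, 1.0));   // 65
    printf("Helpers at ~43%%:      %.0f fps\n", combined_fps(gpus, 3, 0.43)); // ~45
    return 0;
}
```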
 

f-14

Distinguished
Please stop saying "The power of the integrated GPU can then be harnessed and used as an advantage when paired with another GPU."

That's a feature DX12 supports, but it must be implemented by the game developer, which takes extra time (and thus money).

We have no idea how common this will be, nor what expectations we should have for support, but suggesting it will just naturally occur once we upgrade to Windows 10 is very, very misleading.

I have to agree. To those that disagree, I have to point out that AMD introduced 64-bit architecture and it wasn't readily adopted; it's still not fully implemented to this day, as most games are still being made in 32-bit. The facts are what they are; fanboi bias can't change that.

APUs are low-power options for mobile devices, yet AMD purposely throws in an R9 290 for the pairing. I do get the point of its use in that particular game, but everyone knows that if you're going to run an R9 290, you're going to have a 500W PSU, and no laptop is going to have that much juice. That leads me to conclude this is just for show; no one would be dumb enough to pair a low-power APU designed for mobile with a high-power, top-end GPU designed for a power user who will almost always run a high-power, top-end CPU as well. If they're going to show glitzy flash to try and bedazzle, then also show AMD's flagship CPU. If they don't want to show Intel's flagship CPU, that's to be expected; being AMD, you have to PUSH AMD, so advertising all-AMD is required, just like all other companies have done for, what, centuries now?
Seriously, they might as well throw in an NVIDIA Titan for comparison. No one buys a $1,000 GPU and then sticks it with an Intel Atom; even Tom's knows better than to do a review like that. Might as well pair a Tegra with an R9 290 to show how dumb you are.
 

Johnpombrio

Distinguished
Nov 20, 2006
252
73
18,870
I swear AMD reused the slide showing how Mantle was going to accelerate their GPUs/APUs for DirectX 12. In the meantime, it is STILL the same old CPU architecture as the past year and a half, with only a slight bump in the internal GPU clock speed, and the internal GPU is targeted at sub-1080p monitors. If Intel can get their internal GPU to within 10% of AMD's specs, AMD is going to have to come up with a much better CPU or get out of the mid- and upper-range desktop business.
 

PaulBags

Distinguished
Mar 14, 2015
199
0
18,680


But where's DX12 for older Intel chips?
 

Puiucs

Honorable
Jan 17, 2014
66
0
10,630


Even so, for a long-term investment, going for the simpler-to-implement Adaptive-Sync that comes with DP 1.2a/1.3 is a much safer bet.
The first implementation of FreeSync has plenty of problems they have to fix, but in a year or two we should see it on par with, or better than, current-gen G-Sync. The published specs of the technology should allow for this to happen. We just have to hope that the hype surrounding Adaptive-Sync doesn't fizzle out and that OEMs continue releasing improved panels that take advantage of it.
FreeSync's biggest problem right now is the low-quality panels used in the displays. Very few have good wiggle room in terms of min/max refresh rates. The rest of the problems are a combination of software and hardware bugs.

I really want to compare current-gen FreeSync with the new monitors released in 2016.
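The min/max range matters because of how drivers cope with frame rates below the panel's floor: the usual trick (low-framerate compensation) is to repeat frames, which only works when the maximum is at least roughly double the minimum. A sketch of the idea follows; this is an illustration of the concept with invented numbers, not AMD's actual driver logic:

```cpp
#include <cstdio>

// Low-framerate compensation sketch: when the game drops below the panel's
// minimum VRR refresh, repeat each frame so the effective refresh stays in
// range. This only works if the window is wide enough to double into.
double effective_hz(double game_fps, double vrr_min, double vrr_max) {
    double hz = game_fps;
    while (hz < vrr_min && hz * 2 <= vrr_max) hz *= 2;
    return hz;
}

int main() {
    // Narrow 48-75 Hz window (typical of early FreeSync panels): 40 fps is
    // stuck below range, because doubling to 80 Hz would overshoot the max.
    printf("48-75 Hz panel, 40 fps -> %.0f Hz\n", effective_hz(40, 48, 75));
    // Wide 30-144 Hz window: 25 fps can be doubled to 50 Hz, back in range.
    printf("30-144 Hz panel, 25 fps -> %.0f Hz\n", effective_hz(25, 30, 144));
    return 0;
}
```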
 