Ask Me Anything - Official AMD Radeon Representative


AnUnusedUsername

Distinguished
Sep 14, 2010
235
0
18,710
So far, most (if not all) laptops using AMD APUs have been fairly low-end: heavier, larger systems aimed at low cost rather than at being the best product possible. Is this a trend encouraged by AMD, or by the laptop manufacturers (who are likely influenced by Intel subsidizing laptops that meet its "Ultrabook" standards)?

AMD's integrated graphics are notably more powerful than Intel's. Personally, I'd much rather have an ultrabook running an AMD processor. With integrated graphics, the GPU is always the bottleneck for gaming. So, aside from Intel's marketing, wouldn't the best ultrabooks for light gaming be those running AMD APUs? And yet, no one (or nearly no one) makes the equivalent of an ultrabook with an AMD processor.
 

TripleBullet

Honorable
May 30, 2013
241
0
10,760


This. My 8350 is still kicking ass in pretty much all games, but it isn't going to for much longer. This is my main question. The APUs are awesome for what they are, but I am going to Intel within a year, unless of course AMD releases some awesome CPUs for me to check out first. So, what is the expected release date of AMD CPUs that will best the FX series?

 

hjj174

Honorable
Feb 8, 2014
315
0
10,960

He is under an NDA not to talk about upcoming products, and he isn't an employee in the area of AMD that would know a great amount of information about their CPUs. Besides, the staff here have already said not to ask questions about upcoming CPUs, and that they are having another 'Ask Me Anything' with a CPU specialist from AMD in January.
 

TripleBullet

Honorable
May 30, 2013
241
0
10,760


Ahh, my bad, didn't really go through the thread but thanks man.
 

cmi86

Distinguished
They are so tight-lipped on the NDA he couldn't even tell me if we can expect lower TDPs on the next-gen GPUs. Makes sense, though: speculation turns into rumor, and it just snowballs from there.
 

Daniel Sauvageau

Honorable
Aug 12, 2014
313
0
10,780

How many people would buy an ultrabook with gaming as a primary concern? The main reasons I might ever be interested in an ultrabook are portability and long operational battery life. Most of Intel's chips for this market segment are outrageously overpriced though.

With some luck, AMD's 20nm shrink might put them on the map there.
 
Sorry, I was reminded to read this thread's rules and see now that this is not an 'ask me anything' thread after all. What? Another AMD misrepresentation, I guess?


Can I ask this?

With the APU chips, why did AMD go with the cores they chose and not their so-called higher-end FX line? Why not take something like an FX-8350 and incorporate the GPU into it? Why fall back to something like the Athlon? For a guy who has had AM3+ since day one, has played out all the upgrades, and can't see keeping a four-year-old platform any longer, what do you have to offer? Just rebuy today what I had four years ago, or go with a 220 W blast-heater chip? Maybe the console market is AMD's best bet to survive?
 

Ogrodnik

Reputable
Dec 10, 2014
4
0
4,510
Will Mantle, in the near future, help with better utilization of FX CPU threads/cores? I have an FX-6300 @ 4.5 GHz and I haven't seen a game that could use more than ~40-50% of it. More often I see about 20-30% total CPU usage during gaming. That is a lot of potential wasted. Is Mantle about to fix anything in that regard?
 

bit_user

Polypheme
Ambassador
Actually, I did think of a FreeSync-related question.

Will FreeSync apply to video playback? When playing a 24 fps or 25 fps video clip in full-screen mode, will the monitor show the native framerate (i.e., without judder)? Would this happen automatically, or would video playback programs need to make special API calls or use techniques such as Direct3D surfaces to benefit from FreeSync?
 

bit_user

Polypheme
Ambassador
Any industry rep is going to have limits on what they know and what they're allowed to discuss. Please be realistic. If they were going to make any big announcements, it wouldn't be in a forum like this.

I'm going to speculate that it has a lot to do with die size and TDP. On a die with fewer cores, there's more room for the GPU (and power budget as well).

But here's the thing about APUs: memory bandwidth is always a bottleneck for APU graphics. And more cores will compete with the GPU for that bandwidth. So, why would you want a low-end GPU paired with a high-end CPU, especially when using the GPU would only hurt CPU performance and still not deliver compelling GPU performance?
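To put rough numbers on that bandwidth gap (back-of-the-envelope math from public memory specs; the DDR3-2133 configuration and the R9 270 are just examples I picked, not anything AMD-specific):

#include <cstdio>

int main() {
    // Dual-channel DDR3-2133 feeding an APU: 2 channels x 8 bytes x 2133 MT/s,
    // and the CPU cores and the integrated GPU share every byte of it.
    double apu_shared_gbs = 2 * 8 * 2133e6 / 1e9;   // ~34 GB/s total

    // A mid-range discrete card (R9 270: 256-bit GDDR5 at 5.6 GT/s effective)
    // gets its own dedicated pool that the CPU never touches.
    double dgpu_gbs = (256 / 8) * 5.6e9 / 1e9;      // ~179 GB/s for the GPU alone

    printf("APU (shared): ~%.0f GB/s, discrete GPU: ~%.0f GB/s\n",
           apu_shared_gbs, dgpu_gbs);
    return 0;
}

So even before the CPU cores take their cut, an APU's GPU is working with a fraction of the bandwidth a cheap discrete card gets to itself.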

The main argument I see for what Intel has done by including GPUs in their entire desktop CPU lineup is simply to cater to users who care about CPU performance but not graphics. So, it's a cost-saving measure more than a feature for enthusiasts (well, aside from Quicksync).

As for your upgrade, why not wait and see what CPUs they intro at 20 nm? Right now, they're limited by the 32 nm process.
 

Mac266

Honorable
Mar 12, 2014
965
0
11,160
Hey mate, got a couple for ya.

1. HBM. How does this affect overall performance, and can you share any results from testing?

2. Graphics Architecture. GCN has been around for a while; are you allowed to comment on whether a new architecture is in the pipeline?

3. Efficiency. No offense, but it's known that Radeon GPUs are not the most efficient. Is this high on the list in terms of importance?

Thanks for your time and products.

Cap
 
What broke it off for me was that I was using Vista and AMD decided to drop Vista support and leave me hanging. Now I can't trust them to support me.
I agree that it is annoying, but you can hack the driver to make it work on Vista. 14.12 was much more involved than 14.4/14.9, but I was able to do it.
I will post details on the hack in a few days when I have a little more time.
 

iron8orn

Admirable
Hi Robert!

Looking for some clarification on DirectX, such as the capabilities of the GTX 970 vs. the R9 290X.

People are saying that the GTX 970 has full DX12 support while the R9 290X has only partial support, so what I am really asking is: will a DX12 game run on a DX11 GPU and still get the same benefits, such as CPU threading?
 


One of AMD's major arguments for consumers to use FreeSync over G-Sync is cost (due to licensing).

If monitor manufacturers have not made installing scalers with Adaptive-Sync technology a standard production practice, wouldn't that place a premium on the pricing of FreeSync-capable monitors?

Or will the three companies producing scalers (Realtek, MStar, Novatek) be implementing the Adaptive-Sync standard on all scalers they ship to the monitor manufacturers?

Should we expect Adaptive-Sync to be a standard feature of all monitors with a DisplayPort (DP) connection beginning next year?

If the scaler companies will already be implementing Adaptive-Sync for DP as a standard part of their production process, that would also answer my supply/demand question from earlier.
 
My next question is regarding Microsoft's upcoming DirectX and GCN. Among the new features in DX12 are:

Rasterizer Ordered Views
Typed UAV Loads
Volume Tiled Resources
Conservative Rasterization

Do these new features require new hardware or not? If not, do they work with all GCN-based hardware? Also, from a GPU manufacturer's point of view, what is the difference between DX12 and DX11.3?
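For reference, here is a sketch of how a developer could query those four features on whatever hardware DX12 ends up running on (my own code against the public D3D12 headers in the Windows 10 SDK, nothing AMD-confirmed; the field names come from D3D12_FEATURE_DATA_D3D12_OPTIONS):

#include <cstdio>
#include <d3d12.h>

// Given an already-created device, report which of the four features above
// the GPU actually exposes. A device can offer the DX12 API yet report
// "no" for any of these, which is exactly the hardware-vs-API distinction
// I'm asking about.
void PrintOptionalFeatures(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &opts, sizeof(opts))))
        return;

    printf("Rasterizer Ordered Views: %s\n", opts.ROVsSupported ? "yes" : "no");
    printf("Typed UAV Loads (extra formats): %s\n",
           opts.TypedUAVLoadAdditionalFormats ? "yes" : "no");
    printf("Tiled Resources tier: %d (volume tiled resources need tier 3)\n",
           (int)opts.TiledResourcesTier);
    printf("Conservative Rasterization tier: %d\n",
           (int)opts.ConservativeRasterizationTier);
}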
 

setx

Distinguished
Dec 10, 2014
225
149
18,760
Developer here. First, a question about FreeSync:
1) I saw a notice that some cards are going to support VFR "for video only". How is that achieved, since modern video drawing is done through the same 3D OpenGL/D3D APIs? Where/when can I find FreeSync documentation for developers?

General question for the AMD graphics division:
2) I hope you know that one of the biggest complaints about AMD solutions is buggy drivers. Do you have plans to be more developer-friendly in general, or do you only care about a few hyped games? I don't see any maintained place for reporting specific API bugs (only "Check games that have problems"): I tried to submit reports via the 'AMD Issue Reporting Form' and to write on devgurus.amd.com ('Graphics Programming' section) – absolutely no reaction. (Only in the 'OpenCL' section of devgurus did I get a "we will test it" response from AMD.)
 

Daniel Sauvageau

Honorable
Aug 12, 2014
313
0
10,780

In the post you quoted, AMD said scaler manufacturers can enable adaptive sync on many of their existing scalers with nothing more than a firmware update so the premium should be close to nonexistent.

Unfortunately for end users, though, not many displays have a user-friendly mechanism for applying such updates, which means we will likely see tons of displays based on exactly the same hardware re-released as Adaptive-Sync-capable under different model numbers just because of that.
 

Thracks

Honorable
Nov 1, 2013
101
0
10,680


The cores that are in the APU are the same CPU cores you'll find in an FX chip. Exactly what you want. :) We do have one chip called Athlon, but it's a very low-power chip suitable for HTPCs, thin notebooks, mITX office PCs, etc. We're proud of the Athlon brand, and we think the value offered by the new Athlons is worthy of that name.
 

Thracks

Honorable
Nov 1, 2013
101
0
10,680


The GCN architecture, and all of our products based on it, are compatible with DX12. You're right that reduced CPU overhead and superior CPU multi-threading are the most vitally important aspects of these new-generation APIs, and the R9 290X will certainly support that.
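To illustrate how that works in practice (a minimal sketch against the public D3D12 API as documented in the Windows 10 SDK, not official AMD sample code): a DX12 application can create its device on DX11-class hardware simply by requesting feature level 11_0, which is why a GCN card like the R9 290X still gets the new low-overhead submission model.

#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (factory->EnumAdapters1(0, &adapter) != S_OK) return 1;  // first GPU, e.g. an R9 290X

    // Requesting feature level 11_0 lets the DX12 runtime create a device on
    // DX11-class hardware; the app still gets the DX12 submission model
    // (command lists recorded on many threads, explicit queues).
    ComPtr<ID3D12Device> device;
    HRESULT hr = D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                   IID_PPV_ARGS(&device));
    return SUCCEEDED(hr) ? 0 : 1;
}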
 

Thracks

Honorable
Nov 1, 2013
101
0
10,680


I think this answers the question: http://support.amd.com/en-us/search/faq/225

While AMD has undertaken many efforts to broadly encourage the adoption of FreeSync and DisplayPort Adaptive-Sync, and I think you'll agree the route we took is very attractive from a business perspective, I also think it's unrealistic to conclude that every monitor will one day support Adaptive-Sync. It's just not feasible, desirable or necessary for that to occur.
 

Thracks

Honorable
Nov 1, 2013
101
0
10,680


What matters most in DRR (dynamic refresh rate) is the functionality of the display controller, not the graphics or compute silicon. It is very demanding to adjust the timing of the display every 16 ms (or less); it's not so demanding to drop the refresh rate once to a fixed value. Documentation will be published after the technology is formally launched in 1Q15.

General question for the AMD graphics division:
2) I hope you know that one of the biggest complaints about AMD solutions is buggy drivers. Do you have plans to be more developer-friendly in general, or do you only care about a few hyped games? I don't see any maintained place for reporting specific API bugs (only "Check games that have problems"): I tried to submit reports via the 'AMD Issue Reporting Form' and to write on devgurus.amd.com ('Graphics Programming' section) – absolutely no reaction. (Only in the 'OpenCL' section of devgurus did I get a "we will test it" response from AMD.)

This is an issue I'm really passionate about, because I was the guy who got the www.amd.com/report form opened for people to supply feedback. When a report comes in, it goes into an internal database that's seen by PR, marketing, QA, engineering, etc. We're fully aware, and have read, every single report that comes to us.

You'll see the results of that system working in Catalyst Omega: we used the form to address the top-10 issues users care about most, in addition to hundreds of others. That would not be possible without a robust issue tracking system.

No, we do not send a "thank you for reporting" email, but we take those bug reports very seriously. I take those reports very seriously. I am on Twitter, FB, and lurking in forums like THG every day. Believe me when I tell you that we are well aware of what's going on in the community, and dedicated to fixing it.

I am not the only one doing this at AMD, either.

 

Thracks

Honorable
Nov 1, 2013
101
0
10,680


Windows controls the refresh rate at the desktop, so a video player would need full screen exclusivity. But assuming you had that, yes, the refresh rate of the display could be dropped to the precise framerate, or an exact multiple of the framerate.
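To illustrate the "exact multiple" idea (a quick sketch; the 40-144 Hz window below is a hypothetical panel, not a real product spec):

#include <cstdio>

// Pick the highest refresh rate inside the panel's variable-refresh window
// that is an exact integer multiple of the content framerate, so every
// video frame is shown for the same number of refreshes (no judder).
double PickRefreshRate(double contentFps, double panelMinHz, double panelMaxHz) {
    double best = 0.0;  // 0.0 means no exact multiple fits the window
    for (int mult = 1; contentFps * mult <= panelMaxHz; ++mult) {
        double hz = contentFps * mult;
        if (hz >= panelMinHz) best = hz;
    }
    return best;
}

int main() {
    // Hypothetical 40-144 Hz panel: 24 fps film lands on 144 Hz (6x),
    // 25 fps PAL video lands on 125 Hz (5x).
    printf("24 fps -> %.0f Hz\n", PickRefreshRate(24.0, 40.0, 144.0));
    printf("25 fps -> %.0f Hz\n", PickRefreshRate(25.0, 40.0, 144.0));
    return 0;
}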
 
Status
Not open for further replies.