AMD CPU speculation... and expert conjecture


That's really dumb then. PCPerspective has really just become NvidiaPerspective.
 

con635

Honorable

The owners' thread on overclockers.co.uk

 

jdwii

Splendid
Man, I've read a lot on AMD's FreeSync and I'm not happy at all. The whole reason I wanted a G-Sync or FreeSync monitor is to get a better experience at 30-60 FPS. I've heard people online claim G-Sync is smooth and flicker-free even at 20 FPS; that's what I wanted. I'll wait and read real reviews of the monitors from actual professionals before I make too many claims.

This is my favorite site so far for monitor reviews; it's where I went when researching that Acer:

http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm
 


Well, that would be both logical and objective; no room for either of those in tech reporting these days :(

From the bits I've gleaned of FreeSync vs. G-Sync, FreeSync behaves in a preferable way above the maximum VRR frequency but has problems below it (with the possibility of ghosting, depending on the panel). The opposite is true for G-Sync, which handles drops below the minimum VRR frequency better but locks the FPS above it (adding input lag and such). Both technologies work nicely when operating within the VRR range, which is good.
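As a rough illustration of that split, here's my own sketch of the behaviour the reviews describe, using a made-up 40-144 Hz VRR window rather than any specific panel's spec:

```python
# Purely illustrative summary of the reported FreeSync vs G-Sync behaviour.
# The 40-144 Hz window and the wording are my own, not from either spec.

VRR_MIN_HZ = 40   # assumed panel minimum refresh
VRR_MAX_HZ = 144  # assumed panel maximum refresh

def describe_behaviour(fps, tech):
    """One-line summary of what each tech reportedly does at a given frame rate."""
    if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
        return "inside VRR window: refresh tracks frame rate, no tearing"
    if fps > VRR_MAX_HZ:
        # FreeSync reportedly leaves the above-range behaviour up to the user,
        # while G-Sync caps the frame rate at max refresh (vsync-like lag).
        return ("tear or cap above max refresh (user's choice)" if tech == "freesync"
                else "frame rate capped at max refresh (extra input lag)")
    # below the window
    return ("falls back to fixed refresh, possible ghosting/judder" if tech == "freesync"
            else "module re-drives frames to stay inside the window")

for fps in (25, 90, 200):
    for tech in ("freesync", "gsync"):
        print(f"{tech:>8} @ {fps:3d} fps -> {describe_behaviour(fps, tech)}")
```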

This is based on me reading quite a few different reviews, though; wouldn't it be nice if someone could test it thoroughly enough to cover everything?
 
GIGABYTE Intros 990XA-UD3 R5 Socket AM3+ Motherboard
http://www.techpowerup.com/210949/gigabyte-intros-990xa-ud3-r5-socket-am3-motherboard.html
AMD's upcoming Radeon 300 series GPUs could include multiple rebrands :(
http://vr-zone.com/articles/amds-upcoming-radeon-300-series-gpus-include-multiple-rebrands/89249.html

Model numbers of AMD Carrizo-L APUs spotted
http://www.cpu-world.com/news_2015/2015032101_Model_numbers_of_AMD_Carrizo-L_APUs_spotted.html

Support For Multiple Graphics Drivers With AMDKFD Kernel Driver
http://www.phoronix.com/scan.php?page=news_item&px=AMD-KFD-HSA-Multiple-KGD
Using this new kernel patch of fewer than 100 lines, the AMDKFD DRM HSA driver can now support multiple KGD instances, whether they are two AMDGPU instances (the new driver for the R9 285, Carrizo, and Rx 300 series GPUs), two Radeon DRM drivers (supporting all current Radeon hardware), or a combination of AMDGPU and Radeon hardware. This patch for supporting multiple kernel graphics drivers only affects the HSA driver and has nothing to do with CrossFire/OpenGL or other long-sought multi-GPU features.
:whistle:

AMD Is Hiring Two More Open-Source Linux GPU Driver Developers
http://www.phoronix.com/scan.php?page=news_item&px=AMD-Two-More-Open-Linux-Devs
Big Graphics Card Comparison Of Metro Redux Games On Linux
http://www.phoronix.com/scan.php?page=article&item=metro-redux-march&num=1
 


Sounds about right.

It IS worth noting that the NVIDIA approach does cover panels up to 144 Hz, and at FPS that high you pretty much require VSync on due to tearing. So defaulting to VSync on in that case does make a certain amount of sense.

Of course, the "true" way to fix the problem is to make panels that can handle any arbitrary refresh rate, but that would pretty much require pixel response times under 1 ms, or LCDs going the way of the dodo first.
 


Well, speaking from experience, with my 120 Hz monitor I've never used or needed VSync. The refresh happens so fast that tearing is not that obtrusive. Keep in mind I'm not saying there is no tearing, just that the tearing that does happen is less annoying thanks to the high refresh rate. What bugs me about G-Sync is the mandatory VSync; I'm very sensitive to mouse lag, so I'm not liking that.

Cheers!
 

jdwii

Splendid


Ah, I don't see those cards mattering much anyway. The 250X rebrand as a 350X did hurt a bit, but that's it; all the other cards are pointless for anything but old PCs that need more displays. Let's just hope the 360X is a new model and will be priced around the current 750 Ti offerings.

Edit: also, does anyone think a new Maxwell card will come out titled as a 950 Ti?
 

blackkstar

Honorable


The thing is that gamers have always had to choose between tearing with better input latency, or no tearing with worse input latency. It's a long-standing problem and G-Sync doesn't solve it at all.

It's not going to be as bad on a high-frequency monitor, but it's still going to be a large source of input lag. Until Nvidia reconsiders their stance on VSync, I wouldn't go G-Sync right away. Obviously you'll have to find a monitor that is doing something right and not causing ghosting through bad settings, cheap components, or firmware that isn't driving the pixels properly.
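To put some back-of-the-envelope numbers on that lag (illustrative only; the worst case for a single frame is roughly one refresh interval of waiting, and real pipelines can buffer even more than this):

```python
# Rough worst-case extra wait VSync can add to a finished frame while it
# sits waiting for the next refresh. Illustrative arithmetic only; actual
# latency depends on buffering, the engine, and the driver.

for refresh_hz in (60, 120, 144):
    worst_case_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz:3d} Hz panel: up to ~{worst_case_ms:.1f} ms extra wait per frame")
```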

I also don't get the obsession with sub-60 FPS performance. Someone buying a G-Sync or FreeSync monitor is not going to have a mid-range system. The tech was originally designed to solve the problem of gaming PCs not being able to hold a 144 or 120 FPS average frame rate. I know that if I do grab FreeSync, it'll be for a build that targets 144 or 120 FPS, as opposed to 60 FPS with drops below 30.

Look at some reviews: even the old 290X handles 1440p at 60 FPS or higher in the majority of games. That's not that expensive a card compared to an expensive G-Sync monitor; if you dig enough, a single 290X is only around $300. The cheapest G-Sync monitor I can find is a little under $600 on Newegg (I didn't look for good sales, and I try not to when pricing things out). If you're going to spend $600 on a monitor, you're not going to have a PC that plays games under 30 FPS unless you're doing something very wrong. A 1440p G-Sync is over $700. If you have that much money to spend on a gaming PC, sub-30 FPS is completely unacceptable when you can buy GTX 970s for around $300 apiece.

I was also under the impression that AMD postponed the 300 series launch to have a new product stack, but that was really only rumored for the 380 and maybe the 370 series. Everything else should be a rebrand anyway; that's what everyone is doing, since the dGPU market is slowly being eaten from the bottom up (though I don't think it'll ever reach the large-die GPUs). But that market is low margin, so it's not a priority.
 


Rebrands at the bottom end aren't really a concern, as most modern processors (especially from AMD) include integrated graphics that are as fast anyway (unless you're talking about a CPU without graphics, like FX, in which case you're probably also in the market for something a bit higher up the ladder).

What does concern me a little, though, is if they really are going to use Pitcairn again. I mean, it's actually a nice chip from a resource-balance point of view (it's always been one of the most efficient implementations of GCN). From a rebrand point of view, if they take a 'golden sample' Pitcairn (i.e. a 270X 'OC') and rebrand that as a 370, with something like Tonga PRO sitting directly above it as the 370X, it fits from a performance standpoint imo. The issue, though, is that it's a GCN 1.0 part, which means none of the modern feature set. If nothing else, I think AMD *needs* all their new GPUs to support bridge-less CrossFire and FreeSync. If they launch a new range lacking lots of key features, it's going to make things a mess for consumers.

The other thing to consider is that nVidia has refreshed pretty much its entire lineup bar a couple of very low-end parts (admittedly the mid-range card is branded 700 series, but it is still Maxwell). It's not really that GCN 1.0 is that bad from a performance viewpoint (I'm on a 280, which is great at 1080p); however, it's lacking a lot of features, and AMD needs all their future cards to support those features if they have any hope of them becoming meaningful.

TrueAudio has so much promise, yet no game developer is really using it, and why would they? It's only available from the smaller dGPU company, and they only support it on a fraction of their hardware. If you could tell a dev that 25% of the graphics market will get a free boost from this and it gives their audio guys some headroom, they might listen, but when you start talking about a fraction of that, I'm not sure I'd want to be the one pitching it to the guy with the budget.

Then there is the efficiency argument. For literally years AMD *had* the best card you could run without external power in the HD 7750; the sad thing is that you can't even buy the HD 7750 any more, so their current *best bus-powered card* is a pathetic Oland-based thing, certainly no match for a 750 Ti. If they could take their latest revision of GCN and build a card specifically targeting the 75 W bus-power limit, they could make something competitive, and such a chip would also be a good fit for mobile. If they're relying on another Oland rebrand, though, they've essentially conceded that market (and frankly the entire mobile GPU market) to nVidia.

I guess no one really knows for sure just yet, though; we'll see what happens when they officially announce it. I'm hopeful that, one way or another, they've updated the GPUs to the point that they have a common feature set across the mid and upper range, if nothing else.
 

juanrga

Distinguished
http://www.forbes.com/sites/jasonevangelho/2015/03/23/nvidia-explains-why-their-g-sync-display-tech-is-superior-to-amds-freesync/

About AMD ghosting problems:

We don’t do that. We have anti-ghosting technology so that regardless of framerate, we have very little ghosting. See, variable refresh rates change the way you have to deal with it. Again, we need that module. With AMD, the driver is doing most of the work. Part of the reason they have such bad ghosting is because their driver has to specifically be tuned for each kind of panel. They won’t be able to keep up with the panel variations. We tune our G-Sync module for each monitor, based on its specs and voltage, which is exactly why you won’t see ghosting from us.
 


I don't know about "lovely", but at least nVidia's position is very clear: they're cautious about "FreeSync".

In particular, I think AMD can make the lower-refresh-rate problem go away with a driver update, whereas nVidia is limited by the hardware module inside the monitor. I believe nVidia is doubling the image rate when they hit the minimum (you render 20 FPS, but 40 FPS is reported to the panel and it runs at 40 Hz instead). From what I understand, AMD can do that as well through the driver, using the video card's RAMDAC as a buffer. Hell, I wonder if current RAMDACs are any different from the dinosaurs of the early '90s, haha.
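A rough sketch of that frame-repeating idea, with made-up panel limits rather than anything from AMD's or nVidia's actual implementations:

```python
# Sketch of low-frame-rate compensation as described above: if the game's
# frame rate drops below the panel's minimum VRR rate, re-send each frame
# enough times that the refresh the panel sees stays inside its window.
# Panel limits and function name are my own illustration.

PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

def effective_refresh(game_fps):
    """Pick a repeat multiplier so the panel-side refresh stays in range."""
    multiplier = 1
    while game_fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1
    refresh = min(game_fps * multiplier, PANEL_MAX_HZ)
    return multiplier, refresh

for fps in (20, 35, 75):
    m, hz = effective_refresh(fps)
    print(f"{fps} fps -> repeat each frame x{m}, panel runs at ~{hz} Hz")
```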

In any case, it looks like FreeSync has the upper hand from a longevity point of view, since nVidia needs to change the hardware each time they update the spec or want to support a new panel with G-Sync, whereas Adaptive-Sync is part of the DisplayPort spec, so it's up to the monitor maker to comply.

Cheers!
 


A "trivial" USD $260 module change on your monitor if it's an FPGA, and a tad more complex if it's already integrated into the monitor's PCB or baked into a mass-produced part. So not quite.

Cheers!
 

Reepca

Honorable
I seem to not understand FreeSync/G-Sync correctly. I thought the general idea was that the GPU can tell the monitor when to refresh, with each implementation choosing a different way of doing it but more or less doing the same thing. If the display does not refresh until the GPU tells it to, how can tearing exist?

Obviously, however, it can exist, so there must be something wrong with my understanding - what am I wrong about?
 

juanrga

Distinguished


They can tune per individual panel.
 


Tearing != ghosting; they're two different problems. Within the VRR range of FreeSync/G-Sync there shouldn't be any tearing. Outside it, however, tearing can still exist.

The *ideal* solution would be a monitor that can handle any arbitrary refresh rate, but limitations in panel technology make this a no-go for now.
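For Reepca's question specifically, here's a toy illustration of how a tear can still show up once the panel has fallen back to a fixed refresh and VSync is off (all the timings below are invented, not measured):

```python
# Why tearing can still happen outside the VRR window: once the panel falls
# back to a fixed refresh, a buffer swap that lands mid-scanout leaves the
# top of the screen showing one frame and the bottom showing the next.
# Hypothetical numbers only; vblank timing is ignored for simplicity.

SCANOUT_MS = 1000.0 / 60           # panel fell back to a fixed 60 Hz refresh
swap_times_ms = [3.0, 9.5, 21.0]   # made-up buffer-swap times with VSync off

for i, t in enumerate(swap_times_ms, start=1):
    offset = t % SCANOUT_MS        # how far into the current refresh the swap lands
    tear_position = offset / SCANOUT_MS
    print(f"swap {i} at {t:5.1f} ms lands {tear_position:.0%} of the way through "
          f"a refresh -> visible tear line roughly there")
```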
 
BF:H CPU results:

http://www.techspot.com/review/979-battlefield-hardline-benchmarks/page5.html

Same trend we're used to out of FB3 at this point. i3 trumps FX-8350.

What's interesting is the FX-9590 scaling, which is perfectly linear from 2.5 up to 4.5 GHz, compared to the 4770K, which only gains a total of 6 FPS over the same frequency range. This tells me that, if you could only clock it high enough, the FX would pass all the i7s. I'd imagine what's going on is that the i7s are a LOT closer to a CPU bottleneck on at least one core, so there's a point where the FX could theoretically pull ahead. But without CPU usage numbers that's only a guess, though it does fit with other FB3 titles.

Kinda shows how IPC matters though.

Further evidence of this comes from IPC comparisons at the same clock between the two processors. Since we're comparing clock for clock, and since both CPUs run the same number of threads (HTT on the i7, but still), the math gets really simple: there's a 20% per-clock difference at 2.5 GHz, but only a 1.7% difference at 4.5 GHz.
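The clock-for-clock math boils down to a simple ratio; the FPS figures below are placeholders I picked to reproduce the quoted 20% and 1.7% gaps, not the actual TechSpot numbers:

```python
# Clock-for-clock comparison as described above: at the same frequency, the
# relative FPS gap is a proxy for the per-clock (IPC) gap. The FPS values
# here are hypothetical placeholders, not the published benchmark results.

def per_clock_gap(fps_i7, fps_fx):
    """Relative per-clock performance gap when both chips run the same clock."""
    return (fps_i7 - fps_fx) / fps_fx

samples = {
    2.5: (60.0, 50.0),   # hypothetical (i7 FPS, FX FPS) at 2.5 GHz -> 20% gap
    4.5: (61.0, 60.0),   # hypothetical values giving a ~1.7% gap at 4.5 GHz
}

for ghz, (i7, fx) in samples.items():
    print(f"{ghz} GHz: per-clock gap = {per_clock_gap(i7, fx):.1%}")
```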

Point being, if anyone has CPU usage numbers for the FX/i7, I'd love to see them. I'm betting the i7s are basically CPU bottlenecked, but still faster than the FX simply due to their higher base IPC.
 