AMD Outs Official Statement On R9 Fury X Pump Noise

-never seen any dead cards from driver updates.

-never seen any sustained decreased performance from driver updates...any drop was erased in next release

-the 3.5 GB was a marketing blunder but performance-wise a complete fake-rage red herring ... there is no stalling unless you work extremely hard to create an unrealistic scenario. Test after test after test has shown this to be true

http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html

It still has 4 GB (6 gears), it just used 0.5 GB in a different way..... not exactly lying but not exactly full disclosure, which I called them out on a few posts back. At least the Fury X has finally forever buried the "ya need more than 4 GB for 1080p / 1440p" scam....and 3.5 GB is just as adequate. Why get a 980 when there's nothing actually wrong with the 970.... 7 months went by and all of a sudden it's a big thing ? And yet every web site that tried couldn't reproduce the problems in normal usage.

-Mantle was AMD leveraging their console efforts into the PC market ... it failed.

-Meanwhile FreeSync is broken, forcing firmware updates on previously sold monitors that must be done at the factory. No answer to Shadowplay / PhysX, and multi-card profiles take too long. They simply do not have the resources to keep up on so many fronts. They need to start selling things to generate capital and the Fury / 3xx doesn't bring anything to the table.

I can accept this approach if they are using this as a "rebuilding year", but if they have a 3rd generation of yawners, we are all in trouble. Next year's $320 970 will be $550... that doesn't make them "evil", it just makes them a "normal corporation"
 
You people are funny....thinking Nvidia is inherently evil and AMD is the righteous company. AMD simply CANNOT introduce proprietary stuff, for a couple of reasons.

1. Unlike Nvidia, they couldn't get a significant number of people to use it, due to their low market share.

2. By suffering so greatly in finances, and needing to also prop up a really struggling CPU division, they cannot spare the R&D / driver development to make their proprietary tech work well enough immediately. Examples: I'm still waiting for a decent Crossfire Witcher 3 driver, and a stable Crossfire FreeSync driver. Heck, we still don't even have a good CFX scaling official driver for GTA5 (the 15.20 1040 hacked leaked drivers are good for GTA5 CFX, but those are very flaky at times in other games). Don't tell me it's all about GameWorks in Witcher 3---I don't even have HairWorks enabled and Crossfire is still broken.

AMD does open stuff because they HAVE TO. If they were in Nvidia's position, they would be just as "evil".

But, like the posts above, this will be downrated to hell probably an hour after I post it, despite being the truth. And I'm typing this on an all-AMD PC: 9590/990FX mobo, 2 290Xs, even silly AMD Radeon (rebranded Patriot) memory. So don't call me an Nvidia fanboy. I've supported AMD quite enough, thank you.

Oh, and to the guy who says that AMD never releases GPU-breaking software----just do a search for 5970 "cold bug"---the difference is, that made your brand new $1000 video card a crash fest until it finally died well ahead of its time (mine lasted 2 years or so, no overclocking whatsoever). And the only fix was to edit or reflash the BIOS to prevent the GPU from ever throttling. How many people do you think are comfortable with doing that, and voiding their warranty?

But no, Nvidia EVIL! AMD can do no WRONG!
 
Dudes, just face it, it's buyer beware and there's a lot of junk out there hyped up to fool you into a buy - look at all these 500 buck cards that really can't do $hit

but hey, all that matters is them FPS and nothing else, right ?? For that kind of money it should do it all, not be limited in so many ways
 



That system will fail the majority of us, who have long term relationships. But I'm sure it will motivate you.
 
Don't understand ... relationships make it easier.
If you don't do something at least once a week with the person you are involved with, the reality is, it's no longer a relationship.... that's a true fail.

http://www.imdb.com/title/tt1279935/

So far with 3 kids, two still living ... or back living ..... at home.... I haven't had to enforce this rule.....active social lives with this brood.
 
Funny thing, I told my buddy years ago that when Nvidia stopped making chipsets for AMD boards, AMD would go into a decline .. you know, back when AMD had stuff you wanted and it was at the top of the game, it was mostly on Nvidia chipset boards -- now look how stagnant they are

As far as AMD acquiring ATI, I don't really see too much that changed there, more like the same old same old

And don't forget the co-founder of Nvidia was an AMD guy
 


Open because they have to? Look at TressFX; they didn't have to let that be open and could have made it only work on AMD while saying too bad, so sad to Nvidia, much like what Nvidia did and does with PhysX, but they didn't.

You also have Mantle, which if I remember right had faster uptake and use than DX11 did, which later lit a fire under Microsoft to up their game with DX12, and the API was passed off to Khronos for Vulkan.

As for Crossfire, I have never once used it, because when multi-GPU setups were reintroduced they performed like crap and most things ran like hell. It's only recently that they became something to consider, but even then CrossFire and SLI are sub 5% of users (I want to say closer to 1% if not lower), and it's really hard for me to trumpet CrossFire or SLI being broken in a game as something to complain about.

As for the cold bug, it appears to be an issue when the GPU was cooled too much; this, being unfortunate in and of itself, was entirely caused by the people themselves. Meanwhile, Nvidia put out drivers that caused fans to not work, or to work severely underpowered, which caused cards to fail and throw sparks... not sure if it was one of these issues, but my old 6800 Ultra ran at around 80 C idle when clean because the fan refused to spin up... that was the last Nvidia card I bought. I went to a 5770, and from there was going to get a 7970 GHz, but they sold out except for the sellers who never lowered the price, and when the cryptocurrency crap ended I got a factory-OC 280X.

As far as I see things, Nvidia has no issue with locking other people out with proprietary software, while when AMD has the chance to bolster their cards' power by saying "only on AMD", they don't. Nvidia will go out of their way to try and sneak performance downgrades through, or in the case of Witcher 3 increase tessellation to a stupid degree. I would also say Project Cars, but that thing is such a cluster that it's hard to make heads or tails of what's going on. And if you listen to... I forget his name and position, I believe Richard Huddy... where he talks about Nvidia giving DLLs instead of source so you can't optimize for AMD, or deals where devs can't optimize for AMD or give them code... AMD may not be a saint, but when you have the devil on one side it's hard to look at the other person and not see them as the good guy.
 
TressFX---you just proved my point. If they didn't make it open to all video cards, how widespread do you think its adoption would have been? If AMD had the lead market share like Nvidia, do you think they wouldn't do an about-face and keep their tech in-house? But when you trail the market leader by so much, all the money you spend on R&D for proprietary tech will be wasted if you can't get anybody to use it because it's limited to your small share of video cards.

Mantle is already dead, dude---its tech is in Vulkan. AMD will not be supporting it anymore. And it was not proprietary---AMD offered an SDK to anybody who wanted to use it, albeit by the time they got that open SDK out the door, DX12 was right around the corner. And a big reason for Mantle to even exist, BTW, was to help AMD's CPUs, which were getting clobbered in games due to the heavily single-core DX11 pipeline, which, as we know, heavily favors Intel CPUs.

Oh, and BTW, Mantle also illustrated AMD's lack of support---DA:I had problems with massive loading times on CF systems and bad texture pop-in until one of the most recent updates on AMD & EA's side. Battlefield 4 too---stability problems for their flagship title, the one where AMD worked directly on the Mantle interface; IIRC it took them 6 months after the game launched to release a build with Mantle support even in it.

"and its really hard for me to trumpet crossfire or sli being broken in a game as something to complain about. "

Then AMD shouldn't even offer it if they can't be bothered to keep the drivers updated. Nvidia does. You shouldn't market something, support it half-heartedly, and then expect not to get criticized.

"as for the cold bug, it appears to be an issue when the gpu was cooled to much, this being unfortunate in and of itself was entirely caused by the people themselves"

Not what happened. It happened in my own system, and I certainly did not add any additional cooling beyond the stock cooler. It was caused by AMD designing the firmware to throttle the GPUs too low in the idle state; the clock speed/voltage dropped too low for the card to remain stable. The only consistent remedy was to mod the BIOS yourself and disable the GPU throttling at idle. Flashing and editing video card BIOSes back then was not for the faint of heart. There was a real good chance you'd end up with a $1000 brick instantly, not to mention voiding the warranty with the modded clock settings even if everything went smoothly.
 
I never understood the argument that a company should invest millions of dollars in a technology and give it away for free.
-Yes, a free alternative to PhysX would be nice, if there was one
-Yes, a free alternative to Shadowplay would be nice, if there was one
-Yes, a free alternative to G-Sync is in the wings, but FreeSync is currently broken on most monitors and peeps have to send them back to get fixed.
-Yes, Mantle would have been nice, if it ever lived up to its claims
-Yes, CF is there, but after getting most of the deficiencies worked out, it's slipping again.

As for SLI / CF being 5 or 1%, I don't know what price bracket you're playing in, but I would say that 80% of the builds we have done or put together in the last 18 months have been SLI'd 970s. We were involved in more 770 / 780 SLI builds than single card 780 / 780 Ti builds ... we did way more 560 Ti builds than all single card 5xx series builds put together... and in each of those, all single card builds had a PSU sized for a 2nd card.

Like with the old adage.... "60 is the new 30". Everybody asks "what do I need to get 60+ fps in **all games** on ultra".

I wouldn't even think about a new 1440p build w/o SLI / CF....a single card just can't support 1440p / 144 Hz monitors.... tho the OC ability of the 980 Ti makes it just about 60 fps+ all the way around.... so far. As for 2160p, SLI / CF can't handle 60 fps+ in all games. And if you haven't seen 144 Hz on an IPS monitor w/ G-Sync w/ ULMB technology, when you do, you will have a whole new take on gaming. Got my 1st look Sunday night w/ Witcher 3 and the shadows, detail, lighting and rich color / brightness just blew me away. Very good IPS panels always had the great color, but I was always distracted by lag and ghosting ... those are gone with the 3-way combo of 144 Hz, G-Sync and ULMB. Anxious to see FreeSync work, but until 1) the FreeSync panels currently on the market are fixed, 2) we have an IPS one, and 3) it has ULMB, it won't be a fair comparison.

-I don't see Adobe making it easy for peeps to make / edit PDFs.
-MS tries their damnedest to break file conversions for office files.
-AutoDesk works very hard to make importing dwg files into other programs difficult.
-How hard did MS fight against unbundling IE from Windows ? Gates even got caught lying about it under oath.
-Does MS give out all their APIs ?

Painting one competitor as evil just comes off as very Donald Trump-ish ...
 
This statement reads like it was written by a first year Corporate Communications co-op.

I think they are just trying to stress that only a few users experienced this issue and, as usual, the internet is blowing things out of proportion.

Not saying that's a bad thing because it forced them to address it. But it was definitely about getting more clicks on media sites.
 


I don't understand. Just finished a Spartan race, play games online with my girlfriend, go to movies, go to the bar, go the gym when I can, work a great job...

I think you are assuming everyone is like you.
 
It's true it's not perfect, but no card is; all the whiners on here act like the card sucks horribly. It doesn't. While I like Nvidia myself, I can at least admit I'd consider this card depending on price, as it's not really far behind the 980 Ti except in certain cases.
 


Don't understand what ? I never suggested you did anything but make unsupported statements.

I never made a statement about "everyone". That assumption was your invention. I made a jest that some people spend too much time on the computer to the detriment of themselves and those around them, their job or loved ones. Stereotypes don't come into being out of nowhere; they even have TV shows about them. Are you suggesting that there are no people out there who spend too much time on the computer and as a result are missing some of the social skills that normally develop as we mature ? Grades tumble, jobs are lost, relationships fail ... even deaths have occurred. Ever read a newspaper ?

http://www.frihost.com/forums/vt-54367.html

http://www.theguardian.com/world/2010/mar/05/korean-girl-starved-online-game

http://www.olganon.org/home

http://www.video-game-addiction.org/social-consequences.html

Your attempts to make it personal indicate that you have some particular hypersensitivity on the subject. If you must know, I have been married for 27 years, in a relationship with the same person for 9 years before that .... tended bar for 10 years, coached / managed little leagues, travel ball, did the scout master thing when my kids were active, national vice president of my professional society, mountain biker (including group trips thru various national / state parks), teach engineering courses, do volunteer work helping kids / adults build PCs (started building PCs in the late 80s), ran or staffed numerous online web forums starting in the early 1990s, ... while serving as president and founder of an engineering consulting company (1991) and VP in another one before that. Last game I had time to play / finish was Far Cry 3.... before that, Witcher 2.
 


How many new CPU generations has Intel put out since AMD launched the FX line, 3 or 4? When did AMD come out with an exciting new chipset? When will AMD support DDR4? How about an FX MB with SATA 3 & USB 3 and/or 3.1 in a mATX size? AMD put all its CPU marbles into the APUs, and now the new Intel CPUs coming out whip their ass on CPU power and integrated GPU power. AMD better come up with something new and exciting or just stick a fork in them, they are done, and that will be bad news for all of us.
 


It definitely doesn't suck..... the problem is, when you have a two-person race and you are not the winner, ya can't brag about 2nd place when 2nd is also last .... and while sports fans, for example, can rattle off World Series winners, Super Bowl winners, World Cup champions, Olympic gold medal winners, those who came in 2nd are soon forgotten.

My disappointment is that the industry needs competition. Why did Intel go to a cheaper epoxy thermal solution with Haswell / IB CPUs ? Cause they could. There was no competition and they didn't want us taking 4670Ks and OCing them all to 5+ GHz territory, thereby cutting into sales of more expensive CPUs. Many pundits reported their suspicions (based upon comparing original leaked specs to release specs) that Nvidia took the original 780 design and buried it, taking their original 770 and releasing it as the 780 .... no economic sense in having two cards standing above the competition's top tier, so that would be a logical move.

I remember paying $1,000 for a 1 GB drive.... now I pay $100 for a 2 TB SSHD. Because of competition, the cost of "the computer you want" has consistently gone down and performance has gone up.... few industries can boast of such a trend. I was paying $6k for CAD workstations in the late 80s / early 90s; now I can work faster with < $2k desktops and even laptops. The 970 was $70 cheaper than the 770. Was the 4790K cheaper than the 4770K ? Again, with no competition pressuring such a move, why would Intel do this ?

It's easier to have faith, for example, in a sports team that wins or comes close every once in a while. AMD has of late been the Buffalo Bills of the computer world..... the Bills got to the Super Bowl 4 years in a row and walked away w/ zero trophies. They certainly didn't "suck horribly" either. The Fury X got to the Super Bowl, it just didn't bring anything home. If they can use this year to "rebuild", perhaps they can do what the women's soccer team did on their return trip to the EC Final.
 
See, the problem here has very little to do with the hardware (I'm assuming AMD has rectified, or will shortly rectify, the pump whine when new batches hit the shelves). The new GCN arch proves it can hold its own, so that isn't the problem either.

It's what it has been with AMD since I bought the 5970 all those years ago---DRIVERS. Heck, I'd be happy with a complete rebrand of the 290X (like the 390X) and NO Fury if they'd put the money they spent making Fury into hiring more people for their driver team. From all of the rumors going around the industry, Nvidia's driver team is many, many times bigger than AMD's, due to their (AMD's) financial troubles and having to have their hand in so many fledgling markets.

AMD has always been great hardware (well, except for the horrid Bulldozer experiment) let down by lesser support/drivers than the competition, IMO. And that's really unfortunate.
 


Most "in the channel" Freesync monitors are broken and require the new driver PLUS a "back to the factory" firmware update

http://www.tftcentral.co.uk/reviews/benq_xl2730z.htm

BenQ have confirmed that the FreeSync/AMA issue has now been fixed. A driver update from AMD is already available and should be downloaded from their website. In addition BenQ will be releasing a firmware update for the monitor itself to fix this issue. Current stocks in distribution are being recalled and updated with retailers so future purchases should already carry this new firmware. This is expected to apply for stock purchased AFTER 1st July, as V002 firmware screens should be shipped by BenQ to distributors in late June.

For those who already have an XL2730Z if you want to, you can return it to BenQ for them to carry out the firmware update for you. This only applies if the user is experiencing issues with the performance of the screen. There is no simple way for the end user to update the firmware themselves and it is not encouraged. Users should contact BenQ support through their relevant country website for more information on how to return their screen for the update.

This only applies in Europe and we do not have any information about how this update will be handled in other countries unfortunately. We would suggest contacting BenQ support for additional help if you need more information, now that a V002 firmware is in circulation. You should be able to identify the firmware version you have by accessing the factory OSD menu (hold menu button while powering the screen on, then press menu). The Firmware version (F/W) should start with V001 or V002 and then a date. You are looking for V002 for the updated firmware.

 


So how did the testing go?
 