AMD Fury X And Fiji Preview

Status
Not open for further replies.

Vlad Rose

Reputable
Apr 7, 2014


It was meant more as a joke, since there were quite a few posts claiming the AMD card ran so hot it'd melt, and Nvidia had been through something similar themselves not that long ago. But yeah, I do remember it didn't actually cook the egg; it still made for interesting conversation on every news site at the time.
 


I for one don't and won't go for the flagship anymore; it's the rest of the range that I'm interested in. At the moment I'm still running the 7790 that several "AMD supporters" suggested I RMA, and it has been steady as a rock on every driver since 14.2 (IIRC). So while I may invest my hard-earned money in another AMD card in the future, it won't be a flagship.
 

troger5troger5

Distinguished
Dec 7, 2009
Does anyone actually have a 4K 60Hz display without DP 1.2? So why the fuss about HDMI 2.0 ...
Yes, I do. This is a terrible joke from AMD. I'm glad I didn't wait and got a GTX 970 months ago.
Good day. Yeah, the issue is bandwidth: HDMI 2.0 gives you enough to run your display at 60 fps, while the lower-bandwidth ports are limited to 30 fps. Does this mean the other cards based on this chip will have the same limitation, or is it just these reference cards? Thanks.
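The bandwidth point above is easy to sanity-check with some back-of-the-envelope math. This is a rough sketch: active pixels only, 24-bit color, blanking overhead ignored, and the HDMI payload figures assume 8b/10b coding on the quoted TMDS rates.

```python
# Why 4K60 needs HDMI 2.0: compare uncompressed video data rates
# against the approximate payload bandwidth of each HDMI version.

def data_rate_gbps(width, height, bpp, fps):
    """Uncompressed video data rate in Gbit/s (active pixels only)."""
    return width * height * bpp * fps / 1e9

HDMI_1_4_GBPS = 10.2 * 8 / 10   # 10.2 Gbps TMDS, 8b/10b coding -> ~8.16 Gbps payload
HDMI_2_0_GBPS = 18.0 * 8 / 10   # 18 Gbps TMDS -> ~14.4 Gbps payload

uhd60 = data_rate_gbps(3840, 2160, 24, 60)   # ~11.9 Gbps
uhd30 = data_rate_gbps(3840, 2160, 24, 30)   # ~6.0 Gbps

print(f"4K60 needs ~{uhd60:.1f} Gbps: HDMI 1.4? {uhd60 <= HDMI_1_4_GBPS}, HDMI 2.0? {uhd60 <= HDMI_2_0_GBPS}")
print(f"4K30 needs ~{uhd30:.1f} Gbps: HDMI 1.4? {uhd30 <= HDMI_1_4_GBPS}")
```

So 4K at 30 fps squeezes through HDMI 1.4, but 4K at 60 fps only fits in HDMI 2.0 (or DP 1.2).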
 

alextheblue

Distinguished
The air-cooled cards, by comparison, are expected to scale back as their heat sinks and fans are saturated with thermal energy.

I like your style. It's called overheating.

No, it's called throttling, and it's commonplace in graphics these days. They push the GPUs as hard as possible with boost clocks, and in certain circumstances they have to scale back a bit due to heat or power limits. Even Maxwell can throttle, look it up. Not just in FurMark, either (though it does throttle a lot MORE in FurMark).

If you absolutely don't want throttling (at least in real-world gaming), you can either tinker with it yourself or get something overbuilt like the Fury X instead of its air-cooled brethren.
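The boost/throttle behavior described above can be sketched as a toy feedback loop. This is not any vendor's actual algorithm; the clock steps, temperature limit, and power limit here are made-up illustrative numbers:

```python
# Toy boost governor: step the clock toward the boost ceiling while
# temperature and power are under their limits, step back when either trips.

BASE_MHZ, BOOST_MHZ, STEP_MHZ = 1000, 1200, 13   # illustrative values
TEMP_LIMIT_C, POWER_LIMIT_W = 83, 250            # illustrative limits

def next_clock(clock_mhz, temp_c, power_w):
    """One tick of the governor: boost when under limits, throttle when over."""
    if temp_c > TEMP_LIMIT_C or power_w > POWER_LIMIT_W:
        return max(BASE_MHZ, clock_mhz - STEP_MHZ)   # throttle, never below base
    return min(BOOST_MHZ, clock_mhz + STEP_MHZ)      # boost toward the ceiling

clock = BASE_MHZ
clock = next_clock(clock, temp_c=70, power_w=180)   # cool: boosts to 1013
clock = next_clock(clock, temp_c=85, power_w=180)   # hot: throttles back
print(clock)  # 1000
```

The point being: "throttling" is just this loop hitting its limit branch, which happens sooner on a saturated air cooler than on a CLC.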
 

Gurg

Distinguished
Mar 13, 2013


If these are from a CLC Fury X vs. the reference-cooled 980 Ti, then the results are hogwash. The EVGA 980 Ti with a CLC attached has a 14% higher clock than their reference 980 Ti, while the air-cooled non-reference MSI has an 18.8% increase and the GIGABYTE GV-N98TG1 GAMING a 20% increase over the reference card. A 14-20% 980 Ti clock increase would most likely evaporate any Fury X lead and turn most wins into losses. It will be interesting to see apples-to-apples Fury X vs. aftermarket-cooled 980 Ti independent benchmarks next week.
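For rough context, here is what those quoted percentages work out to, assuming the reference GTX 980 Ti boost clock of 1075 MHz (factory cards vary, and clock rarely scales 1:1 into frame rate, so treat the output as illustrative rather than a benchmark):

```python
# Apply the quoted factory-overclock percentages to the assumed
# reference GTX 980 Ti boost clock of 1075 MHz.

REF_BOOST_MHZ = 1075

for name, pct in [("EVGA CLC", 14.0), ("MSI air", 18.8), ("GIGABYTE G1", 20.0)]:
    mhz = REF_BOOST_MHZ * (1 + pct / 100)
    print(f"{name}: +{pct}% -> ~{mhz:.0f} MHz")
```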

 

somebodyspecial

Honorable
Sep 20, 2012


So you're telling me AMD is going to put 4x 550-580mm^2 dies onto a single card, plus all the memory that goes with them, and cool it with magic while we run this in our PCs? You realize the wattage that card would need? How costly it would be? HBM means nothing if you're not bandwidth constrained. PERIOD. If it were, we'd see Fury (even in AMD's own benchmarks) BLOWING OUT the 980 Ti/Titan, and it doesn't. Blowing out to me is, say, Intel 4790 vs. AMD whatever... if all you get is <10% wins, HBM got you nothing, correct? It's the 550-580mm^2 GPU that got that perf! I.e., comparable to Nvidia's GPU size; the GM200-310-A1 variant is 601mm^2 on 28nm. All AMD has done here is catch up and lower their bad power draw (watts) using 20nm to get to 550-580mm^2 without totally blowing up costs, while HBM cost is a question mark that won't be answered until we have a quarter of sales behind them.

No, fanboy, I said NEITHER side can do 4 chips on a card without issues. NEITHER side. Get it? It's just the reality of physics, etc. Neither side is in a position to make a toy to prove a point but not make money on it. I'm sure they'll muster a dual card (both sides always can, depending on clocks etc.), but a quad-GPU card is a joke for both in these PCs. I don't want AMD broke, so even chasing your pipe dream is ridiculously stupid financially, even worse than chasing consoles and APUs. Both of those stole from core CPU/GPU/driver work and have severely cost AMD across the board. I don't believe HBM this soon was wise either. It wasn't needed. They should have waited for the next round with NV to cheapen it, when it MIGHT be needed. NV has 72% share and would massively drive HBM pricing down if AMD had waited. Being first to market with something that is useless with current GPU power just costs profits that are BADLY needed by AMD (see $7B in losses over the last 15 years, $6B+ of that in the last 12).
 

BadNight

Honorable
Nov 9, 2013
I don't know why people are so obsessed with early benchmarks. AMD is perfectly capable of surpassing Nvidia's cards. They're both capable of much better technology. If anything, their business plan works great, considering the only thing they need to do is release a card 10% better and everyone eats it up. This has been happening the whole time. This back and forth is intended.
 

uglyduckling81

Distinguished
Feb 24, 2011


Nvidia have the market, and it includes a whole lot of fanboys.
They seem completely oblivious to the fact that if AMD disappears, innovation slows to a crawl and prices skyrocket.
I can't understand anyone rooting for Nvidia the way they do.
Everyone should be celebrating every good step AMD makes and deliberately buying their products to try and even things out.
Nothing is better for consumers than fair and healthy competition.
A GTX 980 wouldn't have cost so much if we had a more balanced market; it would have been priced according to its performance. Instead, Nvidia is able to bank on fanboys buying at inflated prices.
 

ern88

Distinguished
Jun 8, 2009
Nvidia cards are superior to AMD cards, IMO, and I am not an Nvidia fanboy: I have owned ATI/AMD cards since the X800 series came out, and I'm currently running an HD 7950. I say this because their cards are fast and power efficient, something AMD lacks. They may be a bit more expensive, but you get what you pay for. I hope the Fury is a home run. We need healthy competition in this market.
 

Blueberries

Reputable
Dec 3, 2014
You guys are missing a key point here: AMD is releasing a higher performance/watt card with a higher-than-ever TDP that scales the clock rate back as necessary. Nvidia is releasing a higher performance/watt card with a lower-than-ever TDP, and we'll probably see passively cooled 1080p solutions and single-GPU 4K solutions with Pascal (that don't come with a factory radiator, and will likely have more memory than these cards).

 

DasHotShot

Honorable


Doubtful that it is "the future". Firstly, there are several other screen technologies around already which surpass 4K or give an identical experience (1440p with AA).

Secondly, 4K is REALLY inefficient. You need to render basically four times the pixels of 1080p for a very, very small perceivable gain in almost all titles.

Like Hades said, until 4K prices normalize and things like the Oculus release, I would hold off and see.

For now I can see nothing better in terms of visuals and quality than 1440p with AA on, displayed via G-Sync... just pure heaven.
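The "four times the pixels" figure in the post above checks out exactly, and it also shows where 1440p sits in between:

```python
# Pixel counts for the resolutions under discussion.
uhd = 3840 * 2160      # 4K UHD: 8,294,400 pixels
fhd = 1920 * 1080      # 1080p:  2,073,600 pixels
qhd = 2560 * 1440      # 1440p:  3,686,400 pixels

print(uhd / fhd)       # 4.0  -> 4K is exactly 4x the pixels of 1080p
print(uhd / qhd)       # 2.25 -> and 2.25x the pixels of 1440p
```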
 

Blueberries

Reputable
Dec 3, 2014


I don't know what you're talking about. AA blurs the image and makes it look worse.

4K looks phenomenally better both in television and in rendering; it will be affordable next year and in everybody's Blu-ray player in 2017.
 


At which point this Radeon Fury series will matter about as much as the GTX 700 series does now.
 

Vlad Rose

Reputable
Apr 7, 2014


4K on Blu-ray requires a dedicated "4K Blu-ray" player, not the ones currently out. Blu-ray was a hard-to-adopt option for movie watching until it became standard on the PS3. Sony and Microsoft claim the PS4 and Xbone will be able to support 4K Blu-ray movies once they come out, but it might require a hardware revision/redesign.
 

Blueberries

Reputable
Dec 3, 2014
Actually, what I really want to see is the Fury X's power usage at 1080p/1440p with FRTC enabled, because that matters when you release a card with a 275 W board power rating.
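For reference, the power-saving idea behind a frame-rate target like FRTC can be sketched as a simple frame cap. This is illustrative only — the real feature lives in the driver, not in game code — but the principle is the same: the GPU idles between frames instead of racing ahead.

```python
# Minimal frame-rate cap: if a frame finishes early, sleep off the slack.
# The sleeping (idle) time is where the power saving comes from.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS   # seconds allowed per frame

def frame_capped(render, n_frames):
    """Run `render` n_frames times, never faster than TARGET_FPS."""
    for _ in range(n_frames):
        start = time.perf_counter()
        render()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)   # idle instead of rendering more
```

At 1080p, a card this fast would blow far past 60 fps uncapped, so the cap should translate directly into lower board power.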
 


youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
That's GREAT news. Why is it that the solution to every technical problem is to go vertical?

We needed more storage for SSDs, and Samsung went vertical with V-NAND. Now, to add more VRAM to GPUs, HBM goes vertical. Weird coincidence.
 