AMD Fury X And Fiji Preview


Vlad Rose

Reputable
Apr 7, 2014
732
0
5,160


I've never had that problem occur in the last 15+ years with either ATI/AMD or Nvidia graphics in any of the machines I've built. If it's a recurring problem for you, I'd look into something else you're running.



On the rehash issue: Nvidia's 900 series may be a rehash of their 800 series, but ATI's 300 series is a rehash of their 200 series, which was itself a rehash of their 7000 line. I was hoping for newer tech than what's in my 7770 for their lower/mid-range cards.
 
The lack of HDMI 2.0 is a huge mistake. It forces people to buy monitors and restricts TV users to practically a single model, the Panasonic AX800, which I happen to own.

Still, I don't know if a DisplayPort to HDMI 2.0 adapter exists. It really should...
 

jamsbong

Distinguished
May 7, 2002
22
0
18,510
Beautiful water-cooled design! It is so compact, and yet it is the top-of-the-range GPU.
Good to see some competition returning. Nvidia will have to make some price cuts soon.

As usual, Radeon is the better value for money, while Nvidia is the king of performance.
 

Epif

Distinguished
Jan 11, 2015
13
0
18,510
No DVI. Great, so I would have to buy a $130 active adapter to use it with my dual-link-DVI-only 2560x1440 monitor. And I probably still wouldn't get the full 95 Hz I've overclocked the monitor to, even with the active adapter. Lame.
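For what it's worth, here's a rough pixel-clock sanity check (a sketch assuming standard CVT reduced-blanking timings; an overclocked panel may use tighter custom timings):

```python
# Rough pixel-clock estimate for 2560x1440 @ 95 Hz, assuming
# CVT-RB (reduced blanking) totals of about 2720x1481.
h_total, v_total, refresh_hz = 2720, 1481, 95
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
print(f"{pixel_clock_mhz:.0f} MHz")   # ~383 MHz

# Dual-link DVI is specified for 2 x 165 MHz = 330 MHz,
# so a 95 Hz overclock at this resolution is already out of spec
# even on native dual-link DVI.
print(pixel_clock_mhz <= 330)         # False
```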
 

HideOut

Distinguished
Dec 24, 2005
560
83
19,070
Tom's has gone to sheit. I don't mean the stories, I mean the posting. It loses your login information between posts, then freezes when you do post. I typed out the entire math for the amps/watts/volts for them (and yes, there is a typo in the story), and after THREE posts it's still not here.
 
No DVI. Great, so I would have to buy a $130 active adapter to use it with my dual-link-DVI-only 2560x1440 monitor. And I probably still wouldn't get the full 95 Hz I've overclocked the monitor to, even with the active adapter. Lame.

Seriously, you have a four-year-old monitor, yet you want the top-of-the-line card...
 

logainofhades

Titan
Moderator


I would wait on partner cards before complaining too much. Reference models normally differ a good deal from actual retail items.
 


You realize the card will probably come with an adapter. I'm also not sure why you would need an active adapter; active adapters are only needed for running more than two screens in Eyefinity mode.
 

royalcrown

Distinguished
In the USA, houses average only 100-amp service; older homes may have 60 amps, and newer houses 200 amps. No house has 400-amp service unless you've got friends at the power company or your own power source.

It can't be 400 amps. That's more than your whole house's service!!!
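To put a number on it (a quick illustrative calculation, assuming a 120 V US wall outlet):

```python
# What a literal 400 A draw would mean at 120 V wall voltage:
print(400 * 120)  # 48000 W -- 48 kW, far beyond any consumer PSU or circuit
```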
 

Steakhaus

Reputable
Apr 25, 2014
5
0
4,510
I have a G-Sync monitor and a GTX 970, which is exactly what I need for 1080p. However, I'm a fan of competition, so I'm really happy that the Fury X was given such a nice cooling solution. Overengineering for overclocking really shows dedication to the statement "AMD's mission is to provide the best value."

With all the talk about VR claiming resources, I wonder if this is something DX12 is supposed to help with. My Mini-ITX build only supports one card. So how many resources are required for a smooth VR experience? The GTX 970 came out as "recommended" for Oculus, but I really like what Valve is doing and assume it will take more power than Oculus.
 

Vlad Rose

Reputable
Apr 7, 2014
732
0
5,160
Interesting fact: the original Rage Fury line that ATI did was rather a flop compared to the competition at the time. That probably explains why they eventually dropped the Rage name and went with Radeon.

Let's hope it isn't true this time with their Fury cards.
 

blackbit75

Distinguished
Oct 10, 2010
49
0
18,530
A 400A typo? Is it 40A or 400W?
40A is a current (charge per second): I.
400W is a power value: P.

P (power) = I (current) x V (voltage, electric potential)
P = I x V

400W = 40A x V, so
V = 400W / 40A = 10 volts.
So a 400W draw at 40A implies a 10V supply.

Normally at home we talk about power at 220V, so the current for 400W at 110V or 220V is far lower than for 400W at 10V. Keep in mind that when the voltage isn't constant, as at home, the formula changes a bit, but the idea is there.
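A minimal sketch of the same arithmetic (the helper names here are just for illustration):

```python
# P = I * V, rearranged both ways.
def voltage(power_w: float, current_a: float) -> float:
    """Voltage implied by a given power draw and current."""
    return power_w / current_a

def current(power_w: float, voltage_v: float) -> float:
    """Current drawn at a given power and supply voltage."""
    return power_w / voltage_v

print(voltage(400, 40))   # 10.0 V  -> 40 A only makes sense on a low-voltage rail
print(current(400, 12))   # ~33.3 A on a 12 V rail
print(current(400, 220))  # ~1.8 A at a 220 V wall socket
```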


 

Giroro

Splendid
The AMD website says that the R7 370 won't support FreeSync or TrueAudio, but the R7 360 will. What gives? I guess it really IS a rebranded R7 265 (and, by extension, a rebranded HD 7850).

Uncool.
 

tomc100

Distinguished
Jul 15, 2008
166
0
18,680
Benchmarks are finally out!!!

http://www.forbes.com/sites/jasonevangelho/2015/06/18/amd-radeon-fury-x-benchmarks-full-specs-new-fiji-graphics-card-beats-nvidias-980-ti/
 

Arabian Knight

Reputable
Feb 26, 2015
114
0
4,680
With the new HBM design, AMD can put 4 GPUs on one card, guys... If they do this, Nvidia will be in trouble! They can't make a 4-GPU single card at all; there's no space!
 

I agree with you. AMD should have done a few things on these cards that they have not done.

Overall, after reading the benchmarks on the 300 series, I'm pretty disappointed. I'll stick with my GTX 770 for another year.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


There is a difference in that AMD advertises UP TO a given clock, so if you get a crap chip, it may not perform well; you never know. Nvidia advertises that you will ALWAYS get X, but maybe MORE. There is a big difference between "you MIGHT get what you paid for, if it doesn't throttle" (and apparently only the water-cooled card is guaranteed THROTTLE FREE at box speeds by AMD) and "you WILL GET AT LEAST THIS before any throttling" (all Nvidia cards to date).
 

spagalicious

Distinguished
It will be very interesting to see benchmarks of how the Fury X compares directly with the 980 Ti at stock clock rates. To me, OC comparisons cannot be trusted, since chip-to-chip variance in overclocking headroom makes the statistics unreliable. The only benchmarks I've seen for the Fury X included a 100 MHz OC, beating the 980 Ti by 4-10%.

Only the reviews will show how well HBM handles a substantial overclock.
 
Just read the Forbes AMD Fury X benchmarks. These numbers come from the AMD press package. It beats the 980 Ti in each of the dozen games, and just barely in the one synthetic, 3DMark Fire Strike Ultra (4K). That card does look like it's a winner. But these are AMD's numbers. The embargo on press-generated benchmarks is still on until the 24th. And they only benchmarked 4K numbers.

Is there a reason for that? I guess we should know in a week.

I will relink this to save you from scrolling back up...
http://www.forbes.com/sites/jasonevangelho/2015/06/18/amd-radeon-fury-x-benchmarks-full-specs-new-fiji-graphics-card-beats-nvidias-980-ti/
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


If Nvidia can build a 1200W GPU on a card and cool it, AMD will be in real trouble... LOL. They'll sell one of them, and AMD will massively lose market share to this $10,000 monster card... ROFL.

BTW, NV gets HBM2 with Pascal, so what's your point? You act as though HBM is AMD-only forever, while in reality NV has it in a few more quarters, by the time it's actually (maybe) needed by a GPU that can outstrip today's bandwidth (this generation doesn't have a bandwidth problem, but a die shrink to 16nm/14nm could create one). By the time that fantasy ships from AMD, NV has one too? OK... Let me know when a 4-GPU card (from either side) can do ~300W. Get back to me at ~7nm or less... LOL. Not trying to be mathematically accurate here, but you get the point.
 

pjones78

Reputable
Jun 2, 2015
6
0
4,510
The lack of HDMI 2.0 is a huge mistake. It doesn't matter that most people may game on monitors right now. A growing number are turning to gaming on their TVs, or buying TVs as monitors for their computer rooms, and AMD just told those people to shove off.

HDMI 2.0 is not new, so there was NO REASON not to include it other than that they simply didn't care enough. For a company that is being outgunned more than 2-to-1 in the industry, you would think they would WANT to entice as many customers as possible rather than shut out a growing percentage of them.

Not having HDMI 2.0 is even more comical for Project Quantum, since that is clearly designed for TV/living-room use. Without it, they are expecting people to buy souped-up all-in-one mini PCs that neuter gaming performance for the very people they are targeting.

I had big interest in this card and was willing to take a chance on AMD's weaker driver support, yet AMD has just told me to stay with Nvidia and get a 980 Ti, which I will do.

What a boneheaded decision on their part.
 