AMD's Radeon HD 4870 X2: R700 First-Look


Mathos

Distinguished
Jun 17, 2007
[citation][nom]Trinix[/nom]Your eyes can only see about 30 frames. So even if your adrenaline gets going, I doubt you will really notice the difference between 70 and 90. Maybe if you watch the FPS counter on your screen and see it increase, your mind is making double images and blending them together better, but it's better to have a good FPS that doesn't increase or decrease a lot.[/citation]

Wish I'd seen this to quote before I commented. Actually, your eyes have no limit on what speed they can see; they're just optical instruments that receive light patterns. It's your optic nerve, and the area of your brain that processes the light data, that can only handle images at up to about 30 fps under normal circumstances. That's why movies shown in theaters at 24-30 fps don't appear to flicker.

Glitch is right, though. Once the fight-or-flight response kicks in and the adrenaline starts flowing, it changes the way the nerve gaps in your brain work, allowing your brain to process information at a much accelerated rate. That's why, when you're really afraid or in dire situations, the world around you appears to slow down. In reality it hasn't; your brain is just taking in sensory information much faster. Once you learn to control your adrenaline response, you can start to use it in situations like paintball. It's real fun when you can start watching the paintballs flying at you. Of course, that doesn't mean you can get your body to dodge any faster. Well, sometimes.
 
Guest
What's up with the tiny diagrams? I can hardly decipher anything.

The test setup also seems to be missing vital info. What are the games' settings? You used to write it all out in detail.
 

Arbie

Distinguished
Oct 8, 2007
Thanks for the first look at this. But **WHY** do you keep shifting card positions between the graphs?? After taking all that data, this is one of the most counterproductive things you could do in presenting it. It makes it almost impossible to see how a card performs across a range of apps and requirements. Please - pick an order and STICK WITH IT.
 

nickchalk

Distinguished
Aug 6, 2008
Dear Vodator_21,
1. It is not possible to show the same tests with an AMD configuration.
2. It is not possible to show the same tests with at least 8xAA/16xAF and soft shadows, dynamic lights, plus a description of the other visual options.

Simply because then they would not be the same tests.
I also think the results would not be the same either (much lower).

Also, the article says FIRST LOOK.

And I don't think anybody cares about how you do your shopping.

Anyway, good review, good card, but the price...
Suddenly AMD wants more?
 

shachar2

Distinguished
May 2, 2008
When will they start working with Microsoft and produce graphics cards that have extra-low power/noise states for desktop-only mode? Instead of turning off the card and letting the onboard graphics take over, build a function into the card in the first place that enables it to drop to a low-performance mode.
 

el fiendo

Distinguished
Jul 28, 2008
What I personally like, spaztic, is that when Thurin warns them to save their money for a bit because nVidia will hit back with an answer (not necessarily one that will trump the X2, but one that will invariably drive down prices), they say "wtf 'nvidiot' durr". Really? I dunno, I kinda like my money. Good post by the way, Thurin, though I'd say don't try to educate the masses; they're just going to swarm in the direction of the rest of the sheep herd.
 

sithkiller

Distinguished
Aug 12, 2008
For the follow-up review of this card, it'd be great to see a section on aftermarket heatsinks/fans. Not necessarily performance results using different ones, but a mention of which types fit this card, since that would be one way for a user to cut down on some of the noise and heat it generates.

While this card may be happy with its operating temps, I try to reduce noise/heat wherever I can so the overall case temp is lower, for the betterment of all my other components.

Thanks!
 

MrAv8er

Distinguished
Dec 23, 2004
With the introduction of this card at 2 GB (2x 1 GB), and with Vista all but requiring 2 GB, it seems to me that 32-bit OSes have now hit the wall. Is this finally going to push the game makers into the 64-bit camp?
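
(Editor's sketch, not the poster's: here is the back-of-the-envelope arithmetic behind that "wall". The aperture sizes below are illustrative assumptions; how much of a card's VRAM and MMIO actually gets mapped into CPU address space varies by board and BIOS.)

\[
\underbrace{2^{32}\,\mathrm{B}}_{\text{4 GiB total address space}}
\;-\; \underbrace{A_{\mathrm{VRAM}}}_{\text{GPU aperture}}
\;-\; \underbrace{A_{\mathrm{MMIO}}}_{\text{chipset, other PCI devices}}
\;=\; \text{RAM still visible to a 32-bit OS}
\]
% Hypothetical numbers: a 1 GiB GPU aperture plus 0.5 GiB of other MMIO
% leave 4 - 1 - 0.5 = 2.5 GiB of system RAM addressable, barely above
% the 2 GB Vista wants. Hence the "wall".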
 

doomsdaydave11

Distinguished
Oct 16, 2007
[citation][nom]Tipmen[/nom]Good job, ATI/AMD! Really good card you have here; let's hope it's cheaper than the GTX 280. I'm glad I waited it out; now I can replace my old setup. Now I will get a new X48 board and two of these puppies, and I'm sure I need a better power supply; I don't think a 750-watt will do it.[/citation]
Are you kidding? People are able to CrossFire 4870s no problem on a 550-watt unit.

HD4870FTW!! gj DAAMIT!
 
Guest
To deadilest - "There are no games that need this"

That is how the software-hardware relationship works. It doesn't make sense to release software that demands hardware that doesn't exist. For one, you can only market what you can show off; spending money on detail beyond that is a waste. Second, the higher the hardware requirements, the smaller your market.

On the other hand, hardware will sell even with no usable software yet. People will want something that will run the best games for several years, and a card that does things other than gaming as well. Not even the GTX 280 can render the complete AutoCAD 3D piping system for a factory without lag. Once there are enough people with cards like these, games will come out that can use them. I am sure some are in the works now.
 

spaztic7

Distinguished
Mar 14, 2007
OK, for those still wanting to get one on release day: Newegg is out of the Sapphire, but check out mwave.com and provantage.com. Provantage has it cheapest, but I don't trust them as much. Mwave is approved by ATi, has it in stock, and is cheaper than Newegg (as of 2:46 EST 8-12-08).
 

Aviator64

Distinguished
Aug 12, 2008
[citation][nom]warezme[/nom]It's two GPUs, even though it's one card with twice the memory of the older model (hence CrossFire in a fancy package, illustrated by their lack of memory sharing). Wouldn't it have made more sense to compare it to two GTX 280s, if you wanted apples to apples?[/citation]

That would not be apples to apples: you can't put two 280s in one PCI-E slot.
 

spaztic7

Distinguished
Mar 14, 2007
I agree with Aviator64.


The point isn't that it's two GPUs; the point is that it's under one HSF. It is one card, one board, with two GPUs.

It's like Intel's current quad cores: just two dual cores under one heat spreader.
 

jimr9999us

Distinguished
Mar 13, 2008
Nice. nVidia fanboy here, unashamedly, but two weeks with an Asus 4850 have left me pleased.

It's simply that important to all of us as gamers to have two companies battling it out for our GPU dollar. The release of the HD 4870 X2 is the gaming news of the year, comeback style. Congrats, AMD/ATI. *raises a glass*
 

NeoData

Distinguished
Aug 12, 2008
[citation][nom]El Fiendo[/nom]What I personally like, spaztic, is that when Thurin warns them to save their money for a bit because nVidia will hit back with an answer (not necessarily one that will trump the X2, but one that will invariably drive down prices), they say "wtf 'nvidiot' durr". Really? I dunno, I kinda like my money. Good post by the way, Thurin, though I'd say don't try to educate the masses; they're just going to swarm in the direction of the rest of the sheep herd.[/citation]
True, I liked that about his post. What I found strange was that he seemingly found the results at lower resolutions more important than the higher resolutions this card is intended for.
If he really wanted to educate people, he should have told them to save their money and go for a mid-level card if they're going to play at resolutions that low.
 

NightLight

Distinguished
Dec 7, 2004
All this talk about winning at high resolutions... focus on resolutions that people actually _use_. Also, I must agree with vorador: let's see how it performs with AA and all the perks turned on.
About power: if you buy a high-end card, are you really interested in how much power it consumes? I mean, you just bought a $600+ card.
I hope Nvidia strikes back soon. Let's see a war, so prices can drop for the consumer ;)
 
Guest
Hey guys, gonna build a system. What do you think is better? I have a motherboard with three PCI Express x16 slots.

Should I buy three GTX 280s, three 4870 X2s, or three 9800 GX2s? I need this system to be powerful enough to run games for at least the next two to three years. Would this setup be capable of that? My current video card is a 6800 Ultra with two 3.6 GHz 1 MB Xeons. The chips are stable at 4 GHz; they're actually 2.4s that run at 3.6 GHz normally and 4 GHz when I need to do some hardcore gaming. However, it's time to make the leap to a new system.

Thanks!
 

Malovane

Distinguished
Jun 17, 2008
[citation][nom]P4Northwood[/nom]Hey guys, gonna build a system. What do you think is better? I have a motherboard with three PCI Express x16 slots. Should I buy three GTX 280s, three 4870 X2s, or three 9800 GX2s? I need this system to be powerful enough to run games for at least the next two to three years. Would this setup be capable of that? My current video card is a 6800 Ultra with two 3.6 GHz 1 MB Xeons. The chips are stable at 4 GHz; they're actually 2.4s that run at 3.6 GHz normally and 4 GHz when I need to do some hardcore gaming. However, it's time to make the leap to a new system. Thanks![/citation]

4870 X2s only have one CrossFire connector, so you could really only use two of those. I suppose it is possible to put a single 4870 between them, as it has two connectors... but seriously, two 4870 X2s in quad CrossFire is enough.
 