AMD Radeon HD 7770 Pics, Specs and Benchmarks Leaked

Status
Not open for further replies.
So I'm guessing (without having to do too much digging due to laziness) that this card will perform somewhere around or below the HD 6850?
 
[citation][nom]sseyler[/nom]So I'm guessing (without having to do too much digging due to laziness) that this card will perform somewhere around or below the HD 6850?[/citation]
Below a 6850 and above a 6770; a "6830", to be more precise.
 
@sseyler

My thoughts exactly :)

I went and looked at my own Sapphire 6850's 3DMark 11 score and it was a bit over 3600, so this new 7770 should be a worthy upgrade for anyone running a 5770/6770 card. With a little overclocking it should be right up at the 6850's speed!

Now, if the price is right ... maybe around the $100 to $120 mark ... it will be a very solid performer!
 
I was going to buy a 6850 this March on my trip to the US; I guess this one is going to be a better choice for the money... but only if the price comes in around $120-$130.
 
[citation][nom]Benihana[/nom]My GeForce 2 MX 440 just burned out. Guess this could be a good replacement.[/citation]
I don't think they make the 7770 in AGP, buddy.
 
AMD's new cards are complete failures, simply because they don't represent a true generational leap. The new 7850 should be faster than the existing 6850, especially if it's going to initially retail at roughly the same price. AMD has lost their bang-for-the-buck advantage in the video card market, and unless NVidia changes things up in a hurry, my next graphics subsystem will probably end up being integrated Intel in 2014 or so.

Voodoo Rush->Voodoo 3->overpriced crap->6800gs->8800gt->HD6850->overpriced crap so far. Show me the next 8800gt, please, or lose me forever.
 
[citation][nom]Achoo22[/nom]AMD's new cards are complete failures, simply because they don't represent a true generational leap. The new 7850 should be faster than the existing 6850, especially if it's going to initially retail at roughly the same price. AMD has lost their bang-for-the-buck advantage in the video card market, and unless NVidia changes things up in a hurry, my next graphics subsystem will probably end up being integrated Intel in 2014 or so.Voodoo Rush->Voodoo 3->overpriced crap->6800gs->8800gt->HD6850->overpriced crap so far. Show me the next 8800gt, please, or lose me forever.[/citation]

I disagree. So far, the performance has been a convincing increase. Do you need a 100% improvement to call it a generational leap? If so, you will be disappointed with every generation. If the 7770 is almost as fast as a 6850, it stands to reason that the 7850 will be faster. How much faster, and at what price point, is the real question. Also, they are not concerned about losing you at all.

With the 6850 at ~$150, this card should be in the ~$90-100 range, assuming the 7850 will take over the ~$150 price point.
 
An approx. $100 price point is the sweet spot for a LOT of people; expect a large number of gamers to flock to this card, because it should have plenty of clout without breaking the bank.
 
Is it me, or is the TDP a bit weird? The HD 6850 has a 127 W TDP and is made on 40 nm technology.

This HD 7770 is built on 28 nm technology, performs slightly worse, and its TDP is only 27 W less? Is it because it has much better tessellation, or am I missing something?
 
Just a question: what kind of performance do you think The Witcher 2 would get on this card? I'm not saying I want to max it out, but I want large textures and I want it to play at 1920x1200 without frame rate and mouse lag problems.

I'm asking this because The Witcher 2 is the only game I can't get reasonable frame rates out of unless I go to the lowest settings for that resolution, and the low-resolution textures are... to put it lightly... reminiscent of the N64 era.

So you know, my card is an HD 5770, and I do not want to spend much over $100 on a new GPU considering it's only one game I'm getting it for... if it were anything but The Witcher 2, I wouldn't even consider the upgrade; I'd call the game dead to me until my GPU failed and I was required to get a new one. I have done that with a few games in the past.
 
[citation]Is it me, or is the TDP a bit weird? The HD 6850 has a 127 W TDP and is made on 40 nm technology. This HD 7770 is built on 28 nm technology, performs slightly worse, and its TDP is only 27 W less? Is it because it has much better tessellation, or am I missing something?[/citation]

Better tessellation, reduced die size, higher clocks; the list goes on, mate.
 