ATI Radeon 6000 series rumored to be released in October


blackpanther26

http://vr-zone.com/articles/-rumour...schedule-first-iteration-in-october/9688.html

Popular Turkish website Donanimhaber has released an expected schedule for the release of ATI's Radeon HD 6000 series. The first HD 6000 GPU to be released will be the Radeon HD 6700 series, codenamed Barts. The HD 6700 is scheduled for release as early as October. As suggested by the nomenclature, the HD 6700 will directly replace the HD 5700 series.

The HD 6700 release will be followed up by Cayman in November, expected to be branded as the ATI Radeon HD 6800 series, replacing the current HD 5800 series.

The flagship will be Antilles, branded as the ATI Radeon HD 6970. Antilles, as expected, will be a dual-GPU Cayman. While the HD 5970 lowered clock speeds relative to the HD 5870, the HD 6970 is expected to run at the same clock speeds as the HD 6870 - basically an HD 6870 in CrossFire, much like the HD 4870 X2 was. The Radeon HD 6970 is scheduled for December.
 

That could very well be the reasoning behind the rumored release delay for the top-tier cards - more time for the drivers to mature.

Personally, I'd welcome an un-unified driver package. This whole unification thing was nice at first, but just because a few new cards need some driver tweaking for performance issues within a new game doesn't mean old cards need some crap thrown into the mix that could potentially screw them up.
 

You talkin to ME? [:jaydeejohn]
 
I think there is something to read between these lines, from Charlie:
The reason for this can be summed up by saying that the new 'medium' shaders can't do what a complex one can in the same time, but there are more of them, and they can more than make it up in number. Since a GPU is a throughput machine, not a latency bound device, you won't see the difference, it will just work a lot faster in several kinds of operations.

There will probably be a pathological case or two that will be a bit slower, so look for the attack slide decks to float as soon as samples leak. Remember the Nvidia slides from CES about how Fermi was many times faster than HD5870 on a specific section of the Heaven benchmark? Remember how well that turned out in practice, and in sales? Wait for real benchmarks, and don't worry about the desperate sputters from the big green FUD cannon.
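
To put rough numbers on that throughput argument, here's a quick back-of-envelope sketch in Python (the unit counts and per-cycle throughputs below are made up for illustration, not real specs for any chip):

# Hypothetical illustration of "simpler shaders, but more of them".
# All numbers are invented for the example; they are NOT real GPU specs.
def effective_throughput(num_units, ops_per_unit_per_cycle, clock_mhz):
    # Aggregate operations per second across a bank of identical shader units.
    return num_units * ops_per_unit_per_cycle * clock_mhz * 1e6

# Fewer, more capable "complex" units.
complex_design = effective_throughput(320, 5, 850)
# Each "medium" unit does less per cycle, but there are more of them.
medium_design = effective_throughput(480, 4, 850)

print(f"complex design: {complex_design / 1e9:.0f} GOPS")  # -> 1360 GOPS
print(f"medium design:  {medium_design / 1e9:.0f} GOPS")   # -> 1632 GOPS
# A GPU is a throughput machine, so the wider-but-simpler design comes out
# ahead overall even though any single shader is individually less capable.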
He is already pre-empting the benchmarks that won't show increases. :)
I'm quoting Charlie because ATI - oops, I mean AMD - has not released any info, have they?
 

The game was PURPOSELY designed to push existing and even future hardware to its limits in order to give the best graphical experience possible. It is extremely well coded for that goal, groundbreaking even. The game came out three years ago; in terms of gaming technology that is a vast amount of time, and STILL it is the best-looking game around. That anyone can consider a monumental feat such as that to be "poor coding" is simply baffling.
 
I remember when the 2900 came out, then later its little brothers, and there was not much to be excited about on the red side of things.
We all sat waiting for the green team's new cards, and when the 9 series came out, people were unimpressed.
I'm thinking ATI won't do the same.
 


So Carmack, back in the Doom 3 days, was a very bad coder? If bad coders actually own a Ferrari Testarossa, pioneered the FPS genre, and created groundbreaking game engines, then by those standards a good coder must be God himself?

 

http://forum.beyond3d.com/showpost.php?p=1469271&postcount=1704
 



This is what Crytek executive producer Nathan Camarillo said in an interview about Crysis 2:

With Crysis 1 on PC only, you can kind of brute force it. "Well, just throw more hardware at it." That was the solution to making a better game. So, in some ways, in places, it was a little unfocused and not as tight as it could have been.

He goes on to say that because Crysis 2 is coming to consoles as well, it will have to be what he calls "tighter" because of the constraints of the platform.

Reads like he's talking about the coding to me.

http://www.joystiq.com/2010/04/09/video-interview-crytek-executive-producer-nathan-camarillo-on/

Mactronix
 


No, realising a perfectly optimised engine and tailoring your approach to the platform the game will run on is what they are aiming at with the console version, which is possible due to the fixed hardware of a console.

The fact that the executive producer of Crytek said "With Crysis 1 on PC only, you can kind of brute force it. 'Well, just throw more hardware at it'" seems to be passing you by, or you are choosing to ignore that the company themselves have admitted the game could have been coded better.

mactronix
 
From what I can tell, the 6 series is just a stop-gap between now and their next major discrete generation, so I wouldn't go expecting anything massively miraculous. It's just the typical parry and riposte between AMD and Nvidia: AMD know Nvidia are working on new Fermi parts, so they're pre-empting it by releasing a 'refresh' of existing hardware to drum interest back up, which will buy them time until they get the next generation out.

That's my take on it.
 




I think you misunderstood what the exec was talking about in response to the "watered down" question.


 


So how can you perfectly optimize a game engine on non-fixed hardware? You can't.

Optimizing performance rests on the user's ability (the graphics options?). Coincidentally, there is also this little button called "OPTIMAL SETTINGS" found in the game's system settings. I reckon it actually optimizes something.

I swear I would've made a demotivational photo that says "OPTIMAL SETTINGS BUTTON: Renders Your Argument Invalid" if I had the time.
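
Out of curiosity, here's a hypothetical sketch of what a button like that might do under the hood - score the detected hardware and map it to a preset (the scoring formula and thresholds are invented for illustration, not Crytek's actual logic):

# Hypothetical auto-settings heuristic; the scoring and the threshold
# values are invented for illustration.
def pick_preset(vram_mb, gpu_score, cpu_cores):
    # Combine a crude hardware score from GPU, CPU and video memory.
    score = gpu_score + cpu_cores * 100 + vram_mb // 10
    if score >= 3000:
        return "Enthusiast"
    if score >= 2000:
        return "High"
    if score >= 1000:
        return "Medium"
    return "Low"

# Example: a mid-range box of the era.
print(pick_preset(vram_mb=1024, gpu_score=1500, cpu_cores=4))  # -> "High"

A heuristic like this tends to be tuned conservatively, which would explain it recommending lower settings than the hardware can actually sustain.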


 

I'm ignoring nothing. You are simply misconstruing what he is saying. He specifically states that they were able to make the original Crysis that way because it was a PC game. That obviously is tailoring the coding to the platform, and the discussion is about how bringing the sequel to different (lesser) platforms is now making them use a different approach. It shows even in the setting they are using for the game: a relatively static cityscape is much easier to generate than realistically portraying the lush forests and plants of a jungle setting.

He is not saying the coding of the graphics for the original game was poor. He is talking about the methods they used to accomplish what they wanted to do. If you study programming you'll find that "brute forcing" is sometimes the only way to accomplish something; there are some things that simply need processing power, and clever coding just won't cut it.

They specifically chose to include graphics options at a level that requires powerful hardware to make the game "better." If you consider this a poor decision, that is your own opinion, but I'll point out the game got a ton of attention because of it, is still considered an essential test of gaming hardware, and gets talked about three years later (as this conversation shows) specifically as a result of this "poor" decision.

I'll also point out that the game actually scales very well with weaker hardware. Even low-end cards can handle the game if you use modest settings, and despite this it still looks very good, usually better than most current games on high settings. The problem only comes in for people who can't stand to run a game on anything but the highest possible settings... and frankly, altering the coding by releasing a lesser game to accommodate people with that issue is where "poor coding" would have come into play, imo.
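
For what "brute forcing" means in a rendering context, here's a minimal sketch (the scene data and falloff model are invented for illustration): instead of culling lights spatially, just evaluate every light for every fragment and let the hardware absorb the cost.

# "Brute force" shading: evaluate every light for every fragment with no
# culling. Simple and correct; the cost is paid purely in raw throughput.
def shade_fragment(fragment_pos, lights):
    # Accumulate a simple inverse-square falloff from every light.
    total = 0.0
    for lx, ly, intensity in lights:        # no culling: all N lights, always
        dx, dy = fragment_pos[0] - lx, fragment_pos[1] - ly
        dist_sq = dx * dx + dy * dy + 1e-6  # avoid division by zero
        total += intensity / dist_sq
    return total

lights = [(10.0, 5.0, 50.0), (3.0, 8.0, 20.0), (7.0, 1.0, 35.0)]
print(shade_fragment((4.0, 4.0), lights))
# Cost is O(pixels * lights), so a faster GPU directly makes it better -
# exactly the "throw more hardware at it" trade-off being discussed.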
 

I found the Optimal Settings button terrible: I would always try it and get the lowest recommended settings, but when I put the game at Enthusiast with 4xAA I get well above 40 fps.
Of course that's probably my fault more than the game's.
 


I never said you could optimise a game for a non-fixed hardware set-up; the very fact that I pointed out that it was possible due to the fixed hardware should have been enough for you to reason out that I know you can't do it.

Mactronix :)
 