ATI Radeon 6000 series rumored to be released in October


blackpanther26

http://vr-zone.com/articles/-rumour...schedule-first-iteration-in-october/9688.html

Popular Turkish website Donanimhaber has released an expected schedule for the launch of ATI's Radeon HD 6000 series. The first HD 6000 GPU to be released will be the Radeon HD 6700 series, codenamed Barts. The HD 6700 is scheduled for release as early as October. As the nomenclature suggests, the HD 6700 will directly replace the HD 5700 series.

The HD 6700 release will be followed up by Cayman in November, expected to be branded as the ATI Radeon HD 6800 series, replacing the current HD 5800 series.

The flagship will be Antilles, branded as the ATI Radeon HD 6970. Antilles, as expected, will be a dual-GPU Cayman. While the HD 5970 lowers clock speeds relative to the HD 5870, the HD 6970 is expected to run at the same clock speeds as the HD 6870 - basically an HD 6870 in CrossFire. This will be much like the HD 4870 X2. The Radeon HD 6970 is scheduled for December.
 


I like you, Ares, but you can't honestly think that is reasonable. I can't believe I have to do this again...

5870 vs GTX 480:
[power consumption charts: idle and load]


Idle delta: 26 W
Load delta: 102 W

Average energy rate: 15 cents/kWh (a high energy rate, actually)
http://www.eia.doe.gov/electricity/epm/table5_6_a.html

Gaming time (assuming Crysis-level load, which it won't always be): 4 hours/day (meaning the user has no life, and likely no job)
Idle time: 12 hours/day
Off time: 8 hours/day

Calculator:
http://www.handymath.com/cgi-bin/electric.cgi?submit=Entry

Cost per year of total gaming time (1460 hours): $22.34
Cost per year of idle (4380 hours): $17.08

Extra cost for a no-life gamer who only plays Crysis and never any less demanding games, who pays a lot for electricity, and NEVER goes a day without gaming: $39.42.

A REAL person will never game that much per year, and most people pay less for electricity than in the above example. Even with this worst-case scenario we can't top $40 extra a year, which is still pretty insignificant.
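If anyone wants to check or rerun the math, here's a quick Python sketch of the same calculation - nothing new, just the deltas and the 15 cents/kWh rate from above plugged in:

Code:
# rough sketch of the yearly-cost math above (deltas from the charts, 15 cents/kWh)
idle_delta_w = 26          # extra idle draw of the GTX 480 over the HD 5870, watts
load_delta_w = 102         # extra Crysis-load draw, watts
rate = 0.15                # dollars per kWh

gaming_hours = 4 * 365     # 4 h/day, every day = 1460 h
idle_hours = 12 * 365      # 12 h/day = 4380 h

gaming_cost = load_delta_w / 1000 * gaming_hours * rate   # ~$22.34
idle_cost = idle_delta_w / 1000 * idle_hours * rate       # ~$17.08
print(round(gaming_cost, 2), round(idle_cost, 2), round(gaming_cost + idle_cost, 2))  # ~$39.42 total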
 
Here's how I did it, using those energy consumption measures, which seem about right. I think 11-12 cents per kWh is the national average? Sure, why not. Let's say the average user does 4 hours load, 12 hours idle, and 8 hours in between, and call that in-between time a 50 W difference between the two - a nice middle ground, eh? So you have 408 Wh at load, 312 Wh at idle, and 400 Wh for the in-between time. This is based on everything you said and on average computer use. For those who don't believe the GPU gets some watts pumped its way during, say, internet browsing or rendering, think again.

In any event, do the math for one day and we arrive at 1120 Wh more used by the 480 than by the 5870. I went more in depth last time for efficiency numbers, but let's just call it an even 80% for the sake of time, so 1120 Wh at the card ends up around 1344 Wh from the wall - and 80% is even generous for some PSUs. Anyway, do the math: 1344 Wh x 365 days = 490,560 Wh a year, or about 490 kWh a year, or $56.41. These numbers are actually more generous than some of those you gave, but this has been proven. So let's just say $50 a year. Considering most will keep these cards 2-3 years at least, that's a savings of $100-150 over the life of the card.

Now I don't usually bring this up, as it's boring and the realistic math takes me a while. This doesn't really matter to most; the energy consumption of the 480 doesn't matter as much to me as the heat (what's a PSU for, anyway?), but that doesn't make it a reasonable amount. Oh, BTW, this estimate is for a computer that you don't turn off, which most gamers, including me, don't.
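If you want to poke at my numbers, here's the same daily/yearly estimate as a rough Python sketch; the 50 W "in between" figure and the flat 1.2x wall-draw multiplier are my own assumptions from above (strictly, 80% efficiency would be a 1.25x multiplier):

Code:
# same estimate as a script: 4 h load, 12 h idle, 8 h "in between" at a 50 W gap
load_wh = 4 * 102          # 408 Wh
idle_wh = 12 * 26          # 312 Wh
between_wh = 8 * 50        # 400 Wh (assumed gap during light use / browsing)
daily_wh = load_wh + idle_wh + between_wh        # 1120 Wh at the card per day
wall_wh = daily_wh * 1.2   # rough PSU-loss multiplier (strict 80% efficiency would be x1.25)
yearly_kwh = wall_wh * 365 / 1000                # ~490 kWh per year
print(round(yearly_kwh * 0.115, 2))              # ~$56 at 11-12 cents/kWh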
 
BTW, if you notice a FLAMING error in my calculation, please point it out - I'd hate to spread false info - but this is using all the averages and knowledge that I have. This is a fairly average enthusiast, too; believe me, I know people who will have this thing at load far more than 4 hours. The thing people question most is that 50 W of in-between time. There was an article about exactly that; I'll try to dig it up, but especially with CUDA there is most definitely a normal-usage draw above idle.
 
Your calculations make absolutely no sense. 50 W on average? You can just go by load and idle - yes, while web browsing and such your GPU is at idle. Most gamers don't turn off their PCs? Says who? That is complete BS, and it's just stupid to let your PC idle while you sleep every day. Efficiency? There won't be any major change in efficiency between the two setups, so why bring it up? It's constant for both.

And yes, folders will have a much higher cost difference, but the GTX 4xx series SPANKS the 5xxx series in folding performance, and those people already realize how their electricity bills will go up.

I used standard figures and a real calculator, not some half-baked equation.

Dude no, just no.
 
This whole argument that one video card - especially one high-end gamer card versus another - is going to be a game changer for your utility bill is ridiculous.
I just updated two bathrooms in my house, going from 3-bulb vanities to 4-bulb. That's an extra 60-watt bulb in each bathroom that goes on with every use, times two!
I can't sleep at night anymore!

If your hobby is gaming and you can put aside 10 hours a week while working a full-time job, you're probably lucky. Sure, all of us sometimes go on binges, for any hobby.
Buying a V8 sports car has implications if you drive 100 miles round trip to work every day. In car forums I don't see pathetic MPG rants next to 0-60 times.
rofl
 


 
OK, I used 26 watts for 12 hours - let's say while you're sleeping. I leave my computer on all day; I'm not sure about everybody, but the majority of people I know do. I'll try very hard to find the test, but it found the GPU and CPU are VERY rarely at full idle, and with hardware acceleration and CUDA the 480 is even less so. Like I said, I used the idle numbers, but for the majority of the day when you are browsing or doing work, the 480 will use 40-50 more watts, versus 30 more at full idle. I don't see what's hard to believe about that. On efficiency, I was just saying that in a more in-depth version I used differing efficiency for idle and load, as the efficiency does change, even if by small amounts. This time I didn't, and just used 80% as a constant. Folding never really came into this.

Please point out anything you see - like I said, flaming errors - but everything I have done uses standards, averages, and a tiny amount of estimation. If it makes you feel any better, BEST case scenario: your computer is magic and runs at perfect idle when doing normal things. Run the numbers again with 4 hours load and 20 hours "idle" (now THAT'S outrageous) and you get 408 Wh at load and 520 Wh at idle. 928 x 1.2 is about 1114 Wh a day, or roughly 406 kWh a year. 406 x $0.12 and we are sitting at $48.72.

I'll even throw the PERFECT scenario in there: you shut your computer off for 8, maybe 10 hours a day? 9 - 9 sounds nice. OK, so 9 x 26 = 234 Wh saved at the card (about 281 Wh at the wall), even though it will still draw power while off, and the power needed to turn on is actually a LOT; sure, I'll leave those out. Anyway, 832.8 Wh is the ABSOLUTELY PERFECT, ENTIRELY BS amount that the 480 would take per day over the 5870, or about 304 kWh per year. Do the math and you get AN ABSOLUTE MINIMUM of $36.47 a year.

I put all the numbers out there, feel free to call them out. Either way, it's $36.47 a year in a perfect and entirely unrealistic scenario. If you want, say it's $35-55 a year. With a product lifetime of 2-3 years, you're looking at $70-105 minimum, and $110-165 realistic. I've used all good math, all real numbers, and accounted for every possible way of using your computer. If you notice anything wrong, please tell me.
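Since the scenarios keep multiplying, here's a small Python sketch of the two cases above so anyone can plug in their own hours; the deltas, the rough 1.2x PSU factor and the 12 cents/kWh rate are just the numbers I've been using, not anything official:

Code:
# the two "best case" scenarios above, so you can plug in your own hours
def yearly_cost(load_h, idle_h, load_delta_w=102, idle_delta_w=26, psu_factor=1.2, rate=0.12):
    # extra Wh per day the 480 pulls over the 5870, scaled for rough PSU losses
    daily_wh = (load_h * load_delta_w + idle_h * idle_delta_w) * psu_factor
    return daily_wh * 365 / 1000 * rate

print(round(yearly_cost(4, 20), 2))   # never turned off, perfect idle otherwise: ~$48.7/yr
print(round(yearly_cost(4, 11), 2))   # off 9 h a day on top of that: ~$36.5/yr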
 


I was thinking similar things, since there is no die shrink.
AMD is going for a bigger performance footprint with the 6770, whether it competes with the GTX 460 or not.
Regarding the power debate above:
They are going to replace a part that uses 107 watts with one that needs two 6-pin connectors, or 150+ watts. Is this acceptable?
IMO, it is. It's the cost of the card that decides whether it's a success. Everything else is just playing with market position.
 


If you're referring to the 6770 taking more power than the 5770, then of course it's acceptable. Not only is the 6770 set to give 5850-like performance - so it's technically better fps/watt - but it also sits a lot higher, relatively, in the rest of the lineup. 57xx to 58xx was a massive gap in performance, pathetically filled by the 5830 and beautifully exploited by the GTX 460. With its current specs, this 6770 seems to be a lot closer to the rest of the pack, and AMD even said that was one of their goals with it; it's not in the 5770 performance bracket anymore. It's with the 460 and the 5850, which both use the same power or more. So IMO, it's very acceptable - UNLESS it uses more power than, say, a 5850/5870 and performs like a 5830, which seems doubtful but is still possible.
 
Sigh, notty, this is exactly why I just did that. 60 watts at load? Yeah, try 2x that. Seriously, like I said, I don't have a terribly big problem with the power - it's BAD, but it isn't what makes Fermi "Thermi." The heat is what makes Fermi Thermi, and Thermi is bad :lol: . All I want is for people to give the right info about power usage. Most people think power usage is just for kicks and don't realize it can cost them $100-150 over the usable lifetime of the card. And that's a cheap 470; they come and go every once in a while, but there ARE 5850s for less, so it isn't cheaper:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125318&cm_re=5850-_-14-125-318-_-Product

So like I said, give good info and I have no problem. :)
 
PLEASE, if you are going to say I'm wrong, show me how I'm wrong and stop just complaining that I'm wrong. I've shown every possible power usage, from turning it off half the time and idling most of the rest, to a normal gamer. So PLEASE, show me, oh wise one, how I'm magically wrong - point it out, and don't just say "you're wrong." Because if you can't, I'm not, and you're the one who's wrong saying the 470 is cheaper than the 5850. I don't know if you missed the 5850's price or you are just advocating for NV right now, but I proved how you were mistaken; now you prove how I am. 😉
 
:lol: :lol: :lol: Good point ya got there. ATI pulled an Intel in a way: they saw they had by far the better lineup at the time and jacked up the prices to whatever they wanted. The people who bought the 5xxx at release were lucky, as prices started VERY low.
 
I'm not sure why I still care but here goes:

YOU CANNOT GUESS.

Idle =/= 0% load.

Idle = idle clocks. You show me one non-CUDA-based activity that takes your GPU out of the idle/2D state and then we can talk. For reference, my HTPC's 5670 idles during 1080p Blu-ray playback with my codecs, while my CPU (E6750 @ 3.2GHz) is at 80%.

The bottom line is that both cards downclock so severely that it is EXTREMELY hard to believe you would gain more than a few watts while web browsing. A figure of 40-50 W is just ludicrous.

Efficiency doesn't enter into it; you can't factor it in because every PSU has a different efficiency number, and whatever that number is, it's the same regardless of GPU.

Also, if you keep your computer on while you sleep then you have no right to b*tch about power consumption, because you're wasting electricity for no other reason than laziness. And if you have your computer doing something ALL DAY LONG, then your GPU's power consumption doesn't matter much.

Also, as notty pointed out, you aren't likely to hit more than 12 hours a week of gaming, so my 4 hours of gaming a day EVERY day is THE WORST CASE SCENARIO.

Let's take your usage under scrutiny with a real calculator.

Gaming per week = 12 hours (pretty high)
Idle per week = 153 hours (never shut off, which is plain stupid)

Average energy cost: 13 cents/kWh

Deltas:
Idle: 30 W (more than enough for your "never idle" argument)
Load: 102 W (Crysis load, which is higher than most games)

Cost for your gaming: $8.27
Cost of your ridiculous pseudo-idle: $31.03

Total: $39.30

Now let's see how much you could save by shutting your computer off 8 hours a night, 5 nights a week:

Idle draw of the whole system with a 5870: about 170 W

Idle hours: 153 - (8 x 5) = 113 per week; 113 x 52 = 5876 per year

7956 idle hours (PC NEVER shut off all year) - 5876 (PC turned off 8 hours a night, 5 nights a week) = 2080 hours

At 13 cents/kWh, the money you'd save: $45.97 a year.

You want to keep complaining about GTX 480 power consumption? Note that if the user shuts off their PC 8 hours a day all year long, their yearly extra cost only drops by about $11 compared to yours.

The point is that even $50 a year is nothing; it's less than a dollar a week more. People throw away more than that in change every week.
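And so nobody has to trust my arithmetic, here's the same weekly calculation as a short Python sketch; the 153 idle hours, the 13 cents/kWh rate and the ~170 W whole-system idle figure are the same assumptions as above:

Code:
# weekly version of the numbers above
rate = 0.13                         # dollars per kWh
gaming_h = 12 * 52                  # 12 h/week of gaming
idle_h = 153 * 52                   # 153 h/week idle, PC never shut off
gaming_cost = 102 / 1000 * gaming_h * rate      # ~$8.27
idle_cost = 30 / 1000 * idle_h * rate           # ~$31.03
print(round(gaming_cost + idle_cost, 2))        # ~$39.30 extra per year for the 480

# what shutting the whole ~170 W idle system off 8 h a night, 5 nights a week saves
off_h = 8 * 5 * 52
print(round(170 / 1000 * off_h * rate, 2))      # ~$45.97 per year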


 
OK, I'm just going to say I respect your figures, as they are right in the same ballpark as mine ($35-55), and suggest we stop arguing over something that doesn't matter. Just for the record: if idle is 30 watts saved on the 5870 side, and even if it's only a few more watts as you say - which with the new hardware acceleration it really isn't - then something like 35 or 40 is very realistic.

And how does efficiency not matter????? :heink: Basic math, man! Take 10 and 12 - two numbers, with 12 being 2 watts more. Multiply both: 10 x 1.2 and 12 x 1.2, and now it's 12 and 14.4. Notice how the 12 has increased by 2.4, whereas the 10 has increased by 2? SO YEAH, it does make a difference, and DOES need to be factored in!!!!!! Factor that in and you're right back to what I said.

And yes, like I said, I agree it's not TERRIBLY important. I'm just tired of people downplaying power consumption like it means nothing at all. And $50 a year? It adds up. I've already done a comparison showing the 480 is 25-35% more expensive on Newegg, so what do you think an extra $50 a year does to that number? I'm tired of arguing about this, as I'd recommend the 460 or 470 over the 480 anyway, and this is a thread about the 6xxx.
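If the efficiency point still isn't clear, here's the toy example as two lines of Python (the 1.2x factor is just the rough wall-draw multiplier from earlier in the thread, not a measured number):

Code:
# two card-side draws 2 W apart, both scaled by the same wall-draw factor
a, b = 10, 12
factor = 1.2
print(a * factor, b * factor)   # 12.0 and 14.4 - the 2 W gap becomes 2.4 W at the wall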
 
I don't expect AMD to release a better price-to-performance card than the GTX 460.
A 6770 replacement? There is going to have to be an angle to this, and I'm very curious to see how it plays out.
IMO, if it has 5850 performance, it's going to be $250+.
 
Respectfully, and hoping this won't lead to another flame war, but why not? The 460 is the best budget card out, no question. But the 6770 was supposed to be $180-250, as they wanted a strong middle ground: 5850-5870 performance, WAY better tessellation, and possibly comparatively lower power. If the 5850 gets better performance than the 460, and the 6770 is supposed to land somewhere between a 5850 and a 5870, then it's entirely reasonable that the 6750 might sit on the exact level of a 460. AMD may be a lot of things, but they definitely aren't stupid; this early release, IMO, has a purpose in mind, and that's to dethrone the 460 from its spot in the $220-ish value market.
 


I agree. It won't surprise me if ATI takes it back with 5850-5870 performance for the same price as the GTX 460. The tessellation and power consumption of these cards can't really be compared yet, since we don't know anything besides that the 6770 MAY have two 6-pin power connectors.

In truth, ATI needs to be REALLY aggressive with SI. I mean, nVidia was just ravaging them constantly until the 4xxx series.

ATI: "We have the 3870 and 3850 which are good cards for the money, I mean there really isn't any competition for price/performance in the marke-"

nVidia: "BAM G92!!!"
 


Yeah, I was still really impressed with the G92, and then the GTX 2xx series came out and I was really impressed with that performance boost, but a little while later the 4xxx series swooped in out of nowhere and brought the best price war in years.
 