Rumor: AMD's Tonga GPU Will Be Efficient; Arriving Soon

This sounds more like a low-end card, rather than mid-range. I always considered the 760 to be more of the mid-range card.
 


Here's to hoping. Really wouldn't mind seeing the next generation of cards this year.
 
"This card is expected to carry just 2 GB of memory..." Why would you want more VRAM on a mid-range card that most likely won't be able to handle more than 1080P and maybe 1440P (if you're lucky) anyway? I'm speaking in gaming terms of course. Perhaps if you intend to drive more than 3 monitors just for general computing?
 
Tonga. Reminds me of tenga. Heehee.
That aside, I wonder how 'efficient' it will be, since AMD is known to hog more power than Nvidia. I've been using their cards and love them for their price-to-performance ratio, but for mobile chips that power draw is a big no-no in my experience.
 
It's going to be hard for AMD to beat the 750ti's efficiency using the overhyped GCN arch. I own a 750ti, and its performance per watt is nothing short of remarkable given that it's still built on TSMC's 28nm process. I would love to see much better efficiency from whoever can achieve it: Intel, Nvidia, AMD, ARM, etc. I don't care who. That's what matters most given the market trend. I hope we soon reach 750ti-level performance within a mobile power envelope.
 
This sounds more like a low-end card, rather than mid-range. I always considered the 760 to be more of the mid-range card.
I actually feel offended by your comment, and may I say, it's a snobby one. Though it gives me comfort to think that you felt the need to make it.
 
Wait a minute. An upper-mid-range card from AMD will be more power efficient than a lower-mid-range one from Nvidia? Damn, if that's true, that just rocks, because I was planning to go Nvidia once both companies moved to 20nm, and it seems AMD is fixing their troubles, if the rumor turns out to be true, of course.
 
"This card is expected to carry just 2 GB of memory..." Why would you want more VRAM on a mid-range card that most likely won't be able to handle more than 1080P and maybe 1440P (if you're lucky) anyway?

Because with the new consoles out, games will start using more video memory, even at 1080p.
 
In summary, Maxwell kicked AMD square in the nuts, and if you can't beat 'em, join 'em.

Kicked them in the nuts? Can I have what you're smoking? Look at how much the 750ti costs, then look at what you can get from AMD for the same price. Unless all you care about is a few dollars a year less in power costs, AMD clearly wins by a huge margin.
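To put a rough number on that "few dollars a year" point, here is a minimal back-of-the-envelope sketch; the wattage gap, daily gaming hours, and electricity price are all assumed placeholder values, not measured figures:

```python
# Rough yearly electricity-cost gap between two cards.
# Every figure below is an assumption chosen for illustration.

WATT_GAP = 90          # assumed extra draw of the cheaper-per-frame card, in watts
HOURS_PER_DAY = 3      # assumed gaming hours per day
PRICE_PER_KWH = 0.12   # assumed electricity price in USD per kWh

extra_kwh_per_year = WATT_GAP / 1000 * HOURS_PER_DAY * 365
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH

print(f"Extra energy per year: {extra_kwh_per_year:.0f} kWh")
print(f"Extra cost per year:   ${extra_cost_per_year:.2f}")
```

With those assumptions the gap works out to roughly ten dollars a year, which is the order of magnitude being argued about here.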
 
Low power consumption for GPUs, or ICs in general, is not about the electricity bill, my friend. We all know that even the most power-sipping GPU cards have negligible to no impact on the budget of people who can afford such products. To me, it's all about innovation and making progress toward scaling computational horsepower for battery-powered devices and HPC centers. Devising a CMOS design that achieves the same task/throughput/delay at lower power is very challenging nowadays, and it is a widely accepted fact in industry that the power wall has become the major impeding factor for CMOS chips. So any effort (such as Nvidia's Maxwell) to mitigate this challenge is always appreciated. Of course you can pack as many transistors as you can into a GPU to increase chip throughput at the expense of a higher TDP and sell the damn thing at a lower profit margin to stay somewhat competitive. That's not innovation; that's corporate cheating and laziness.
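For anyone who wants a concrete handle on what "performance per watt" means in the argument above, here is a tiny sketch; the frame rates and board-power numbers are invented for illustration and are not benchmark results:

```python
# Performance-per-watt comparison between a hypothetical efficient card and a
# hypothetical brute-force card. All numbers are placeholders, not benchmarks.

cards = {
    "efficient_card":   {"avg_fps": 45, "board_power_w": 60},
    "brute_force_card": {"avg_fps": 55, "board_power_w": 150},
}

for name, card in cards.items():
    perf_per_watt = card["avg_fps"] / card["board_power_w"]
    print(f"{name}: {perf_per_watt:.2f} fps/W")
```

The brute-force card wins on raw throughput but loses badly on fps per watt, which is exactly the trade-off the comment is criticizing.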

 
To the person downvoting my comment, maybe you would care to explain why I'm wrong? Lots of games are multiplatform these days, whether you like that or not. And that means developing for the consoles first, then making a version for the PC.

Textures are made to fit in the consoles' memory, so the consoles effectively dictate how much video memory your PC needs.
 
Power efficiency seems to be the trend of the day. Intel makes more power-efficient CPUs, ARM makes more power-efficient CPUs, and so does AMD (the new low-cost parts like Puma, not the old FX line...).
Nvidia has started making more power-efficient GPUs, and AMD is moving in the same direction... Summa summarum, it seems that CPUs and GPUs are powerful enough at this point, so it is better to concentrate on power efficiency.
Good for consoles, tablets and other mobile devices, but is it good for PC gaming... I just wonder.
 
Power efficiency seems to be the trend of the day. Intel makes more power-efficient CPUs, ARM makes more power-efficient CPUs, and so does AMD (the new low-cost parts like Puma, not the old FX line...).
Nvidia has started making more power-efficient GPUs, and AMD is moving in the same direction... Summa summarum, it seems that CPUs and GPUs are powerful enough at this point, so it is better to concentrate on power efficiency.
Good for consoles, tablets and other mobile devices, but is it good for PC gaming... I just wonder.

Why doesn't anyone mention that lower power also means lower heat, which can translate into lower noise with the same cooling solution (and if I think about it, it also means your fans will need less power to cool the card 😛 powerception)?
 
My comment was purely from a PC power user's perspective. I honestly don't give much thought to tablets, consoles, HTPCs, et al. So, if somebody wants to sell me item A, which is very efficient power-wise but performs about 35-40% worse than the competitor's moderately higher-power-drawing item B at the same price, of course I am going to go for more performance for the dollar.

If we were talking about the difference between the 750ti and the 290x in power and heat, I could perhaps have a little sympathy for this argument... but we aren't. We are talking about the 270, which can be had for $15 extra, or the 265, which is the same price and still holds at least a 20% lead in performance. Neither of these cards is a massive power user, and both are much better performers than the 750ti for the same dollars.

All of these cards can be used with a decently rated 450 W PSU. Think about it.
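As a quick sanity check on the 450 W claim, here is a sketch that adds an assumed rest-of-system load to each card's approximate board power; all wattages are rough ballpark assumptions, not manufacturer specs:

```python
# Does a 450 W PSU cover a typical build with each of these cards?
# All wattages below are rough assumptions for illustration.

SYSTEM_BASE_W = 180                                # assumed CPU + board + drives + fans under load
GPU_TDP_W = {"750ti": 60, "265": 150, "270": 150}  # approximate board powers
PSU_CAPACITY_W = 450
SUSTAINED_LOAD_LIMIT = 0.8                         # keep sustained draw under ~80% of rating

for card, tdp in GPU_TDP_W.items():
    total = SYSTEM_BASE_W + tdp
    fits = total <= PSU_CAPACITY_W * SUSTAINED_LOAD_LIMIT
    print(f"{card}: ~{total} W total -> {'fits' if fits else 'tight'} on a 450 W PSU")
```

Even with those conservative assumptions, all three configurations stay inside a 450 W budget.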
 
I hope that one day graphics cards and CPUs will draw less than 20 W under load while delivering the performance of a GTX 770 and an i7-4770. Then we could play games for more than 4 hours without plugging into a power socket, and have 24 hours of normal computing time! :)
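To see what that wish implies, here is a quick bit of arithmetic; the battery capacity is an assumption (roughly the largest pack you can normally fly with), while the runtime targets come straight from the comment:

```python
# What average system power would those runtimes require?
# Battery capacity is assumed; the runtime targets are from the comment above.

BATTERY_WH = 99          # assumed battery capacity in watt-hours
GAMING_TARGET_H = 4      # desired hours of gaming on battery
LIGHT_USE_TARGET_H = 24  # desired hours of normal computing on battery

max_gaming_w = BATTERY_WH / GAMING_TARGET_H
max_light_use_w = BATTERY_WH / LIGHT_USE_TARGET_H

print(f"Whole-system budget while gaming:  {max_gaming_w:.0f} W")
print(f"Whole-system budget for light use: {max_light_use_w:.0f} W")
```

In other words, the entire machine (GPU, CPU, screen, everything) would have to average about 25 W while gaming and about 4 W during light use, so a sub-20 W GPU is exactly the kind of efficiency that wish requires.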
 