Report: Nvidia Prepping Maxwell-based 750 Ti for February


Lessthannil

Honorable
Oct 14, 2013
468
0
10,860
This isn't meant to beat Tahiti... This will be a GM117 GPU, which would make it the slowest of all the Maxwells. Maybe a GM114 would match Tahiti.
 
So I am pretty stoked and curious about the Maxwell release.

1) These are supposed to have onboard ARM processors, right? And it's the onboard ARM and GPU that are going to share vRAM, right? The system memory controller for both AMD and Intel is on the CPU... so I am pretty sure there is no way for the GPU to have any kind of DMA unless it is enabled by the CPU chip maker, and I don't see that happening any time soon. The memory sharing almost has to be between the GPU and the onboard ARM chip.

2) The 570 was about on par with the 660 Ti, and the 660 Ti was about as powerful as the 760... so a new-architecture 750 Ti should be somewhere around the same performance as a 570? With 2GB of RAM, this could be a pretty nice cheap "sideways upgrade" to my 570... the thing is powerful enough for games... but I really should have sprung for the version with 2x the memory. 1GB just does not cut it for heavily modded games and next-gen titles.

3) AMD will supposedly introduce cards with GDDR6 this year... any word if this will be coming to nVidia as well? Or are they going to have to wait a year? I suppose that either way it would not be introduced on a 'budget' card.

4) Without a die shrink, I wonder if this is going to deliver much of the power efficiency gain that was promised with Maxwell. It certainly won't help idle all that much, but if the architecture is significantly more efficient then it may be able to do more work with the same number of transistors. Can't wait to see benchmarks!
 

tolham

Distinguished
Jul 10, 2009
347
0
18,780
Tom's Hardware ran an article back in September about gaming at 4K. They showed the Titan was able to get half-decent fps by itself in some games. I wonder if the upcoming flagship of the 800 series will be the first GPU that can drive 4K gaming at decent fps with just one card.
 



I certainly hope so! My little mini-ITX rig is going to be getting the new 1440p, 120Hz Asus as soon as it drops here, and will be begging me for some more firepower. My GTX 670 is just starting to show its age on a 1080p, 120Hz monitor, and I'm afraid that with the monitor upgrade, I'll have to save up and wait for the dual-Maxwell 890 to be released for $1k or more.
 

zeppast

Honorable
Jan 20, 2014
9
0
10,510
The GTX 750 Ti will most likely be around AMD's R9 270 in performance, but will it be cheaper? That is the question.
 

Rammy

Honorable


That's not what the graph shows at all; it shows triple the GFLOPS per Watt, which will refer to the die shrink as well as efficiency improvements. If the 750 Ti launches on the now-standard 28nm fabrication then it's not likely to be anywhere near that performance, and at that level an extra 20-50W off the TDP (or whatever it transpires to be) won't make a huge amount of difference, as it's still likely to use a single 6-pin PCIe connector.

Personally, I'm getting a bit sick of the last few generations of both AMD and Nvidia cards. It might make sense from a manufacturing and marketing perspective to constantly attack any niche by producing a card to hit a performance/price target, but it ends up being incredibly confusing for everyone. The GTX 660 "OEM", the HD 7870 XT and GTX 650 Ti Boost spring to mind as particularly bad recent examples, but frankly any "Ti", "GHz" or "Boost" suffix is nonsense.
 

game junky

Distinguished
It's all speculation until we see the benchmarks and get the opportunity to test against mainstream apps/games. For folks trying to build economy rigs, this sounds like a decent option if pricing is sub-$200. Right now, AMD is killing it at that price point because it doesn't have a competitor on a current platform. Competition is a good thing for the market, so I say mush on, Nvidia and AMD.
 

xerxces

Distinguished
Dec 28, 2010
328
0
18,810
That is a huge jump from generation to generation. The funny thing is, last gen to this gen was the biggest jump; now Maxwell will clearly be the biggest jump. Nice. I like AMD CPUs, but I had nothing but trouble with their GPUs. I switched to Nvidia and have had zero regrets.
 

mynith

Distinguished
Jan 3, 2012
133
0
18,680
There's no way the efficiency difference between Fermi and Kepler was that big. Sure, I heard the 480 was hot, but that was a very compute-oriented card. I don't think the 680 was three times as efficient as the 580.
 

hannibal

Distinguished


It is easy to see that the next Nvidia ARM solution will have a Maxwell GPU part!
 

hannibal

Distinguished
The 750 is interesting because it will show how good the new architecture is compared to Kepler. The difference in speed can not be big, but the possibility of connecting ARM cores with Maxwell cores in one product is the interesting part. Yes, the system is very much like what AMD has done, but it means that there will be more competition in ARM-based APUs in the future, so embedded systems can get some real boost when both of these companies go mobile in a big way!
This is much more important on mobile platforms than in the desktop environment. Maybe in the future desktop computers can also make better use of CPU+GPU systems. That day is not today... pity...
 

XGrabMyY

Honorable
Jan 8, 2014
61
0
10,630
AMD Mullins: 4.5W, quad-core x86, 3x CU (192 unit) GCN
K1: 5W not confirmed, more like 10W, dual-core ARM, Kepler 192 units
Qualcomm 805 is supposed to boost graphics by over 40% over the existing 800 series chip.
As far as HSA goes, Nvidia is years behind the competition.
Lol, I don't know where you folks are getting that 5W number. The K1 32-bit is going to operate at or below 1W. There is every indication that the new ARMv8 chips will actually perform better at a lower TDP. I wouldn't be surprised if the K1 Dual-Denver operates just fine at or under 1W, which should be enough to terrify Intel into doing something. After all, we already know that Intel is not only fabricating these new Denver CPUs, but is investing in the technology to improve or replace their Atom/mobile offerings.
 
Sep 22, 2013
482
0
10,810
x50 Ti (GM117) means a 128-192 bit memory bus max, 128-bit more likely. I expect this one to be memory choked like the 650 Ti Boost. Current Kepler GPUs could easily perform higher with a 256-bit bus, but Nvidia cripples them so that they can gouge customers (GTX 660, 660 Ti). And:
"TechPowerUp reckons Nvidia might be testing the waters with Maxwell on the existing 28 nanometer process before taking things to the next level on the future 20 nanometer nodes."
Doesn't that mean Qualcomm and Apple fab-blocked Nvidia? TSMC is already going into high-volume production for 20nm ARM SoCs (seemingly from vendors that are not Nvidia, lol).

This card will NOT use 192-bit VRAM. The type of VRAM and the bus used depend partially on the GPU. Let's also get with the program on modern memory: 192-bit is old and probably harder to obtain than higher-bit VRAM, so this makes it even less likely. I have a feeling they will use either 384-bit (as found on the 780 Ti) or finally move to 512-bit. The memory speed is probably more of a consideration, though bus width certainly matters under some circumstances.
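For rough context on how bus width and memory speed combine (assuming a typical ~5.4 GT/s effective GDDR5 data rate, which is an assumption and not a confirmed spec for this card): peak bandwidth ≈ (bus width in bits ÷ 8) × effective data rate per pin. So a 128-bit bus gives roughly 16 × 5.4 ≈ 86 GB/s, a 192-bit bus roughly 24 × 5.4 ≈ 130 GB/s, and a 256-bit bus roughly 32 × 5.4 ≈ 173 GB/s.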
With the same production node they can not make this much quicker without making the chip bigger. 5-10% more speed than the 650, tops (if the size remains the same). As someone said, this is a test run, and the "real" 800 series comes later on a smaller production node. What this also means is that the 20nm production node is not quite there just yet. Interesting to see how long it will take until it is out and running in high-end GPU production.

28nm refers to the size of the transistor, not the size of the chip. A 28nm GPU or CPU could feasibly be the size of a football field, or it could be a centimeter. There's nothing about the 28nm process that inherently prevents a new chip with a new design from being more powerful, or that in any way restricts the size of the GPU.

It's a RUMOR, and the big "news" of this rumor is shared GPU and CPU memory, which with Nvidia must mean system memory. Um... AMD already did HSA and actually makes CPUs/GPUs... and their first card would be a midrange card no one with current gen would go for? Sounds really weak to me.

Nvidia already did it, too, and AMD hasn't even launched their version. CUDA supports UVA, which allows the drivers to be configured to have system memory assigned to them; however, it hasn't been optimized within the current Nvidia lineup and we haven't really seen what it can do yet. AMD's HSA is also targeted at being used in conjunction with one of their "APU" CPUs, but it is not limited to that. I think the first implementation we'll see of HSA will probably be in scenarios where performance can hurt due to physical limitations on the system, like a laptop with an integrated GPU (like Radeon HD graphics), where moving the GPU off-die would allow for better performance due to heat, and equivalent or better performance could be achieved by the shared memory architecture.
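For anyone curious what that mapped/zero-copy path looks like with the existing CUDA runtime, here is a minimal sketch (the kernel, sizes, and values are made up for illustration, and error checking is omitted); with UVA and mapped pinned memory the GPU reads system RAM directly over PCIe instead of needing an explicit copy:

// Minimal zero-copy sketch: kernel and sizes are illustrative only.
#include <cstdio>
#include <cuda_runtime.h>

// Doubles each element; reads and writes host-resident (mapped) memory.
__global__ void scale(const float *in, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i] * 2.0f;
}

int main() {
    const int n = 1 << 20;

    // Allow mapping of page-locked host memory into the device address space.
    cudaSetDeviceFlags(cudaDeviceMapHost);

    // Page-locked host allocations that the GPU can access directly.
    float *h_in = nullptr, *h_out = nullptr;
    cudaHostAlloc((void **)&h_in,  n * sizeof(float), cudaHostAllocMapped);
    cudaHostAlloc((void **)&h_out, n * sizeof(float), cudaHostAllocMapped);
    for (int i = 0; i < n; ++i) h_in[i] = float(i);

    // Get device-side pointers to the same host memory (no cudaMemcpy needed).
    float *d_in = nullptr, *d_out = nullptr;
    cudaHostGetDevicePointer((void **)&d_in,  h_in,  0);
    cudaHostGetDevicePointer((void **)&d_out, h_out, 0);

    scale<<<(n + 255) / 256, 256>>>(d_in, d_out, n);
    cudaDeviceSynchronize();

    printf("out[42] = %f\n", h_out[42]);  // expect 84.0
    cudaFreeHost(h_in);
    cudaFreeHost(h_out);
    return 0;
}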
 

bochica

Distinguished
Oct 7, 2009
146
0
18,680
I would be interested to see if this will be a "true" Maxwell, one with the Denver-based chip. Seeing how Denver will work under gaming benchmarks should be interesting.
 

redeemer

Distinguished


Consoles are using HSA-based hardware right now; Mantle adoption will only help the cause. That's the direction everything is going now: better communication between the CPU and GPU, sharing one pool of RAM.
 


But haven't there already been indications that the consoles won't be using Mantle?
 

jaghpanther

Distinguished
Sep 23, 2010
150
0
18,710


Yes, MS nixed Mantle on Xbox. PS4 is more convoluted, but the consoles already have close-to-metal APIs that are closer to the metal than Mantle.
 

rohitbaran

Distinguished
In other news, nVIDIA is preparing two high-end cards.
Titan Black edition: same GPU as the GTX 780 Ti and 6 GB of RAM. Price: $999.
GTX 790: two GK110s (not sure if the one in the GTX 780 or the fully unlocked one in the GTX 780 Ti). Price: not mentioned, but definitely above $1000.
Price gouging at its finest.
 


Links?
 