News Nvidia's Tiny RTX 4000 SFF 20GB Offers RTX 3070 Performance at 70W

thisisaname

Distinguished
Feb 6, 2009
799
438
19,260
Less performance than a 4070 for more than a 4080; that is Nvidia :eek:

Not sure why you would compare it with a 3090 Ti. Tech should get better and cheaper, not better and more expensive, else it is not progress.
 

healthy Pro-teen

Prominent
Jun 23, 2022
58
54
610
Cheap... Performance-wise, 70W is insane :) I want one
I hope this ends up being sold cheap on eBay soon. Also, an RX 7900 XT 20GB or RTX 4070 Ti at 70 watts would be the same if not better. They have more cores, but that also means lower clocks, which ultimately leads to higher efficiency. Just like an RTX 4090 limited to 285 watts is a lot more efficient than a 4070 Ti at 285 watts. The only problem is that no one makes tiny 4070 Tis and 7900 XTs.
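If anyone wants to try the same capping trick themselves, it is only a few lines with Nvidia's NVML bindings. Rough sketch below, assuming pynvml is installed and you have admin rights; the 285W number is just my example above, and the driver clamps it to whatever range the card actually allows. It is the same thing nvidia-smi -pl 285 does from the command line.

Code:
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML works in milliwatts; clamp the requested 285 W cap to the card's allowed range.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = max(min_mw, min(285 * 1000, max_mw))

print(f"Allowed range {min_mw // 1000}-{max_mw // 1000} W, applying {target_mw // 1000} W")
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # needs admin/root

pynvml.nvmlShutdown()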
 
Last edited:
  • Like
Reactions: Amdlova
Mar 24, 2023
4
6
10
Less performance than a 4070 for more than a 4080; that is Nvidia :eek:

Not sure why you would compare it with a 3090 Ti. Tech should get better and cheaper, not better and more expensive, else it is not progress.


Your work should get better and your wages lower, or it's not progress. Sound stupid? Yeah, it does. Just like your comment.

More productivity should come at a cost. And for the end user, 300W vs. 70W (times countless employees) will pay off that GPU cost over time. This isn't some low-wage-earning couch potato gamer card.
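Rough back-of-the-envelope on just the power side, if you want to plug in your own numbers; every figure below (hours, electricity price, card premium) is an assumed placeholder, not a quote.

Code:
# Back-of-the-envelope: electricity saved by a 70 W card versus a ~300 W card.
# All inputs are placeholder assumptions; adjust for your own rates and duty cycle.
watts_saved = 300 - 70              # W difference under load
hours_per_year = 8 * 250            # 8-hour workday, ~250 working days
price_per_kwh = 0.30                # assumed electricity price, $/kWh
card_premium = 700                  # assumed extra cost over a cheaper card, $

kwh_saved = watts_saved * hours_per_year / 1000
savings = kwh_saved * price_per_kwh
print(f"{kwh_saved:.0f} kWh/year saved, about ${savings:.0f}/year per seat")
print(f"Years to cover a ${card_premium} premium on power alone: {card_premium / savings:.1f}")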

Imagine me penalizing your paycheck because you did 8 hours of work in 4. What do you value more: your time, or your money? Anyone wanting to be productive values time over money. Those who can work faster deserve more money, not less. Working faster than yesterday IS progress.
 
  • Like
Reactions: Loadedaxe
Mar 24, 2023
4
6
10
20 GB of VRAM is just enough to fail at deep learning, so you'll have to buy a 3090 or 4090 series card

Most NLP workstation loads (the most demanding in AI), including GPT-4 training, can be done in under 10-12GB of VRAM. That applies to Stable Diffusion too.

This isn't an enterprise card designed for production-class workloads on a server doing training and/or inference for thousands of users at a time.

Also, most ML/AI training and inference (with the exception of NLP) can still be done on a 12GB (x2) K80, albeit slower. Heck, a whole lot of ML work can still be done on a 1080 Ti. Ask me how I know.
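The sizing rule of thumb I go by, for what it's worth; these are ballpark assumptions (roughly 2 bytes per parameter for fp16 inference, around 16 bytes per parameter for mixed-precision training with Adam), not measurements.

Code:
# Ballpark VRAM estimates, not measurements: ~2 bytes/param for fp16 inference,
# ~16 bytes/param for mixed-precision training with Adam (fp16 weights + grads,
# fp32 master weights, fp32 optimizer moments), before activations and overhead.
def vram_gib(params, bytes_per_param):
    return params * bytes_per_param / 1024**3

for name, params in [("1B-param model", 1e9), ("7B-param model", 7e9)]:
    print(f"{name}: ~{vram_gib(params, 2):.0f} GiB to run in fp16, "
          f"~{vram_gib(params, 16):.0f} GiB to train with Adam")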

Not a Kaggle user, are you?
 
  • Like
Reactions: Dave Haynie
Mar 24, 2023
4
6
10
Quadros are always more expensive, so not sure that is a fair comment.

The RTX A2000 is a popular card because it is basically an RTX 2060 that also operates at 70W and is low-profile capable.

If this one sells poorly, maybe it will also come down in price and become the new SFF card of choice.

The RTX 2060 comparison is fair. However: 20GB vs. 8GB. Those using CUDA/Tensor cores for productivity (like AI/ML work) understand how much better this is than a 2060 Super.

I have a 2060 Super in one rig, a 1080 Ti in another, and a mini server running 2 K80s (4 x 12GB). There are some things the K80s (and the 1080 Ti) can do that my 2060 Super cannot, simply because of RAM. A lot of good a faster card does you when you run into out-of-memory issues running models...

They are nearly the same under gaming conditions, though. Data science and machine learning people understand that the flagship cards aren't always the better choice. Sometimes dropping down to a slower card with more RAM is the better choice.
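In practice that just means checking headroom before you commit a job to a card. A minimal PyTorch sketch of what I mean; the device numbering and the mix of cards obviously depend on the box.

Code:
# Minimal sketch: pick whichever visible CUDA device currently has the most free
# memory, since the slower card with headroom often beats the faster one that OOMs.
import torch

def pick_roomiest_gpu():
    best_idx, best_free = 0, -1
    for i in range(torch.cuda.device_count()):
        free, total = torch.cuda.mem_get_info(i)  # bytes
        print(f"cuda:{i} ({torch.cuda.get_device_name(i)}): "
              f"{free / 1024**3:.1f} of {total / 1024**3:.1f} GiB free")
        if free > best_free:
            best_idx, best_free = i, free
    return torch.device(f"cuda:{best_idx}")

if torch.cuda.is_available():
    device = pick_roomiest_gpu()
    print(f"Loading the model on {device}")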
 
  • Like
Reactions: Dave Haynie

Eximo

Titan
Ambassador
The RTX 2060 comparison is fair. However: 20GB vs. 8GB. Those using CUDA/Tensor cores for productivity (like AI/ML work) understand how much better this is than a 2060 Super.

I have a 2060 Super in one rig, a 1080 Ti in another, and a mini server running 2 K80s (4 x 12GB). There are some things the K80s (and the 1080 Ti) can do that my 2060 Super cannot, simply because of RAM. A lot of good a faster card does you when you run into out-of-memory issues running models...

They are nearly the same under gaming conditions, though. Data science and machine learning people understand that the flagship cards aren't always the better choice. Sometimes dropping down to a slower card with more RAM is the better choice.

Okay, not really sure how this is a response to pricing, other than its use as a gaming card by some people. The reason it should be popular for SFF is the low power requirement. And yes, if you want to run higher resolutions with everything on, then 20GB is going to be useful in the future.

If people want to use it for its actual purpose, they can? No one is stopping them.

You have to understand that the audience for this site in general is gaming focused, so articles will lean that way.

I wouldn't mind paying a lot more for, say, the mobile 4090 on a card (which is a 4080) so I don't have my computer being a space heater. It shows what Nvidia could do if it chose to restrict the power requirements to something reasonable, along with good binning. For now, I just run my 3080 Ti with a small power limit to knock the edge off.
 

TJ Hooker

Titan
Ambassador
On a consumer-grade card we are not going to get 3070 performance at 70W, even with a 3nm process.
You can already get a long way by simply lowering the power limit on the GeForce 40-series cards; I'd bet you could get 3070-level performance from a 4090 at somewhere between 100 and 200 W. Unfortunately, the few sites that have done power-limit scaling tests with the 4090 haven't pushed it that low, so I can only speculate.
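For what it's worth, the sweep itself is easy to script if anyone with a 4090 wants to fill that gap. Rough sketch below: it needs pynvml, PyTorch, and admin rights to change the limit, and the fp16 matmul loop is only a stand-in for a real benchmark, not a claim about game performance.

Code:
# Rough power-limit scaling sweep: cap the card, run a stand-in workload, report throughput.
import time
import pynvml
import torch

def matmul_tflops(iters=50, n=8192):
    a = torch.randn(n, n, device="cuda", dtype=torch.float16)
    b = torch.randn(n, n, device="cuda", dtype=torch.float16)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()
    return 2 * n**3 * iters / (time.time() - start) / 1e12  # rough TFLOP/s

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

for watts in (450, 350, 250, 150, 100):
    cap_mw = max(min_mw, min(watts * 1000, max_mw))  # driver rejects caps outside this range
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, cap_mw)
    print(f"requested {watts} W (applied {cap_mw // 1000} W): ~{matmul_tflops():.1f} TFLOP/s")

pynvml.nvmlShutdown()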
 
  • Like
Reactions: Dave Haynie

thisisaname

Distinguished
Feb 6, 2009
799
438
19,260
Your work should get better and your wages lower, or it's not progress. Sound stupid? Yeah, it does. Just like your comment.

More productivity should come at a cost. And for the end user, 300W vs. 70W (times countless employees) will pay off that GPU cost over time. This isn't some low-wage-earning couch potato gamer card.

Imagine me penalizing your paycheck because you did 8 hours of work in 4. What do you value more: your time, or your money? Anyone wanting to be productive values time over money. Those who can work faster deserve more money, not less. Working faster than yesterday IS progress.

When I first purchased memory it was £25 per MB; would you be happy if an 8GB stick were £200,000?
 

Firestone

Distinguished
Jul 11, 2015
99
18
18,535
It's also very amusing to see the people in here complaining about the price tag, because they're the exact people who would never buy these cards in the first place. Consumers do not purchase the Quadro line; the vast majority are bought by enterprise, and enterprise is not quibbling about the price. When I tell my boss that I need a new workstation, the boss does not say "you have $1000 to spend"; the boss says "tell me what specs you need and we'll buy it".
 
Mar 29, 2023
1
0
10
Less performance than a 4070 for more than a 4080; that is Nvidia :eek:

Not sure why you would compare it with a 3090 Ti. Tech should get better and cheaper, not better and more expensive, else it is not progress.

I will never understand you nerds of the world. You seem to keep wanting the pinnacle of computing/gaming technology (as if you upgrade every year to the latest and greatest) and then get upset when the price increases.
Take a BMW M3, for example: let's say in 1990 it was what I could afford at £30k, and in 2023 it's £80k. If I'm so awful at my job that I'm still getting paid in 2023 what I was in 1990, should I complain that I can no longer afford the latest and greatest? And if my wages increase but the car is still out of my price range, the way I look at it is that I was spoilt during those years I had my M3; c'est la vie.

I'm different from you lot: I can afford the latest and greatest PC, but I'm not a massive nerd, in that I don't feel the need to upgrade every time a newer thing comes out. As it is, I'm happy with my 3090 Ti and could afford a 4090. I know the jump this time is huge (like the price 🤔), but I'll keep running my 3090 Ti until the 50 or 60 series comes out, and then I'll run that for a few years before I think about overhauling my entire rig.

..... Or maybe I'm wrong. Maybe the layabouts of the world should have M3s and 4090s, and I'm the idiot for working my ass off 🤷🏻‍♂️
 

Dave Haynie

Distinguished
Jan 9, 2015
15
16
18,525
Less performance than a 4070 for more than a 4080; that is Nvidia :eek:

Not sure why you would compare it with a 3090 Ti. Tech should get better and cheaper, not better and more expensive, else it is not progress.
That is pro-market GPUs for you. It probably does outperform an RTX 4070 at its intended job, particularly in OpenGL.

I got an RTX A2000 for $250 earlier this year and felt that was a pretty amazing deal. This is for a desktop electronics CAD workstation with a 12-core AMD Ryzen 9 5900X, 64GiB of DDR4, 6TB of NVMe SSD, and just two half-height PCIe slots. It's a PC I integrated myself for my beach-house office, not intended or used for gaming.

My primary use is electronics CAD, which isn't 3D-CAD centric... If I were running SolidWorks all day long, like the Mech-Es I work with, I might want more. If I were crunching deep-learning networks all day, maybe something different again. Don't judge the needs of the entire industry by what's working well for your specific use (or by what you don't own but fanboi over).
 
  • Like
Reactions: thisisaname

Dave Haynie

Distinguished
Jan 9, 2015
15
16
18,525
When I first purchased memory it was £25 per MB; would you be happy if an 8GB stick were £200,000?
My first big DRAM purchase was an extra 16KiB for my Exidy Sorcerer back in 1979. That ran $19... a huge bargain for the day, and it shipped as loose chips in an aluminum tube. They were really worried about static damage on memory chips back then.

That's $1,216 per MiB! At that rate, I'd be typing on an $80 million PC!
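The arithmetic, for anyone who wants to check it; 64 GiB because that's what the CAD box I mentioned above carries.

Code:
# Checking the numbers above: $19 for 16 KiB in 1979, and the £25/MB figure
# from the earlier post, scaled up to modern capacities.
KIB, MIB, GIB = 1024, 1024**2, 1024**3

dollars_per_mib = 19 / (16 * KIB / MIB)         # ≈ $1,216 per MiB
modern_pc = dollars_per_mib * (64 * GIB / MIB)  # 64 GiB of RAM at 1979 prices
print(f"${dollars_per_mib:,.0f}/MiB -> ${modern_pc / 1e6:.0f} million for 64 GiB")

pounds_per_mb = 25
print(f"£{pounds_per_mb * 8 * 1024:,} for an 8 GB stick")  # ≈ £200,000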
 
  • Like
Reactions: thisisaname
