cryoburner
Judicious
In reply to the article: "Prominent hardware leaker kopite7kimi believes that the GeForce RTX 4060 will consume more power than Nvidia's current-generation GeForce RTX 3070, far more than the GeForce RTX 3060 12GB that it's replacing. But the real question is: how much more power are we talking about? ... Unfortunately, the leaker doesn't know the exact power specifications of the GeForce RTX 4060, so it's entirely an educated guess as to what it'll be like."

So, even the leaker had no actual numbers, just a suggestion that it could potentially draw more power than a 3070, and from that you jump to the conclusion that the card could draw up to 350 watts? >_>
A reference 3070 has a TDP just 10% higher than a reference 3060 Ti's (220 W vs. 200 W). If Nvidia positions this card as a successor to the 3060 Ti, then a modest increase in power draw on that order seems like a reasonable possibility.
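For a rough sense of scale, here's a quick back-of-the-envelope comparison in Python. The 30-series numbers are Nvidia's published reference TDPs; the two RTX 4060 figures are illustrative only (the 350 W value is the high end being speculated about, and the 230 W "modest bump" is a hypothetical I've picked for comparison), not leaked or official specs.

```python
# Back-of-the-envelope TDP comparison (watts).
# 30-series values are published reference TDPs; the 4060 values are
# illustrative guesses only, not leaked or official numbers.
published_tdp_w = {
    "RTX 3060 12GB": 170,
    "RTX 3060 Ti": 200,
    "RTX 3070": 220,
}

speculative_rtx_4060_w = {
    "modest bump over a 3070 (hypothetical)": 230,
    "speculated 350 W ceiling": 350,
}

for scenario, watts in speculative_rtx_4060_w.items():
    for card, tdp in published_tdp_w.items():
        change = (watts - tdp) / tdp * 100
        print(f"{scenario}: {watts} W is {change:+.0f}% vs the {card} ({tdp} W)")
```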
Realistically, I can't see Nvidia giving this card anything remotely close to a 300-350 watt TDP. That wouldn't exactly be friendly to PC manufacturers, who typically look to cut corners on PSUs, nor to mainstream gamers upgrading from an older "mid-range" card, only to find their new one is unstable and overheats their system. Something closer to 220 watts seems far more likely than what this article is suggesting, and that's roughly in line with many previous cards around this general price range.

And keep in mind that Nvidia has shifted model numbers since the 20-series, in an attempt to disguise those cards' mediocre performance gains outside of raytracing and to nudge people up to a higher tier than they would normally consider buying. The 2060, 3060 and likely the 4060 would have been given x70 model numbers in prior generations; likewise, the 1660 was the real x60 card of that generation, and the 3050 is the card currently targeted at that market.
Those "hacks" are what are known as "optimizations", allowing for notably better visuals and more performance than what a given level of hardware could otherwise support. Those optimizations are effectively trading a small amount of visual fidelity in less noticeable areas along with some additional development effort for massive improvements to performance.The big gain of RT over several decades of raster hacks piled on top of each other is in development workflow. When you have an array of cubemaps, reflection maps, screen-space reflections and shadows, deferred rendering, shadow volumes, etc all in play, it is incredibly easy for one minor art change to have knock-on effect across the whole toolchain. Or worse, for it not to have that knock-on effect and start to introduce rendering errors because (e.g.) a cubemap did not get updated when a lightmap changed because someone moved a lamp from one desk to another.
Raytracing calculates all of those effects in real time at runtime. You make a change, and everything is correct immediately. All the tower of hacks you need for - for example - continuous variable time-of-day lighting and shadowing changes are now free and performed by default.
And if anything, raytraced effects are only likely to increase developer workload for many years to come. Since the current generation of game consoles has only limited support for RT, developers who want to use extensive RT effects on newer PCs, and potentially on future "Pro" versions of those consoles, will need to support two entirely different lighting models, because a large share of their target audience will be on hardware with limited support for those effects. It will likely be many years before most developers are willing to drop support for non-RT lighting. In any case, lighting the game world typically amounts to only a small portion of the resources and budget that go into a game's development, so even if RT eventually helps streamline that process, it might not make much difference to the overall development of a game.
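To make the "stale baked data" problem both posts are arguing about concrete, here is a toy Python sketch. All of the class and field names are made up for illustration and don't correspond to any real engine API; it only shows the workflow difference in the abstract: baked lighting data can silently go stale when an artist moves something, while a per-frame traced lookup always reads the current scene.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    # Position of a light in the level (toy example).
    lamp_position: tuple = (0.0, 2.0, 0.0)

@dataclass
class BakedPipeline:
    scene: Scene
    cubemap_lamp_pos: tuple = None  # snapshot of the scene taken at bake time

    def bake(self):
        self.cubemap_lamp_pos = self.scene.lamp_position

    def reflection_is_stale(self) -> bool:
        # If an artist moves the lamp and nobody re-bakes, reflections lie.
        return self.cubemap_lamp_pos != self.scene.lamp_position

@dataclass
class TracedPipeline:
    scene: Scene

    def reflection_is_stale(self) -> bool:
        # Lighting is recomputed from the live scene every frame.
        return False

scene = Scene()
baked, traced = BakedPipeline(scene), TracedPipeline(scene)
baked.bake()
scene.lamp_position = (3.0, 2.0, 0.0)   # artist moves the lamp
print(baked.reflection_is_stale())      # True  -> needs a manual re-bake
print(traced.reflection_is_stale())     # False -> correct by construction
```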
In reply to: "Yeah, no, negative people have said similar things about technology in the past as well and were proven blatantly wrong with their predictions (hi Bill Gates); it's better not to talk like this about the future. We will see, but I'm cautiously optimistic."

If you are referring to the 640K memory thing, there's no verifiable evidence that Bill Gates ever said that, and he has denied making that remark for decades. If he did say something similar, it was no doubt in the context of what made sense for PCs to support in the near term at that time; it's rather unlikely that he ever thought computer memory needs would never rise above that level. So it's likely a fake quote, or at least something taken out of context.