News: Offloading Lossless Scaling frame-gen to a secondary GPU eliminates overhead

Any word on how much worse the latency is? I know running Lossless Scaling already has some pretty hefty latency, and I'd imagine moving it to another GPU only makes things worse. I'd love to see TH do some testing on Lossless Scaling as a whole.
 
And back to dual-GPU setups we go. I knew they were going to come up with something to drive up sales.
Some iGPUs are getting very powerful (the 680M and up would be great, Strix Point is even nicer, and Strix Halo... that's a different kind of beast), so I'm certainly not against this line of work.
 
Consider giving credit to the LS Discord server, where this was originally discovered. There's thorough testing and data from ~30 different systems there too.

Starting around 8 months ago, I gave 4060ti + 1660 Super a try and made a guide. Now, I use 4060ti + Arc B570. Latency added from dual GPU LSFG is around half that of DLSS 3 FG with Reflex 1 enabled. Credit to IV0435 for testing multiple configurations and making the original overview on YouTube, CptTombstone for latency testing, NotAce for helping it grow, and many others on the LS Discord for setting it up and testing on their systems.
 
TLDR? They used an iGPU and 1050ti as the secondary.
Completely.

And they just made owners of the HX99G (and similar) very happy, because they have a quite powerful iGPU that was just sitting there doing nothing in games (the dGPU was doing everything), and now they can scale + frame-gen... for free! Even in games that don't support it natively.

I see this as a HUGE win, and I'm not sure why it's barely getting any attention.

(Personally not a fan of frame gen, but I'm quite fond of scaling if it's done right).
 
TLDR? They used an iGPU and 1050ti as the secondary.

Yes, they did. But it's obvious they really want you to use two GPUs, maybe a last-gen and a current-gen one.
That way they can sell higher volumes of dedicated GPUs, and the second-hand market for GPUs would shoot up due to lower availability, since users would keep their "old card" for the secondary role.
 
Yes, they did. But it's obvious they really want you to use two GPUs, maybe a last-gen and a current-gen one.
That way they can sell higher volumes of dedicated GPUs, and the second-hand market for GPUs would shoot up due to lower availability, since users would keep their "old card" for the secondary role.
Geez, what are you talking about? This post is about Lossless Scaling, which was made by a single person who wanted more than just latest-gen PC gamers to have access to frame generation. It's not a "ploy to drive up sales".
 
Geez, what are you talking about? This post is about Lossless Scaling, which was made by a single person who wanted more than just latest-gen PC gamers to have access to frame generation. It's not a "ploy to drive up sales".
It's all a conspiracy theory!
👽
 
Consider giving credit to the LS Discord server, where this was originally discovered. There's thorough testing and data from ~30 different systems there too.

Starting around 8 months ago, I gave 4060ti + 1660 Super a try and made a guide. Now, I use 4060ti + Arc B570. Latency added from dual GPU LSFG is around half that of DLSS 3 FG with Reflex 1 enabled. Credit to IV0435 for testing multiple configurations and making the original overview on YouTube, CptTombstone for latency testing, NotAce for helping it grow, and many others on the LS Discord for setting it up and testing on their systems.
This is the right way to do it, not what these so-called journalists do.
 
Consider giving credit to the LS Discord server, where this was originally discovered. There's thorough testing and data from ~30 different systems there too.

Starting around 8 months ago, I gave 4060ti + 1660 Super a try and made a guide. Now, I use 4060ti + Arc B570. Latency added from dual GPU LSFG is around half that of DLSS 3 FG with Reflex 1 enabled. Credit to IV0435 for testing multiple configurations and making the original overview on YouTube, CptTombstone for latency testing, NotAce for helping it grow, and many others on the LS Discord for setting it up and testing on their systems.
Yeah, I was running Lossless Scaling through my iGPU on my laptop back in August. That was sweet; it was like a free performance upgrade.
 
Any word on how much worse the latency is? I know running Lossless Scaling already has some pretty hefty latency, and I'd imagine moving it to another GPU only makes things worse. I'd love to see TH do some testing on Lossless Scaling as a whole.

It has been tested by several users. Not only does it give you free frame-gen with zero performance penalty on the main GPU, it also reduces latency considerably.


Single-GPU Lossless Scaling comes in at almost 65 ms total (about 10 ms slower than DLSS 3). With dual GPUs, Lossless Scaling's latency is about 10 ms lower than native DLSS 3.
 
It has been tested by several users. Not only does it give you free frame-gen with zero performance penalty on the main GPU, it also reduces latency considerably.


Single-GPU Lossless Scaling comes in at almost 65 ms total (about 10 ms slower than DLSS 3). With dual GPUs, Lossless Scaling's latency is about 10 ms lower than native DLSS 3.
Curious about your source? I'd love to dive in deeper. Thanks for your reply regardless.
 
My source: the official Discord server for Lossless Scaling.

You're invited to join; there's lots of information, tests, benchmarks, and how-tos.
 
Here is a recent latency test with an RTX 4090 as the main GPU and an RTX 4060 as the secondary for frame generation.

[Image: LSFG2v3vDLSS3-Cyberpunk-dark.png (latency comparison chart)]


The community also put together a nice Google Spreadsheet with current performance figures for the dedicated frame-gen GPU and the resulting output.

https://docs.google.com/spreadsheet...oeXB1eXEfI/edit?gid=1980287470#gid=1980287470
 
Curious about your source? I'd love to dive in deeper. Thanks for your reply regardless.
My source: I was there, Gandalf. I was there 3000 years ago! The Lossless Scaling Discord server. We have been running all kinds of tests and experiments to find the most optimal way to go about this. I'll only be impressed by DLSS FG once they manage to make it have no performance cost at all.
 
My source: I was there, Gandalf. I was there 3000 years ago! The Lossless Scaling Discord server. We have been running all kinds of tests and experiments to find the most optimal way to go about this. I'll only be impressed by DLSS FG once they manage to make it have no performance cost at all.

https://www.reddit.com/r/lotrmemes/comments/k5we0z/i_was_there_gandalf_i_was_there_3000_years_ago/


And I just bought a new RTX 4060 as a secondary GPU. That's how much faith I have in Lossless Scaling!
 
Too bad there may not be enough PCIe lanes for this kind of setup...

Even now, add a card in the second x16 slot and both run at x8.
Tbh, even PCIe 3.0 x4 runs well in most secondary-GPU setups. The only situations where it really starts to hurt are when you use an Intel Battlemage GPU with only x4 lanes, or when you try to hit really high framerates over 3.0 x4.
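That framerate ceiling follows from simple bandwidth arithmetic. A rough sketch, assuming each rendered frame crosses the bus as an uncompressed RGBA8 surface (real traffic may differ, and frames generated on the secondary GPU don't cross the bus again if it also drives the display):

```python
# Back-of-envelope PCIe bandwidth check for a dual-GPU LSFG-style setup.
# Assumption: one uncompressed RGBA8 copy per rendered frame; actual
# traffic depends on the capture path and driver behavior.

def frame_bytes(width, height, bytes_per_pixel=4):
    """Size of one uncompressed frame in bytes (RGBA8 by default)."""
    return width * height * bytes_per_pixel

def pcie_gbps(gen, lanes):
    """Approximate usable one-way bandwidth in GB/s (after encoding overhead)."""
    per_lane = {3: 0.985, 4: 1.969, 5: 3.938}  # GB/s per lane
    return per_lane[gen] * lanes

def max_copy_fps(gen, lanes, width=3840, height=2160):
    """Upper bound on frames per second that fit through the link."""
    return pcie_gbps(gen, lanes) * 1e9 / frame_bytes(width, height)

if __name__ == "__main__":
    for gen in (3, 4):
        print(f"PCIe {gen}.0 x4: ~{max_copy_fps(gen, 4):.0f} fps max at 4K")
```

By this estimate, PCIe 3.0 x4 tops out somewhere around 120 fps of copied 4K frames before overhead, which lines up with it hurting only at really high framerates, while 4.0 x4 roughly doubles that ceiling.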
 
to bad there may not be enough pcie lanes for this kind of setup...

even now, add a card in the 2nd x16 slot, and both run at x8
Not quite correct on both counts.

- On most Z790 boards, for example, it's normal to have the main PCIe 5.0 x16 slot connected to the CPU while a second GPU-capable slot runs off the chipset at PCIe 4.0 x4.

In most tests, even PCIe 4.0 x4 could run full 4K at more than 120 FPS without any issue in Lossless Scaling, reducing latency considerably and basically giving the main GPU "free frame generation".
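A quick sanity check of the 4K @ 120+ FPS claim, assuming only the host GPU's rendered frames cross the bus as uncompressed RGBA8 (an assumption; the actual capture path may add or save traffic):

```python
# Rough headroom check for 4K frame copies over PCIe 4.0 x4.
# Assumption: uncompressed RGBA8 frames, one copy per rendered frame.

FRAME_4K_GB = 3840 * 2160 * 4 / 1e9      # ~0.033 GB per 4K RGBA8 frame
PCIE40_X4_GBPS = 4 * 1.969               # ~7.88 GB/s usable one-way

def needed_gbps(base_fps):
    """Bandwidth needed to copy base frames at the given rate."""
    return base_fps * FRAME_4K_GB

if __name__ == "__main__":
    fps = 120
    need = needed_gbps(fps)
    print(f"4K @ {fps} fps needs ~{need:.1f} GB/s; "
          f"PCIe 4.0 x4 offers ~{PCIE40_X4_GBPS:.1f} GB/s "
          f"({PCIE40_X4_GBPS / need:.1f}x headroom)")
```

Copying 4K at 120 fps needs roughly 4 GB/s, so a chipset-attached PCIe 4.0 x4 slot has about 2x headroom under these assumptions, which is consistent with the reported results.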
 
Tbh, even PCIe 3.0 x4 runs well in most secondary-GPU setups. The only situations where it really starts to hurt are when you use an Intel Battlemage GPU with only x4 lanes, or when you try to hit really high framerates over 3.0 x4.

Not quite correct on both counts.

- On most Z790 boards, for example, it's normal to have the main PCIe 5.0 x16 slot connected to the CPU while a second GPU-capable slot runs off the chipset at PCIe 4.0 x4.

In most tests, even PCIe 4.0 x4 could run full 4K at more than 120 FPS without any issue in Lossless Scaling, reducing latency considerably and basically giving the main GPU "free frame generation".
And what if a person's system is already loaded, i.e. all M.2 slots full, all SATA used, and even a few other PCIe cards installed, like a sound card and another card?

With boards now, at some point things get disabled or bandwidth runs out....

That's the situation I'm in with one of my computers here on X99: a video card, sound card, capture card, and a hardware RAID card, plus M.2 and SATA in use. The only platform I can move to is Threadripper, as I need PCIe lanes...
 
Well that's unfortunate.
I had to remove my internal sound card, a Sound Blaster AE-7, to make space for the second card, and I also removed my third M.2, leaving the system with only two.

But unfortunately you always have to make concessions with desktop systems. As you point out, if you want the full complement of PCIe lanes, you need a Threadripper motherboard/system.
 
Or just skip the second video card... and not use the fake frame-gen. Much better option.

I also priced out a TR board, RAM, and CPU... I think it was north of 2k for me, so a little out of reach.