While I agree with @jsmithepa - I don't know why anyone would try to do this - I think it's helpful to understand the technology, so I'll have a crack at explaining it...
All LCD and OLED displays have a fixed number of pixels. That means without additional processing hardware, the displays themselves can only produce an image at their fixed resolution. If you fed a 1080p Blu-ray movie into a dumb 4K display, the best the display could do is present the movie with one source pixel mapping to one pixel on the display, meaning your movie would take up just a quarter of the screen. Try booting a PC on that display, and your basic 480p boot screen and BIOS would be relegated to a tiny fraction of the screen and you'd need a magnifying glass to see what was going on.
To work around this, almost all displays include a built-in "scaler". It is responsible for scaling the incoming signal up or down to fit the display resolution. So when I send a 1080p movie to my 4K TV, the signal is first passed to the scaler, which remaps the 1080p source frames up to 4K so that I can enjoy the movie at the 55 inches I paid for.
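To make that concrete, here's a minimal sketch of what the crudest form of upscaling (nearest-neighbour) boils down to. It's written in Python/NumPy purely for illustration - the function name and array shapes are mine, not anything a real scaler chip exposes - but it shows the basic idea of mapping each output pixel back to a source pixel.

```python
import numpy as np

def nearest_neighbour_upscale(frame: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Upscale an (H, W, 3) RGB frame by repeating the nearest source pixel."""
    in_h, in_w = frame.shape[:2]
    # For every output row/column, work out which source pixel it maps back to.
    row_idx = np.arange(out_h) * in_h // out_h
    col_idx = np.arange(out_w) * in_w // out_w
    return frame[row_idx[:, None], col_idx[None, :]]

# One 1080p frame upscaled to 4K (3840 x 2160).
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = nearest_neighbour_upscale(frame_1080p, 2160, 3840)
print(frame_4k.shape)  # (2160, 3840, 3)
```

Real scalers use far better interpolation than this, but the work per pixel only goes up from there.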
While scalers are pretty mature these days, they still need to crunch lots of numbers. Take the 1080p to 4K example: the scaler has to "move" the ~2 million pixels from the source frame to their appropriate locations on a 4K frame, and then calculate what colour the ~6 million missing pixels should be. Then it has to do all that for every frame; we're talking 24, 30 or even 60 times a second. That's a lot of math! Because of that, scalers have limits. At some resolution and frame rate, they simply won't have enough processing power to scale the image and you'll get an "out of range" or similar error. That is also why scalers only have limited headroom: many will happily downscale a slightly higher-resolution source onto their lower-pixel-count panels. Some 1080p monitors, as in the example above, will accept and scale down a 1440p source. Try sending them 4K though? Or 8K? At some point too much is asked of the scaler and it will stop working.
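Just to put rough numbers on "a lot of math", here's a quick back-of-the-envelope calculation (Python, purely illustrative):

```python
# Rough pixel throughput for upscaling 1080p to 4K at 60 frames per second.
src_pixels = 1920 * 1080                 # ~2.07 million pixels per source frame
out_pixels = 3840 * 2160                 # ~8.29 million pixels per output frame
interpolated = out_pixels - src_pixels   # ~6.2 million pixels to invent per frame

fps = 60
print(f"Interpolated pixels per second: {interpolated * fps / 1e6:.0f} million")
# -> roughly 373 million interpolated pixels every second, each needing at least
#    one read-and-blend of its neighbouring source pixels.
```

And that's before you count the bandwidth needed just to move those frames in and out of the scaler's memory.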
TL;DR: it seems the scaler in your laptop display can't handle anything higher than a 720p signal. That's not surprising.
... just to briefly add to this (already too long!) post: why do you want to do this? I suspect in most cases you'll get a worse image. It's worth pointing out that display scalers are often very basic - they tend to use the simplest scaling algorithms. The scaler in your graphics card is likely to produce a better final image. Let's say you're trying to play a 1440p YouTube clip on your laptop, for example. If you leave everything at default, your graphics card will use its scaler hardware to downscale the video and send it to your display at the native 720p. That way, the display scaler isn't needed and you'll likely get a slightly better result (see the sketch below for why the algorithm choice matters). In either case though, you are limited by physics to 1280 x 720 pixels on the display.
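To illustrate the algorithm point, here's a small comparison using Pillow (9.1+ for the Resampling enum): the same 1440p frame downscaled to 720p with nearest-neighbour - the sort of shortcut a cheap display scaler might take - versus Lanczos, standing in for the higher-quality filtering a GPU can afford. The file names are placeholders.

```python
from PIL import Image

# Placeholder input: any 2560 x 1440 image will do.
src = Image.open("frame_1440p.png")

# Crude scaling: each output pixel just copies one nearby source pixel,
# which tends to produce jagged edges and shimmering on motion.
crude = src.resize((1280, 720), resample=Image.Resampling.NEAREST)

# Higher-quality scaling: each output pixel is a weighted blend of many
# source pixels - much more work, but a noticeably cleaner result.
smooth = src.resize((1280, 720), resample=Image.Resampling.LANCZOS)

crude.save("720p_nearest.png")
smooth.save("720p_lanczos.png")
```

Open the two outputs side by side and the difference is obvious, especially on fine text and diagonal lines.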