Edit 2: I'm assuming it's 7680 × 4320 × 10 bit × 240 Hz = 79,626,240,000 bits = 79.62624 Gbps.
The Samsung 57" Odyssey Neo G9 has a resolution of 7680x2160 at up to 240 Hz.
So, if you use 10-bit color depth, the calculation would be: 7680 × 2160 × 10 × 240 = 39,813,120,000 bits ≈ 39.81 Gbps per monitor. (Strictly speaking, "10-bit" color is 10 bits per channel, i.e. 30 bits per pixel, so the true uncompressed rate is about three times that figure; in practice the link relies on Display Stream Compression.)
By that math, one DP 2.1 port would be barely enough to power two monitors at their fullest, since DP 2.1 in UHBR20 transmission mode has a bandwidth of 80 gigabits per second.
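The arithmetic above can be sketched in a few lines of Python. Note the caveat: this follows the thread and counts 10 bits per *pixel*, while 10-bit color is really 10 bits per *channel* (30 bits/pixel), and real signals also carry blanking overhead, so uncompressed numbers are higher in practice and DSC does the heavy lifting.

```python
def raw_bandwidth_gbps(width, height, bits_per_pixel, refresh_hz):
    """Uncompressed pixel data rate in Gbit/s (ignores blanking overhead)."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

# One Neo G9 57" panel, counted the way the thread does (10 bits/pixel):
per_monitor = raw_bandwidth_gbps(7680, 2160, 10, 240)    # ~39.81 Gbps
# Two panels stacked (7680x4320 total):
two_monitors = raw_bandwidth_gbps(7680, 4320, 10, 240)   # ~79.63 Gbps

UHBR20_GBPS = 80  # DP 2.1 UHBR20 raw link rate (4 lanes x 20 Gbps)

print(f"one monitor:  {per_monitor:.2f} Gbps")
print(f"two monitors: {two_monitors:.2f} Gbps (UHBR20 raw cap: {UHBR20_GBPS} Gbps)")

# Counting the full 30 bits/pixel (10 bits per R/G/B channel) instead,
# even a single monitor exceeds the link rate uncompressed:
print(f"one monitor at 30 bpp: {raw_bandwidth_gbps(7680, 2160, 30, 240):.2f} Gbps")
```

This is only a back-of-the-envelope sketch; the usable UHBR20 payload is slightly below 80 Gbps after 128b/132b channel coding.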
Would it be able to run both monitors by splitting the cable into two? Do they make cables like that?
There are DP splitters out there, but I haven't seen any that support DP 2.1, since DP 2.1 is relatively new.
You can search it yourself:
https://www.amazon.com/s?k=displayport+2.1+splitter+8k+240hz
Some rumors say it will only have 1x DP 2.1 port.
And why is Nvidia being so stingy with the ports? Is there a technical or cost limit?
I don't believe rumors.
For comparison, RTX 4090 has: 3x DP and 1x HDMI.
The same goes for the RTX 3090 Ti, RTX 2080 Ti, and GTX 1080 Ti.
GTX 980 Ti has: 3x DP, 1x HDMI and 1x DVI.
GTX 780 Ti has: 1x DP, 1x HDMI and 2x DVI.
So, at least since the GTX 900 series (the past 9 years), Nvidia's top GPUs have all had 3x DP and 1x HDMI. I see no reason why that should change with the RTX 5090.