Sixa’s Rivvr Wireless VR Upgrade Kit More Versatile, Cheaper Than Competition; Pre-Orders Open

Status
Not open for further replies.

WFang

Reputable
Dec 10, 2014
So if it adds 6 to 11 ms of latency, it ranges from 'nearly impossible' to 'impossible' to render at 90 Hz? What am I missing?

Unless the GPU is somehow able to (or already does) scan out an image frame before the full image has been composed, I don't see how adding 6 to 11 ms of transfer time to a render pipeline that already uses a big chunk of that time can work comfortably.
 

Jeremy Cox

Reputable
May 7, 2014
@WFANG The threshold at which humans can discern latency or delay in frame delivery is something like 30-100 ms. 10 ms will not be noticeable. Case in point: most people don't notice 30 ms of lag in online gaming.
 

lasvideo

Honorable
Jun 24, 2012


Wrong! I can HEAR a 10ms delay in my music studio software when there are issues in the pipeline. It is VERY noticeable and palpable. VR industry hardware devs state any video delay over 30 ms incurs nausea and put a hard target on keeping it lower than that number. ;)

 


The amount of time to transmit an image to a screen doesn't necessarily limit the frame rate. The computer doesn't have to wait for one frame to be completely drawn onto the screen of the headset before it can send the next one. In this case, for example, the system might spend several milliseconds compressing the current frame, a few more transmitting it, and then another few milliseconds uncompressing it on the headset's end. While the headset is uncompressing one frame, it could be receiving the next. One frame doesn't have to be completely out of the transmission pipeline before the next enters it. Of course, you'll still want to keep the latency as low as possible, since too much delay can make people nauseous. Presumably, though, such a device could use onboard hardware to pan and scale the image after receiving it, to better match the headset's current positional data.
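The overlap described above can be sketched with a toy pipeline model. The stage durations here are illustrative guesses, not Rivvr's actual figures; the point is that added latency stays constant while frames still flow at the full 90 Hz rate:

```python
# Sketch: pipelined frame delivery. Each frame passes through three
# stages, but the stages work on different frames at the same time.
COMPRESS_MS = 3.0    # compress one frame on the PC (assumed)
TRANSMIT_MS = 4.0    # send the compressed frame wirelessly (assumed)
DECOMPRESS_MS = 3.0  # decompress on the headset (assumed)

def frame_timings(n_frames, frame_interval_ms=11.1):  # ~90 Hz
    """Return (rendered, displayed) times when stages overlap across frames."""
    timings = []
    stage_free = [0.0, 0.0, 0.0]  # earliest time each stage is idle again
    for i in range(n_frames):
        t = i * frame_interval_ms  # frame finishes rendering
        for s, dur in enumerate((COMPRESS_MS, TRANSMIT_MS, DECOMPRESS_MS)):
            t = max(t, stage_free[s]) + dur  # wait for the stage, then run it
            stage_free[s] = t
        timings.append((i * frame_interval_ms, t))
    return timings

for rendered, shown in frame_timings(4):
    print(f"rendered at {rendered:5.1f} ms, displayed at {shown:5.1f} ms "
          f"(added latency {shown - rendered:.1f} ms)")
```

Every frame shows up 10 ms late, but a new one still arrives every 11.1 ms, so the headset never drops below 90 Hz.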



That's not exactly a good comparison, since games perform various lag-compensation techniques to hide latency. Your own movements and actions in an online game shouldn't show any additional delay at all, since the game doesn't bother checking with the server before updating those kinds of things on your end. Even for the movement of other players, well-designed netcode will use predictive techniques to estimate what their approximate position should be without having to check every frame, then adjust for any discrepancies as seamlessly as possible as their actual location data arrives. This tends to hide network latency pretty well most of the time, though sometimes those guesses can be slightly inaccurate. That results in situations where it looks like you shot a player but they don't get hit, or where the hits from a player shooting you don't register until you're already behind cover. You might not directly see this latency, but that's largely because the game is doing its best to hide it.
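The prediction-and-correction idea is simple enough to sketch; all names and numbers here are made up for illustration:

```python
# Sketch of dead-reckoning prediction for a remote player, the kind of
# technique described above.

def predict(last_pos, last_vel, dt):
    """Extrapolate a remote player's position from the last server update."""
    return last_pos + last_vel * dt

def blend_correction(predicted, authoritative, factor=0.2):
    """Ease toward the server's position instead of snapping to it."""
    return predicted + (authoritative - predicted) * factor

# Last update said the player was at x=10.0 moving at 5.0 units/s.
pos = predict(10.0, 5.0, dt=0.05)   # 50 ms since the update -> 10.25
# A new server packet says they're actually at 10.4; correct gradually
# over several frames so the motion never visibly snaps.
pos = blend_correction(pos, 10.4)
print(pos)
```

The blend factor is the tuning knob: snap too fast and players teleport, snap too slowly and the mispredictions (the "I shot him but he didn't die" cases) linger longer.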

Sixa isn’t stopping at local server-based VR experiences, either. Minchenko’s ultimate vision for wireless VR includes offering a PC-free solution for people who don’t have a VR-ready machine and don’t want to invest in one. Sixa plans to expand its cloud-based desktop computer service to offer VR gaming over Rivvr-equipped headsets. Theoretically, you wouldn’t even need a computer in the house--just a Wi-Fi connection and a Vive.

Now this is something I'd be a bit skeptical of panning out, at least any time within the next several years or more. In order for this to work, you would need really low latency to the render servers, since on-headset adjustments to the video feed can only go so far. The latency would really add up: not only the time it takes to transmit the video to your headset, but also the time for the headset to transmit its positional data to the server. The servers would likely need to be located near your town, and aside from in some urban centers, that would be difficult to accomplish. Existing game-streaming services introduce enough delay and compression artifacts to make even traditional games somewhat uncomfortable to play, and those issues would be greatly compounded in VR.
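A rough budget shows why the numbers add up so fast. Every figure below is an assumption (and generous for the network legs), measured against the ~20 ms motion-to-photon comfort target often cited for VR:

```python
# Back-of-the-envelope motion-to-photon budget for cloud-rendered VR.
# All stage times are assumed, illustrative values.
pose_uplink_ms = 5.0        # headset pose -> remote render server
render_ms = 11.1            # one frame at 90 Hz
encode_ms = 3.0             # video compression on the server
network_downlink_ms = 5.0   # compressed frame -> your house
decode_ms = 3.0             # decompression on the headset

total = (pose_uplink_ms + render_ms + encode_ms
         + network_downlink_ms + decode_ms)
print(f"motion-to-photon: {total:.1f} ms")  # over the ~20 ms comfort target
```

Even with a server close enough for 5 ms each way, the total already blows past the comfort target before counting Wi-Fi transit inside the house or any display lag.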
 

therealduckofdeath

Honorable
May 10, 2012
@Wfang, you're comparing apples to pears. Input lag and display refresh rates aren't really connected. You can have a billion Hz refresh rate with a billion years of lag. For a more realistic comparison, a lot of the 144 Hz displays on the market use VA panels, and a VA panel generally adds 10-30 ms of lag.
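The independence of the two quantities is easy to see with some toy arithmetic (the 20 ms panel lag here is just a hypothetical value in the VA range mentioned above):

```python
# Refresh rate and input lag are independent: one sets how often the
# panel updates, the other how stale each update is when it arrives.
def frame_period_ms(hz):
    """Time between successive refreshes at a given refresh rate."""
    return 1000.0 / hz

panel_lag_ms = 20.0              # hypothetical VA-panel processing delay
period = frame_period_ms(144)    # ~6.9 ms between refreshes

# A 144 Hz panel still shows a new image every ~6.9 ms, but each image
# reaches your eyes ~20 ms after the GPU produced it.
print(f"refresh every {period:.1f} ms, each frame {panel_lag_ms:.0f} ms stale")
```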
 

bit_user

Polypheme
Ambassador
250x compression is a lot. This is only possible with full-frame analysis, which imposes a lower bound on the amount of latency it adds.

I would much rather see a higher-frequency wireless tech and lower compression. Also, I don't know how well their solution would deal with packet collisions & other types of errors.
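For a sense of scale on that 250x figure, a quick back-of-the-envelope assuming a Vive-class panel (2160x1200 combined resolution at 90 Hz, 24 bits per pixel) shows why such aggressive compression would be attractive:

```python
# Raw video bandwidth for an assumed Vive-class headset, and what a
# 250x compression ratio would bring it down to.
width, height, hz, bpp = 2160, 1200, 90, 24

raw_bps = width * height * hz * bpp   # uncompressed bits per second
compressed_bps = raw_bps / 250        # after the claimed 250x ratio

print(f"raw video:  {raw_bps / 1e9:.2f} Gbit/s")
print(f"after 250x: {compressed_bps / 1e6:.1f} Mbit/s")
```

Roughly 5.6 Gbit/s raw shrinks to about 22 Mbit/s, which fits comfortably in a Wi-Fi link; a higher-frequency wireless band with more raw throughput would allow a much gentler ratio and less full-frame analysis.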

Line of sight isn't really a problem for VR, since you typically have a dedicated VR space anyhow.

 

bit_user

Polypheme
Ambassador
Yes, I've been saying that ATW belongs in the HMD. This will definitely help hide wireless transmission delays. However, I doubt they're currently doing this, since it would require deep integration with the HMD and/or game engine. You definitely don't want to do ATW twice, as it's not without latency or artifacts. HMD-based ATW would be an absolute necessity for any kind of remote rendered VR.
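For illustration only, here is the core idea of ATW reduced to a one-dimensional yaw shift. Real implementations perform a full 3D reprojection on the GPU; every name and number below is made up:

```python
# Toy sketch of asynchronous timewarp (ATW): just before scan-out, shift
# the already-rendered frame to account for head rotation that happened
# after rendering. Real ATW warps in 3D; this yaw-only version is purely
# illustrative.
PIXELS_PER_DEGREE = 10.0   # hypothetical panel density across the FOV

def timewarp_shift_px(yaw_at_render_deg, yaw_at_display_deg):
    """Horizontal pixel shift that re-aims the stale frame at the new yaw."""
    return (yaw_at_display_deg - yaw_at_render_deg) * PIXELS_PER_DEGREE

# Head turned 0.5 degrees during ~10 ms of wireless transit:
print(timewarp_shift_px(30.0, 30.5))   # shift the frame 5 pixels
```

Doing this on the HMD, against the freshest possible pose sample, is exactly why it hides transmission delay so well, and also why warping twice (once on the PC, once on the headset) compounds artifacts for no benefit.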

I think your skepticism about remotely rendered VR is well-placed. These guys might just be overly optimistic, but they might also be saying anything they can to pump up the valuation of their company (either for acquisition or VC funding purposes).
 