Tom’s Hardware Giveaway – Elite Dangerous: Deluxe Edition

As a VR developer, how do you approach the problem of motion sickness? A lot of people seem to get sick if the camera shakes from any kind of damage impact or character movement. Do we have to regress back to the rail-shooter feel of static cameras, or what is your approach to keeping motion sickness down while keeping immersion up?
 
I would ask the question that has been on my mind ever since VR started becoming a thing with the Oculus a few years ago, and it's this: when will we see full immersion into the OS? Let me explain. VR desktop looks cool, but I still think it's an unfinished solution. What I want from a VR desktop (or shell, if you like) is not just a big rectangular canvas inside a 3D environment. That's cool, though. I'd like to bring any of my windows and 3D icons into the space and manipulate them across the view. I hope this will eventually happen. Fingers crossed 😉
 
Given our current technology, can VR be made more efficient (in terms of graphical processing power needed) than it currently is? And if so, which side would this efficiency come from: the software or the hardware?
 
Viewing VR for longer than 30 minutes can strain most people's eyes. As a VR developer, do you think it's possible to create software-based solutions that allow unlimited viewing time?
 
My question to a VR developer would be:
How do you think a more freely controllable in-game perspective will affect game design, in terms of making sure crucial information about the story or experience isn't missed if the player chooses to look away from a triggered event?
 
I would ask them to develop VR education tools for a bunch of technical disciplines, like carpentry, sports, and crafts, so that people can create their own VR instructions easily. YouTube has lots of resources, but it has certain limits. For example, right now I am trying to replace the ice maker in my freezer, but none of the tips or simple instructions I found on YouTube give me a clear idea of the details. I have to guess at many things, especially the locations of certain parts and the angles of approach when things are hidden behind other objects. I heard a doctor created a 3D model of his patient before surgery so the operation could be planned in VR. Education and training with VR would be very effective.
 
With the way that VR allows people to become immersed in the world they play in, how has this medium shifted the way you do development, and how will it affect the evolution of the tools used in game development?
 
Are they afraid of a dystopian future? Like Ready Player One where people live in mass slums as close to the nearest servers as they can afford? A world where people retreat into VR and waste away in real life?
 
My question is this:
If you play a VR game with a mouse and keyboard, you can't move around, which is less immersive, but more importantly it can be very uncomfortable to move your neck around, so how can they work around that from the software side?
And if you use a controller like the Vive's but you don't have an empty room or a lot of space to work with,
again, is there a software solution?
 
My question for VR developers is simple and goes to the very heart of the matter. What are you doing to make VR gear more comfortable for people who wear glasses? I would argue that having to put something on over your glasses was a major reason that 3D (whether in the home or the theater) failed to live up to industry goals. Most people I know don't want to wear a device over the top of the glasses they already wear.
 
How far away is VR technology, more specifically the precise, lag-free translation of hand and finger movements, from actually being used in medicine, for example by surgeons performing remote robotic surgeries?
 
My main question would be how the technology will improve, and how long it will take, to the point where it is no longer such a bulky headset and is perhaps something reminiscent of Google Glass, where it only covers your eyes. I can't properly wear the helmet with current designs.
 


Haha! I know, it'd be amazing!
 
Can the fact that our brains guess at the details in our peripheral vision be used to reduce the GPU power needed to run VR? Could a cable management solution on the back of a VR headset be used to counterbalance the weight of the visor?
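The first question here describes what is known as foveated rendering: render at full resolution only near the gaze point and drop the shading rate toward the periphery, where the eye can't resolve fine detail anyway. A rough back-of-the-envelope sketch of the potential savings is below; the eccentricity thresholds and shading rates are illustrative assumptions, not any headset's actual numbers, and the function names are hypothetical.

```python
def shading_rate(pixel_angle_deg: float) -> float:
    """Fraction of full resolution to shade at, given the angular distance
    (degrees) between a pixel and the gaze point. Three illustrative zones:
    full detail in the fovea, coarser toward the periphery."""
    if pixel_angle_deg <= 5.0:      # foveal region: full resolution
        return 1.0
    elif pixel_angle_deg <= 20.0:   # mid-periphery: half resolution
        return 0.5
    else:                           # far periphery: quarter resolution
        return 0.25

def estimate_savings(fov_deg: float = 100.0, samples: int = 1000) -> float:
    """Estimate the fraction of shading work saved across a horizontal
    field of view, assuming the gaze is centred and GPU cost scales
    linearly with the shading rate."""
    total = 0.0
    for i in range(samples):
        # Angular distance of this sample from the centre of the FOV.
        angle = abs((i / (samples - 1)) * fov_deg - fov_deg / 2)
        total += shading_rate(angle)
    return 1.0 - total / samples
```

With these made-up zone sizes, roughly half the shading work across a 100° field of view is saved, which is why the technique (usually paired with eye tracking) is one of the main answers to the question above. Counterbalancing the headset with rear-mounted cable management or battery weight is, by contrast, a purely mechanical question.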
 
My question would be, "Some people cannot use VR due to motion sickness or other health reasons. Is there a plan to somehow include these people in VR-only games?"
 
Most current and previous game engines are built for viewing from a flat-screen perspective. Is it easier to create content in VR versus looking at a screen?
 
If you could ask a VR developer one thing, what would it be?
I want to know if there is any thought going toward the potential for these to be used at a desk, with a HOTAS or mouse-and-keyboard setup. I keep seeing them needing a pair of 3D controllers and half a room to move around in, and that's not something I want.
 


Actually, they do have something like this in the works for Rez∞
