Originally Posted by stargate125645
That would be the standard use I mentioned.
If games need to be programmed to work with VR, that makes a lot more sense, as many games won't be then. I'm not sure how turning one's head is going to work with a standard computer setup, either. Entire moving platform, or only limited view from turning one's head?
You're not making much sense to me. (Not intending to be insulting, just saying I don't get what you're saying.)
Standard computer setup with a screen: you play a game and operate input devices to move your point of view (mouse) and body position (keyboard, W-A-S-D). The screen in your room stays still, but the part of the 'game world' you see inside the bezel moves around.
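For the sake of argument, here's that desktop case as a rough C++ sketch - every name here is invented, not any engine's real API:

```cpp
#include <cmath>

// Hypothetical minimal desktop camera: mouse deltas steer the view,
// W-A-S-D moves the body.
struct Camera {
    float yaw = 0, pitch = 0;   // view direction (radians)
    float x = 0, y = 0, z = 0;  // body position in the world
};

void updateDesktopCamera(Camera& c, float mouseDX, float mouseDY,
                         float moveForward, float moveStrafe) {
    const float sensitivity = 0.002f;
    c.yaw   += mouseDX * sensitivity;   // mouse = look direction
    c.pitch -= mouseDY * sensitivity;
    // W-A-S-D = body position, relative to where you're looking
    c.x += std::sin(c.yaw) * moveForward + std::cos(c.yaw) * moveStrafe;
    c.z += std::cos(c.yaw) * moveForward - std::sin(c.yaw) * moveStrafe;
    // The renderer draws the world from (x,y,z) along (yaw,pitch); the
    // physical screen never moves, only the slice of world inside it.
}
```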
HMD on face: your visual field is pretty much filled with the display - maybe a little tunnel vision, as if you were wearing glasses with side shields. You turn your head with the head-mounted display on, and the motion sensors in the HMD (or something tracking the HMD externally, as with the Vive lighthouses) react by panning the portion of the game world you see to match. You'd probably still use the W-A-S-D keys to move your 'body position'. The extra immersion comes because (a) there's no peripheral 'world' outside the screen bezel, and (b) you're not moving something in your hand and seeing the world shift - you're turning your head and seeing another part of the world the same way you do IRL. But the game can effectively be programmed "the same"; you're just switching input devices, essentially.
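And the HMD case really is nearly the same code, just fed by a different device (reusing the hypothetical Camera struct from the sketch above):

```cpp
// Same camera, same movement code - only the look input changes. The
// yaw/pitch now come from whatever pose the headset's tracker reports
// instead of accumulated mouse deltas (names still hypothetical).
void updateHmdCamera(Camera& c, float headYaw, float headPitch,
                     float moveForward, float moveStrafe) {
    c.yaw   = headYaw;    // absolute head pose replaces mouse-look
    c.pitch = headPitch;
    c.x += std::sin(c.yaw) * moveForward + std::cos(c.yaw) * moveStrafe;
    c.z += std::cos(c.yaw) * moveForward - std::sin(c.yaw) * moveStrafe;
}
```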
The biggest issue with games "not programmed for VR" is where that head movement = view movement equation breaks or is inconsistent. Think of an RTS game where you move your cursor all over the screen and the screen only 'pans' when you hit the edge...how do you manage that with an HMD? A dead zone where head movement only moves a cursor up to a certain off-center look angle, and then the world starts shifting (see the sketch below)? Seems like that would cause nausea.

Or 3rd-person games where you stand 'behind' your main character's shoulder (Witcher, Dead Space, etc.) - making the world swivel around with your head probably causes a sense of dislocation that can induce nausea, same as if your head movement and the view 'tracking' are out of sync, whereas seeing the world swivel around a character on a fixed screen in front of you is easier to absorb. Or switching from in-engine (head movement moves FOV) to cutscenes (a cutscene is rendered like a movie camera, and you don't get to look around while it's on). Swapping in and out of that feedback immersion is going to be interesting on the human processor.
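To make that RTS dead-zone idea concrete - every number here is pure invention:

```cpp
#include <cmath>

// Inside the zone, the head angle maps to a cursor position; past it,
// the leftover angle pans the camera instead.
void rtsHeadInput(float headYawOffset, float& cursorX, float& cameraPanX) {
    const float deadZone = 0.26f;               // ~15 degrees off-center
    const float pxPerRadian = 1200.0f;
    if (std::fabs(headYawOffset) < deadZone) {
        cursorX = headYawOffset * pxPerRadian;  // head = cursor in the zone
    } else {
        // Head = camera outside the zone. This is the discontinuity
        // between what your head does and what your eyes see - the
        // suspected nausea trigger.
        cameraPanX += headYawOffset - std::copysign(deadZone, headYawOffset);
    }
}
```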
Games programmed for VR can avoid those pitfalls - for example, in-engine 'cutscenes' like Half-Life 2's that let you walk around or look around while the chatter goes on. More advanced approaches that track things like hand position or controllers beyond just where you're looking (Vive lighthouses or Oculus hand controls) may permit you to look one way while aiming 'off the center' of your field of view, further increasing the immersion.
Plus, an HMD should be able to show different left- and right-eye views so what you see is properly 3D (parallax)....part of the reason everyone talks about higher graphics requirements is that it's computing two different views, not one. Games not programmed for VR might not get parallax - whatever the HMD drivers are might let you port the game onto your face, but it might still just look 2D, since your two eyes are seeing the same thing. I'm not sure on that count. (Seems to me that with the LCD shutter 3D glasses synced to NVidia cards years ago, there were drivers that took the Z-buffer 'depth' and essentially created parallax for games that weren't "designed to be" 3D...I remember playing the first version of Alien that way and nearly pissing myself, and not just because of the headaches that a 30fps refresh flicker gave me. So the same should be possible with the HMD drivers, but I've not heard this confirmed as working for absolutely "anything".)
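The 'two views' part is literally just rendering the scene twice from slightly offset eye positions - a rough sketch (typical IPD is around 63mm; all names invented):

```cpp
// Why VR roughly doubles the rendering load: each frame is drawn twice,
// once per eye, the camera shifted by half the interpupillary distance
// (IPD). Plain vectors here; a real engine would use view matrices.
struct Vec3 { float x, y, z; };

void eyePositions(Vec3 head, Vec3 rightDir, float ipdMeters,
                  Vec3& leftEye, Vec3& rightEye) {
    float h = ipdMeters * 0.5f;   // e.g. 0.063f / 2
    leftEye  = { head.x - rightDir.x*h, head.y - rightDir.y*h, head.z - rightDir.z*h };
    rightEye = { head.x + rightDir.x*h, head.y + rightDir.y*h, head.z + rightDir.z*h };
    // renderScene(leftEye); renderScene(rightEye);  // two passes, not one.
    // Feed both eyes the same image and it looks flat - i.e. 2D.
}
```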
In a near-ideal case, take a game like Elite: Dangerous. Normally you fly your ship, pointing it where you want to shoot (ignoring gimbaled and turreted weapons for now). But you also have left and right side displays in your cockpit, so by hitting a button mapped to 'pilot view' (or a number for one of those displays) you 'focus' on that display and bring it into the foreground - e.g., I can look to the left inside my cockpit while not re-aiming the ship. But I can't do both at once, at least not with just a keyboard and mouse! It's a toggle as to whether my mouse is controlling roll/pitch or "pilot looking direction". With an HMD, my head can control my "pilot looking direction" while the mouse continues to control roll/pitch flight direction. (And short of an HMD, you can also do this with a joystick and a thumb hat, or perhaps with a head-movement or eye-tracking sensor...but to me a piloting game is one of the best initial HMD candidates. They also assume the user is 'sitting', so the lack of motion for the rest of your body - not just look angle - is easier to accept.)
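Under the hood that decoupling is just composing two rotations - a sketch with a hand-rolled quaternion type (not Elite's actual code, obviously):

```cpp
// The mouse (or stick) rotates the ship; the HMD rotates the pilot's
// head; the rendered view is the composition of the two, so the toggle
// between 'flying' and 'looking' simply disappears.
struct Quat {
    float w, x, y, z;
    Quat operator*(const Quat& q) const {   // standard Hamilton product
        return { w*q.w - x*q.x - y*q.y - z*q.z,
                 w*q.x + x*q.w + y*q.z - z*q.y,
                 w*q.y - x*q.z + y*q.w + z*q.x,
                 w*q.z + x*q.y - y*q.x + z*q.w };
    }
};

Quat cockpitViewOrientation(const Quat& shipFromMouse, const Quat& headFromHmd) {
    // Ship keeps flying where the mouse points; the view pans independently.
    return shipFromMouse * headFromHmd;
}
```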
Even more advanced conceptual HMDs like Fove want to engage eye tracking and head tracking independently. So you turn your head and the part of the world 'in front of you' moves, but you can also move your eyes inside the HMD and look at any point within what you can currently see (now my RTS example doesn't cause nausea!). Adding separate head and eye tracking helps both to set an 'aim point' and, in the case of foveated rendering, to damp down the graphical fidelity off-axis and reduce the graphical throughput requirement (sketch below).
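Conceptually, the foveated part boils down to something like this (the radii and rates are pure invention):

```cpp
#include <cmath>

// Render at full resolution only near the tracked gaze point, coarser
// elsewhere - cutting GPU load where the eye can't resolve detail anyway.
float shadingRate(float pixelX, float pixelY, float gazeX, float gazeY) {
    float d = std::hypot(pixelX - gazeX, pixelY - gazeY); // px from gaze
    if (d < 200.0f) return 1.0f;    // fovea: full quality
    if (d < 500.0f) return 0.5f;    // mid-periphery: half resolution
    return 0.25f;                   // far periphery: quarter resolution
}
```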
Count me in as one of the ones highly interested in an HMD (right now leaning Vive - want to give the underdog a chance, and I hate FB with a passion...but I'll do my best to wait and see how the initial reactions are...for at least a month). Still want my nice big 21:9 for many other games (and virtually all non-gaming compute work, with the possible exception of CAD modeling).
Crap, wall of text - hope I didn't get completely incoherent.

Edited by rtrski - 2/17/16 at 1:26pm