Locomotion based on In-Place Motion Sensing
To move in the VR landscape, the user executes a specific physical action on the spot. This could be something simple, like a small forward swing of the right arm to step the virtual right leg forward. This approach follows from the Sensory Conflict Theory: the conflict is minimized because the user's mental model, which registers that the arm moved, is now matched by the corresponding movement in the VR landscape.
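A minimal per-frame sketch of this idea in Python, assuming the VR runtime supplies hand-controller velocities and the headset's forward vector each frame. The names (`locomotion_step`, `Vec3`) and all thresholds here are illustrative, not any particular SDK's API:

```python
from dataclasses import dataclass
import math

@dataclass
class Vec3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

SWING_THRESHOLD = 0.3  # m/s: arm motion slower than this is treated as idle
SPEED_SCALE = 0.8      # maps swing speed to virtual walking speed
MAX_SPEED = 2.0        # m/s cap on virtual speed

def locomotion_step(player_pos: Vec3, left_vel: Vec3, right_vel: Vec3,
                    head_forward: Vec3, dt: float) -> Vec3:
    """Advance the player one frame based on in-place arm swings."""
    # Average the vertical swing speed of both hands.
    swing = (abs(left_vel.y) + abs(right_vel.y)) / 2.0
    if swing < SWING_THRESHOLD:
        return player_pos  # arms idle: no virtual movement

    speed = min(swing * SPEED_SCALE, MAX_SPEED)

    # Move along the headset's horizontal forward direction, so the
    # physical action (the swing) and the virtual motion stay coupled.
    norm = math.hypot(head_forward.x, head_forward.z) or 1.0
    player_pos.x += (head_forward.x / norm) * speed * dt
    player_pos.z += (head_forward.z / norm) * speed * dt
    return player_pos
```

The threshold keeps incidental hand motion from translating into movement, so only a deliberate, repeated swing produces locomotion.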
Reduce Motion Sickness using Sensory Outputs
According to the Sensory Conflict Theory, the user experiences motion sickness due to a mismatch between their senses and/or mental model. We could therefore output some form of sensory feedback to the user when they walk, to try to fill in that gap.
For example, when the user pushes a joystick to move the player through the VR landscape, a small vibration can be sent through the controller each time the virtual player's foot strikes the ground, simulating the feeling of actually stepping on it. This attempts to bridge the gap in the sensory cues.
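As a sketch, the footstep vibration can be driven by a simple step-cycle timer. The `trigger_haptic_pulse` callback below is a placeholder for whatever haptics call the target controller API actually provides; the stride and pulse constants are assumptions:

```python
STRIDE_PERIOD = 0.6    # seconds per virtual footstep at 1 m/s
PULSE_STRENGTH = 0.4   # normalized vibration amplitude (0..1)
PULSE_DURATION = 0.05  # seconds

class FootstepHaptics:
    """Fire a short vibration each time the virtual avatar's foot lands."""

    def __init__(self, trigger_haptic_pulse):
        # trigger_haptic_pulse(strength, duration) stands in for the
        # runtime's real haptics call.
        self.pulse = trigger_haptic_pulse
        self.phase = 0.0
        self.prev_phase = 0.0

    def update(self, move_speed: float, dt: float) -> None:
        if move_speed <= 0.0:
            self.phase = self.prev_phase = 0.0  # standing still: reset cycle
            return
        # Advance the step cycle; faster movement means faster steps.
        self.prev_phase = self.phase
        self.phase = (self.phase + dt * move_speed / STRIDE_PERIOD) % 1.0
        # A wrap of the phase marks a footfall: send the pulse.
        if self.phase < self.prev_phase:
            self.pulse(PULSE_STRENGTH, PULSE_DURATION)
```

Calling `update` once per frame with the player's current speed ties the vibration rhythm to the virtual gait, so the haptic cue arrives exactly when the visual footstep happens.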
Reduce Motion Sickness using the User Interface as a Rest Frame
In typical 2D/3D games, the User Interface is fixed to the game screen.
In VR, we can do the same thing by fixing the User Interface to the user's vision, just as it would be in a regular 2D/3D game. The User Interface can then act as a rest frame, since its elements serve as stable fixation points for the user during actual locomotion.
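A minimal sketch of such a head-locked HUD, assuming the engine exposes the head pose each frame (`Pose` and `UI_DISTANCE` are illustrative). Re-anchoring the UI to the head every frame keeps it stationary in the user's visual field, which is what lets it serve as a rest frame:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in world space
    forward: tuple   # unit view direction

UI_DISTANCE = 1.5  # metres in front of the eyes

def update_hud_pose(head_pose: Pose) -> Pose:
    """Re-anchor the HUD to the head each frame so it never moves
    relative to the user's vision, even while the world scrolls past
    during locomotion."""
    hx, hy, hz = head_pose.position
    fx, fy, fz = head_pose.forward
    hud_position = (hx + fx * UI_DISTANCE,
                    hy + fy * UI_DISTANCE,
                    hz + fz * UI_DISTANCE)
    # The HUD faces back toward the user along the view direction.
    return Pose(position=hud_position, forward=(-fx, -fy, -fz))
```

Because the HUD position is recomputed from the head pose every frame, it behaves exactly like screen-fixed UI in a flat game: however the user moves, the interface elements stay put in their field of view.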