As already mentioned, Virtual Reality poses new challenges for both the rendering pipeline and the application design.
Both Unity and Oculus provide best practice guidelines ([8], [9]), which state these challenges and provide possible solutions. Do note that most of these points are mere instructions
and ideas that have worked well for others. Most of them have not been scientifically proven or accepted as the general solution to a problem, and they might not always be applicable. To quickly introduce the reader to the topic of
VR, the most important issues and concepts are the following:
VR sickness
VR sickness - referred to in other related literature as VR nausea, simulator sickness, or motion sickness - is the main issue to deal with. Users of HMDs usually experience a nauseating feeling after only a few minutes of usage, depending on the application. It resembles classical motion sickness, but while that is usually triggered by actual movement (e.g. the bobbing of a ship), VR sickness is induced while remaining stationary: the body is at rest while the eyes register movement.
While the most common symptom is nausea, other users may feel disoriented, or find their eyes to be strained (oculomotor discomfort).
While the exact cause of VR sickness remains unknown at this point, the widely accepted theory is that the nausea is a counter-reaction of the body to conflicting sensations, a pattern that also occurs as a side-effect of poisoning. The natural reaction is to regurgitate the poison - resulting in nausea.
Especially in fast-changing environments, free rotational axes or interference with rotation control can lead to a faster onset of VR sickness. Most of the best practice points are intended to mitigate this feeling and
should serve to maximize user comfort. [17]
Hardware limits
The second big issue for VR today is the limitation of the hardware. Both the split display (images have to be rendered from two slightly different angles) and the high framerate require a high-end setup for an optimal experience.
Thus, a simple scene and optimized scene rendering can also improve the overall results significantly.
Rendering
Since Unity takes most of the rendering process into its own hands, few changes need to be made by the user. One of the possibilities is the newly introduced quality setting, which allows the creator to specify varying levels of
antialiasing, texture resolution, and similar parameters for a set of modes that can be chosen at runtime ("low quality" to "best quality"). Do note that this does not affect the complexity of the objects, although such behavior could in principle be implemented.
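Such a runtime quality selection can be sketched as a simple lookup table. The level names and parameter values below are illustrative only, not Unity's actual defaults or its `QualitySettings` API:

```python
# Hypothetical quality presets, mirroring the idea of Unity's quality
# settings (all names and values here are illustrative assumptions).
QUALITY_LEVELS = {
    "low quality":  {"antialiasing": 0, "texture_resolution": 0.25, "shadows": False},
    "medium":       {"antialiasing": 2, "texture_resolution": 0.5,  "shadows": True},
    "best quality": {"antialiasing": 8, "texture_resolution": 1.0,  "shadows": True},
}

def select_quality(level: str) -> dict:
    """Return the render parameters for the chosen quality level."""
    if level not in QUALITY_LEVELS:
        raise ValueError(f"unknown quality level: {level}")
    return QUALITY_LEVELS[level]
```

Note that, as stated above, object complexity is not part of such a preset, although a level-of-detail entry could in principle be added.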
Also, it is recommended to make use of techniques established in the gaming industry around 2000 to reduce the amount of dynamic calculations, e.g. static batching or texture atlases.
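The texture atlas idea can be illustrated by the UV remapping it requires: many small textures are packed into one large texture, so objects sharing the atlas can be batched into fewer draw calls. A minimal sketch for a square atlas of equally sized tiles (the function and its grid layout are illustrative assumptions):

```python
def atlas_uv(u: float, v: float, tile_x: int, tile_y: int, tiles_per_side: int):
    """Remap a texture coordinate (u, v) in [0, 1] of a single texture
    into the sub-rectangle it occupies inside a square texture atlas.
    Packing many small textures into one atlas lets the engine batch
    draw calls that would otherwise require separate texture binds."""
    scale = 1.0 / tiles_per_side
    return (tile_x + u) * scale, (tile_y + v) * scale
```

In practice, engines and asset tools perform this remapping automatically when an atlas is generated.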
A few other points do still hold for the game design:
Immersion should be maintained at all times, meaning the user spawns inside a virtual area and is not "placed inside" a character. Most Virtual Reality environments like Oculus Home or Steam VR [10]
provide an additional "home" outside of the individual applications to ensure a smooth transition when diving into a game or VR movie.
If the resources are available, supersampling and antialiasing should be enabled to avoid flickering, as is visible in this project's forest scenery. Due to the low resolution of current-gen devices, this can only be partially resolved by the aforementioned
techniques. Other methods include reducing the visibility distance to minimize flickering caused by small head movements.
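Supersampling in this context simply means rendering the per-eye image at a higher resolution than the display panel and downsampling it afterwards. A sketch of the render target size computation (the function name and the example resolution are illustrative assumptions, not device specifications):

```python
def render_target_size(width: int, height: int, supersampling: float):
    """Scale a per-eye render target by a supersampling factor.
    Rendering at e.g. 1.5x the panel resolution and downsampling
    reduces the flickering caused by the low display resolution,
    at the cost of significantly more fragment shading work."""
    return round(width * supersampling), round(height * supersampling)
```

The quadratic growth in pixel count (a 1.5x factor means 2.25x the pixels) is why this is only an option "if the resources are available".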
Optimization
When optimizing scenes or underlying code, the main goal is always the minimization of latency. The Oculus DK2 works best with a (constant) framerate of 75+ frames per second; a sudden drop in frames impacts the VR user more drastically
than a constantly lower framerate. Each time the framerate drops, the brain registers this as a kind of lag, which only fosters VR sickness.
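Detecting such drops amounts to comparing each frame time against the target frame budget (13.3 ms at 75 FPS). A minimal monitoring sketch, where the tolerance factor is an illustrative choice:

```python
def dropped_frames(frame_times_ms, target_fps=75, tolerance=1.25):
    """Return the indices of frames whose frame time exceeded the budget.
    A frame counts as 'dropped' here if it took more than `tolerance`
    times the target frame time (13.3 ms at 75 FPS), since such spikes
    are perceived far more strongly in VR than a constantly lower
    framerate."""
    budget_ms = 1000.0 / target_fps
    return [i for i, t in enumerate(frame_times_ms) if t > budget_ms * tolerance]
```

Logging these indices during test runs helps to correlate frame spikes with specific scene events, e.g. object spawns or scene transitions.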
One way to efficiently determine the suitability of a certain scene is the analysis of draw calls per frame. For the Gear VR [11]
(an HMD consisting of a headset mount and a current high-end smartphone), the approximate best-practice range lies between 50 and 500 draw calls per frame and eye, with 50k - 100k polygons per frame. While this is nowhere near the potential of desktop graphics
chips, it gives a feel for the order of magnitude of scene complexity.
Tracking
Tracking affects both the head tracking and the positional tracking of the user. The former refers to the high-frequency input of the head movements (for the Oculus Rift DK2, the update rate is 1000 Hz), which is correctly
handled by Unity, as long as it is assigned to a specific kind of controller (see "FPS controller"). It should always be consistent with the actual head movement of the user, since inconsistencies can easily create huge discomfort as well.
It is especially important that the program reacts to head movement even during "freezes", e.g. in menus or scene transitions. The latter was much harder to achieve, since Unity blocks some of the necessary channels and has not yet implemented
a trigger to enable this. Proposed workarounds can be found in "Transitions".
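The underlying principle - head tracking must keep running while the simulation is paused - can be sketched as a split update loop. The class and the `read_hmd_rotation` hook below are hypothetical illustrations, not Unity components:

```python
class GameLoop:
    """Minimal sketch: camera orientation is updated from the HMD every
    frame, even while the simulation itself is paused (e.g. in a menu).
    `read_hmd_rotation` stands in for the tracker input and is a
    hypothetical hook, not an engine API."""

    def __init__(self, read_hmd_rotation):
        self.read_hmd_rotation = read_hmd_rotation
        self.paused = False
        self.camera_rotation = (0.0, 0.0, 0.0)
        self.world_time = 0.0

    def tick(self, dt: float):
        # Head tracking must never freeze, otherwise the user
        # perceives the stalled view as lag, fostering VR sickness.
        self.camera_rotation = self.read_hmd_rotation()
        if not self.paused:
            self.world_time += dt  # simulation advances only when unpaused
```

In Unity terms this corresponds to keeping the camera driven by the HMD pose while gameplay systems are halted, which is exactly what the blocked channels mentioned above make difficult.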
For positional tracking, it is important to either strictly limit the pathing of the user, or to ensure a "clean" visibility and scene rendering from any point of view. Especially holes in objects caused by minimum z distances or back-face culling should be strictly
avoided.
Movement and acceleration
Movement is critical in itself. Optimal experiences include a limited freedom of movement. While this would heavily restrict the possibilities in a virtual museum, there are still some ways to improve the situation for first-person controls:
Minimize/Stabilize FOV
By adding a static frame of reference for the user (e.g. an instrument panel in a car or spacecraft), the edges of the vision become less critical, since sudden movements effectively impact the peripheral vision as well. As an additional remark, head bobbing should be avoided. Similarly, constant jumping or up/down movements may trigger strong VR sickness in some users.
Input devices
Although mouse and keyboard are the natural movement devices for a PC experience, users should consider the option of a gamepad or even Oculus Touch. Both are more haptic in their use with the HMD put on, and both provide a natural limitation of input compared to mouse and keyboard (mouse tracking would interfere with the head tracking, so it is disabled anyway).
Avoid movement in sideward direction
As the title already implies, sideward movement is perceived quite badly. If it is necessary at all, it should at least be slower than forward/backward movement and use a different acceleration rate.
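A direction-dependent speed table is one simple way to implement this. The strafe factor of 0.5 below is an illustrative choice, not a recommended constant:

```python
def movement_speed(direction: str, base_speed: float = 3.0) -> float:
    """Direction-dependent movement speed: sideward (strafe) movement
    is slowed relative to forward/backward movement, as suggested
    above. The 0.5 strafe factor is an illustrative assumption."""
    factors = {"forward": 1.0, "backward": 1.0, "left": 0.5, "right": 0.5}
    return base_speed * factors[direction]
```

An analogous table can hold per-direction acceleration rates, so that strafing both starts slower and ramps up more gently.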
Acceleration
Sudden changes in velocity should be avoided, as they are not natural. This also includes zooming effects, especially when not in sync with head movement. One compromise is scaling the relative head movement to simulate a zooming-like effect. This might still be uncomfortable for some users, which is why the rotation scene in this project does not include such a feature.
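Avoiding sudden velocity changes typically means clamping the acceleration per frame. A minimal sketch of such a velocity ramp (the function and its parameters are illustrative, not an engine API):

```python
def ramp_velocity(current: float, target: float, max_accel: float, dt: float) -> float:
    """Move the current velocity toward the target without exceeding a
    maximum acceleration, so that velocity changes stay gradual
    instead of abrupt - the property the text above asks for."""
    delta = target - current
    max_step = max_accel * dt
    if abs(delta) <= max_step:
        return target
    return current + max_step if delta > 0 else current - max_step
```

Calling this once per frame turns an instantaneous key press into a short, smooth acceleration phase.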
User interfaces (UI)
User interfaces are tricky, because they usually involve static placement and stop the head tracking process.
Thus, UI elements in VR can be presented in two different ways:
The first method is a floating menu, which either rotates with the user or is statically projected in one direction. While the environment may be paused, it is important that the user can still look around in the scene and retain a continuous feeling of immersion.
A second variant is a so-called diegetic UI, whose elements are integrated into the environment, e.g. an interactive map on the wall or a working clock inside a cockpit.
Most importantly, user interfaces make up a large part of the user's focus during playtime. To make this as pleasant as possible, the menu should be placed at a distance of roughly 0.7 m - 3.5 m. Since the general scaling consensus estimates 1 m to be roughly one unit in virtual space, this can easily be implemented, and it should be respected at all times, even if the menu is only displayed for a short period. Especially people with glasses tend to react to violations of this range in unwanted ways, and might get a headache much earlier than users without glasses.
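With the 1 m = 1 unit convention, enforcing the comfortable range is a one-line clamp. A sketch (the function name is an illustrative assumption):

```python
def clamp_menu_distance(distance_m: float, near: float = 0.7, far: float = 3.5) -> float:
    """Clamp a menu's placement distance into the comfortable
    0.7 m - 3.5 m range quoted above; with the 1 unit = 1 m
    convention the result is used directly as a world-space
    distance in front of the camera."""
    return max(near, min(far, distance_m))
```

Applying this whenever a menu is spawned guarantees the comfort range even for briefly shown UI elements.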
Broad testing audience
Currently, most people are not used to the feeling of immersive VR, which may heighten their susceptibility to VR sickness. Developers, as frequent testers, are usually much less susceptible, which is why a broad test audience should be included in the development process, preferably spanning various age groups and both genders. Up to this point, no general trend has been published, but this may change in the near future.
If the reader is interested in a more detailed introduction, the extended best practice guide by Oculus [12] is highly recommended. Another recommended read is the VR optimization guide for the Unreal Engine 4 [13], which discusses topics that are also relevant to Unity and VR in general.