VR Resources + WIP
Hey, I've been working on the VR bedtime-story experience and modeling a pillow in C4D. Here's a list of resources:
Vive Game Jam (mainly focused on gaming, but some of the best VR devs were there)
The following user experience guidelines comprise our recommendations to ensure that your Leap-enabled application is easy to learn and use.
· Keep in mind that symbology can be difficult to learn and memorize.
Avoid forcing users to learn complex hand gestures to interact with your application.
· Instead, draw inspiration from physical interaction and real-world behaviors.
The more physically inspired interactions are, the less training a person needs and the more intuitive and natural your application feels.
· Don’t feel constrained by the limitations or inconveniences of the real world: this is your world.
Interaction doesn’t have to be the way it has always been. It can be any way we imagine it to be. Why force the user to reach all the way out and grab an object? Why not have the object reach back? Give them “the force”!
· The user should feel as if their intent is amplified rather than subdued or masked.
For example, users often like their movements to be amplified when using a mouse (i.e. they don’t need 10 inches of mouse movement to move 10 inches on screen). For gestural interactions, amplifying or exaggerating responses can have an even more positive result. Keep in mind that some people are more sensitive than others, so link this exaggeration to a sensitivity setting for users to modify this effect to their preference.
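The amplification-plus-sensitivity idea above can be sketched in a few lines. This is a hypothetical mapping, not a real Leap Motion API; the pixel scale and default sensitivity are assumptions for illustration.

```python
# Sketch of pointer amplification with a user-adjustable sensitivity
# setting. All names and constants here are illustrative assumptions.

PIXELS_PER_MM = 4.0  # assumed display mapping

def amplify(hand_delta_mm: float, sensitivity: float = 2.0) -> float:
    """Map a physical hand movement (mm) to an on-screen movement (px).

    sensitivity > 1 exaggerates motion so small gestures feel powerful;
    users who find this twitchy can lower it in a settings menu.
    """
    return hand_delta_mm * sensitivity * PIXELS_PER_MM
```

Exposing `sensitivity` as a settings-menu slider lets each user tune the exaggeration to their own preference, as the guideline suggests.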
· Concentrate on giving the user dynamic feedback to their actions. The more feedback they have, the more precisely they can interact with your software.
For example, the user needs to know when they are “pushing” a button, but they can be more effective if they can also see when they are hovering over a button, or how much they are pressing it.
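One way to realize this is to expose a graded button state rather than a binary pressed/not-pressed flag, so the UI can render continuous feedback (highlight on hover, visible depression while pressing). A minimal sketch, with assumed distances and names:

```python
# Hypothetical graded button feedback based on fingertip distance to the
# button surface (positive = in front of it, negative = pushed past it).

def button_state(fingertip_z_mm: float,
                 hover_range_mm: float = 40.0,
                 press_depth_mm: float = 10.0):
    """Return (state, press_fraction) for rendering feedback."""
    if fingertip_z_mm > hover_range_mm:
        return ("idle", 0.0)          # too far away to matter
    if fingertip_z_mm > 0.0:
        return ("hover", 0.0)         # near the button: highlight it
    # Finger has crossed the surface: report how far it is pressed.
    fraction = min(-fingertip_z_mm / press_depth_mm, 1.0)
    state = "pressed" if fraction >= 1.0 else "pressing"
    return (state, fraction)
```

Rendering the `press_fraction` (e.g. as button depression or a fill bar) gives the user the precise, dynamic feedback the guideline calls for.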
· On screen visuals (such as representations of hands, tools, or digital feedback) should be simple, functional, and non-intrusive.
The user should not be distracted from the task by their tools or environment. Decoration should not distract from your purpose.
· Require more deliberate action for destructive or non-reversible acts than for harmless ones.
Subtle gestures should be reserved for subtle actions. Conversely, an act such as closing an application or deleting a file can be a non-reversible event requiring a more deliberate action. Double-check with the user when unsure, for example with a confirmation prompt.
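The gating described above can be sketched as a small dispatcher that lets harmless actions fire immediately but holds destructive ones behind a confirmation step. Action names here are illustrative only.

```python
# Sketch: require an explicit confirmation flag for destructive,
# non-reversible actions; harmless ones execute right away.

DESTRUCTIVE = {"delete_file", "close_application"}

def perform(action: str, confirmed: bool = False) -> str:
    if action in DESTRUCTIVE and not confirmed:
        return "confirm?"   # prompt the user before committing
    return "done"
```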
· Provide a clear delineation and specific sense of modality between acts of navigation and interaction, unless both are simple or one is handled automatically (or with assistance). Mixing the two in a complex situation can lead to confusion or disorientation.
For example, moving an object while having the user simultaneously position their viewing angle inside a 3D environment is inherently difficult. However, if the viewing angle moves automatically in response to the user’s movement, then working with the object is easier. Likewise, when navigating a large data set the user will want the view to move easily, but when highlighting a portion of the data the view should remain still.
· Overall, imagine that your user is faced with no instructions or tutorials on how to use your application.
Strive at all costs to make their first intuitive guesses the right ones. Where appropriate, create more than one proper way to do something.
• UIs should be a 3D part of the virtual world, sitting approximately 2-3 meters away from the viewer, even if the UI is simply drawn onto a flat polygon, cylinder, or sphere that floats in front of the user.
• Don’t require the user to swivel their eyes in their sockets to see the UI. Ideally, your UI should fit inside the middle 1/3rd of the user’s viewing area; otherwise, they should be able to examine it with head movements.
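The "middle 1/3rd" rule can be expressed as a simple angular check. The 90-degree horizontal FOV below is an assumption (headset FOVs vary), and the function name is hypothetical:

```python
# Sketch: is a UI element inside the comfortable central third of the
# user's view? The middle third of the FOV spans +/- fov/6 from center.

def in_comfort_zone(element_angle_deg: float, fov_deg: float = 90.0) -> bool:
    """element_angle_deg is the element's angle off the view center."""
    return abs(element_angle_deg) <= fov_deg / 6.0
```

Elements that fail this check shouldn't force eye strain; instead, let the user bring them into view with a natural head turn.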
• Use caution for UI elements that move or scale with head movements (e.g., a long menu that scrolls or moves as you move your head to read it). Ensure they respond accurately to the user’s movements and are easily readable without creating distracting motion or discomfort.
• Strive to integrate your interface elements as intuitive and immersive parts of the 3D world. For example, ammo count might be visible on the user’s weapon rather than in a floating HUD.
• Draw any crosshair, reticle, or cursor at the same depth as the object it is targeting; otherwise, it can appear as a doubled image when it is not at the plane of depth on which the eyes are converged.
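A common way to implement this is to raycast from the view into the scene and place the reticle at the hit distance, falling back to the comfortable ~2.5 m UI depth when nothing is hit. This is a minimal sketch with hypothetical names, not a specific engine's API:

```python
# Sketch: position the reticle at the depth of whatever it targets, so
# both eyes converge on it at the same plane as the object.

def reticle_position(ray_origin, ray_dir, hit_distance,
                     default_distance=2.5):
    """Return the reticle's world position along the gaze ray.

    hit_distance is the raycast hit distance in meters, or None when the
    ray hits nothing (then we fall back to a comfortable fixed depth).
    """
    d = hit_distance if hit_distance is not None else default_distance
    return tuple(o + d * v for o, v in zip(ray_origin, ray_dir))
```

Scaling the reticle with distance as well keeps its apparent size constant, so it doesn't shrink when drawn on faraway objects.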