Oculus Rift in Action
Bradley Austin Davis, Karen Bryla, Alex Benton

Book | Softcover
440 pages
2015
Manning Publications (publisher)
978-1-61729-219-4 (ISBN)
CHF 73.25 incl. VAT
  • An in-depth guide to creating immersive VR experiences
  • Shows the right way to create compelling VR applications
  • Plenty of useful examples with detailed instructions

The Oculus Rift is an exciting next-generation VR headset developed by OculusVR. Whether it's in a game, an architectural walk-through, or a teaching simulation, the goal of any immersive virtual reality experience is to make users feel like they're in the middle of the action.

With precise, high-quality optics and a flexible programmatic interface, the Rift provides a real-life field of view and head-tracking hardware for natural interaction, finally nailing the feeling of being there.

Oculus Rift in Action introduces the powerful Oculus Rift headset and teaches you how to integrate its many features into 3D games and other virtual reality experiences. You'll start by understanding the capabilities of the Rift hardware. Then you'll follow interesting and instantly relevant examples that walk you through programming real applications using the Oculus SDK. Examples are provided both for using the Oculus C API directly and for using Unity, a popular development platform and 3D graphics engine, with the Oculus Unity integration package.
Topics include:
  • Creating immersive VR experiences
  • Integrating the Rift with the Unity3D SDK
  • Implementing the mathematics of 3D
  • Avoiding motion-sickness triggers
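
To give a flavor of the C API examples covered in part 2, the sketch below polls the head tracker and prints its orientation. It is not taken from the book's own listings; it assumes a DK2-era Oculus SDK (roughly the 0.4.x line), whose C API exposes ovr_Initialize, ovrHmd_Create, and ovrHmd_GetTrackingState.

    // Minimal sketch (not from the book): polling head orientation with the
    // DK2-era Oculus C API (SDK ~0.4.x). Error handling is omitted for brevity.
    #include <OVR_CAPI.h>
    #include <cstdio>

    int main() {
        ovr_Initialize();                         // start the Oculus runtime
        ovrHmd hmd = ovrHmd_Create(0);            // open the first attached headset
        if (!hmd) {
            hmd = ovrHmd_CreateDebug(ovrHmd_DK2); // fall back to a virtual DK2
        }

        // Request orientation tracking, with yaw correction and position if available.
        ovrHmd_ConfigureTracking(hmd,
            ovrTrackingCap_Orientation |
            ovrTrackingCap_MagYawCorrection |
            ovrTrackingCap_Position, 0);

        for (int i = 0; i < 100; ++i) {
            ovrTrackingState state =
                ovrHmd_GetTrackingState(hmd, ovr_GetTimeInSeconds());
            ovrQuatf q = state.HeadPose.ThePose.Orientation;
            std::printf("orientation (x y z w): %.3f %.3f %.3f %.3f\n",
                        q.x, q.y, q.z, q.w);
        }

        ovrHmd_Destroy(hmd);
        ovr_Shutdown();
        return 0;
    }

Chapters 2 and 3 build this functionality up step by step and wrap it in the reusable GlfwApp base class, while the Unity chapters reach the same result without code via the OVRPlayerController prefab.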

Readers can use this book even if they don't yet own the Oculus Rift hardware. Some experience with C++ or another OO language is required for the programming examples. No previous knowledge of optics, display technology, or motion tracking is expected.

Bradley Austin Davis is a software developer for IMDb.com and the maintainer of the community version of the Oculus VR SDK on GitHub.

Karen Bryla is a freelance technical writer and developer.

Philip Alexander Benton is an associate lecturer in Advanced 3D Graphics at the University of Cambridge and a senior software engineer at Google.

foreword
preface
acknowledgments
about this book
about the authors
author online
about the cover illustration
Part 1 Getting Started
1. Meet the Oculus Rift
1.1. Why support the Rift?
1.1.1. The call of virtual reality
1.1.2. But what about the Rift?
1.2. How is the Rift being used today?
1.3. Get to know the Rift hardware
1.3.1. The DK2
1.3.2. The DK1
1.3.3. The GPU
1.4. How the Rift works
1.4.1. Using head tracking to change the point of view
1.4.2. Rendering an immersive view
1.5. Setting up the Rift for development
1.6. Dealing with motion sickness
1.7. Development paths
1.8. Summary
Part 2 Using the Oculus C API
2. Creating Your First Rift Interactions
2.1. SDK interfaces
2.1.1. Oculus runtime
2.1.2. Oculus SDK
2.2. Working with the SDK
2.2.1. SDK management
2.2.2. Managing the HMD
2.3. Getting input from the head tracker
2.3.1. Reserving a pointer to the device manager and locating the headset
2.3.2. Fetching tracker data
2.3.3. Reporting tracker data to the console
2.3.4. Exiting and cleaning up
2.3.5. Understanding the output
2.4. A framework for demo code: the GlfwApp base class
2.5. Rendering output to the display
2.5.1. The constructor: accessing the Rift
2.5.2. Creating the OpenGL window
2.5.3. Rendering two rectangles, one for each eye
2.6. What’s next?
2.7. Summary
3. Pulling Data Out of the Rift: Working with the Head Tracker
3.1. The head tracker API
3.1.1. Enabling and resetting head tracking
3.1.2. Receiving head tracker data
3.2. Receiving and applying the tracker data: an example
3.2.1. Initial setup and binding
3.2.2. Fetching orientation
3.2.3. Applying the orientation to the rendered scene
3.3. Additional features: drift correction and prediction
3.3.1. Drift correction
3.3.2. Prediction
3.3.3. Using drift correction and prediction
3.4. Summary
4. Sending Output to the Rift: Working with the Display
4.1. Targeting the Rift display
4.1.1. Extended vs. Direct HMD mode
4.1.2. Creating the OpenGL window: choosing the display mode
4.1.3. Creating the OpenGL window: Extended Desktop mode
4.1.4. Creating the OpenGL window: Direct HMD mode
4.1.5. Full screen vs. windowed: extensions with glfwCreateWindow()
4.1.6. Dispensing with the boilerplate
4.2. How the Rift display is different: why it matters to you
4.2.1. Each eye sees a distinct half of the display panel
4.2.2. How the lenses affect the view
4.3. Generating output for the Rift
4.4. Correcting for lens distortion
4.4.1. The nature of the distortion
4.4.2. SDK distortion correction support
4.4.3. Example of distortion correction
4.5. Summary
5. Putting It All Together: Integrating Head Tracking and 3D Rendering
5.1. Setting the scene
5.2. Our sample scene in monoscopic 3D
5.3. Adding stereoscopy
5.3.1. Verifying your scene by inspection
5.4. Rendering to the Rift
5.4.1. Enhanced data for each eye
5.4.2. Improved user settings
5.4.3. Setting up the SDK for distortion rendering
5.4.4. The offscreen framebuffer targets
5.4.5. The Oculus texture description
5.4.6. Projection and modelview offset
5.4.7. The Rift’s rendering loop
5.5. Enabling sensors
5.5.1. Implications of prediction
5.5.2. Getting your matrices in order
5.6. Summary
6. Performance and Quality
6.1. Understanding VR performance requirements
6.2. Detecting and preventing performance issues
6.3. Using timewarp: catching up to the user
6.3.1. Using timewarp in your code
6.3.2. How timewarp works
6.3.3. Limitations of timewarp
6.4. Advanced uses of timewarp
6.4.1. When you’re running early
6.4.2. When you’re running late
6.5. Dynamic framebuffer scaling
6.6. Summary
Part 3 Using Unity
7. Unity: Creating Applications that Run on the Rift
7.1. Creating a basic Unity project for the Rift
7.1.1. Use real-life scale for Rift scenes
7.1.2. Creating an example scene
7.2. Importing the Oculus Unity 4 Integration package
7.3. Using the Oculus player controller prefab: getting a scene on the Rift, no scripting required
7.3.1. Adding the OVRPlayerController prefab to your scene
7.3.2. Doing a test run: the Unity editor workflow for Rift applications
7.3.3. The OVRPlayerController prefab components
7.4. Using the Oculus stereo camera prefab: getting a scene on the Rift using your own character controller
7.4.1. The OVRCameraRig prefab components
7.5. Using player data from the user’s profile
7.5.1. Ensuring the user has created a profile
7.6. Building your application as a full-screen standalone application
7.7. Summary
8. Unity: Tailoring Your Application for the Rift
8.1. Creating a Rift-friendly UI
8.1.1. Using the Unity GUI tools to create a UI
8.1.2. Creating an in-world UI
8.2. Using Rift head tracking to interact with objects
8.2.1. Setting up objects for detection
8.2.2. Selecting and moving objects
8.2.3. Using collision to put the selected object down
8.3. Easing the user into VR
8.3.1. Knowing when the health and safety warning has been dismissed
8.3.2. Re-centering the user’s avatar
8.3.3. Creating splash scenes
8.4. Quality and performance considerations
8.4.1. Measuring quality: looking at application frame rates
8.4.2. Using timewarp
8.4.3. (Not) Mirroring to the display
8.4.4. Using the Unity project quality settings
8.5. Summary
Part 4 The VR User Experience
9. User Interface Design for Virtual Reality
9.1. New UI paradigms for VR
9.1.1. UI conventions that won’t work in VR and why
9.1.2. Can your world tell your story?
9.1.3. Getting your user from the desktop to VR
9.1.4. Cutscenes
9.2. Designing 3D user interfaces
9.2.1. Criteria for a good UI
9.2.2. Guidelines for 3D scene and UI design
9.2.3. The mouse is mightier than the sword
9.2.4. Using the Rift as an input device
9.3. Animations and avatars
9.3.1. Cockpits and torsos: context in the first person
9.3.2. Character animations
9.4. Tracking devices and gestural interfaces
9.4.1. Beyond the gamepad
9.4.2. Gestural interfaces
9.5. Summary
10. Reducing Motion Sickness and Discomfort
10.1. What does causing motion sickness and discomfort mean?
10.2. Strategies and guidelines for creating a comfortable VR environment
10.2.1. Start with a solid foundation for your VR application
10.2.2. Give your user a comfortable start
10.2.3. The golden rule of VR comfort: the user is in control of the camera
10.2.4. Rethink your camera work: new approaches for favorite techniques
10.2.5. Make navigation as comfortable as possible: character movement and speed
10.2.6. Design your world with VR constraints in mind
10.2.7. Pay attention to ergonomics: eyestrain, neck strain, and fatigue
10.2.8. Use sound to increase immersion and orient the user to action
10.2.9. Don’t forget your user: give the player the option of an avatar body
10.2.10. Account for human variation
10.2.11. Help your users help themselves
10.2.12. Evaluate your content for use in the VR environment
10.2.13. Experiment as much as possible
10.3. Testing your VR application for motion sickness potential
10.3.1. Use standardized motion and simulator sickness questionnaires
10.3.2. Test with a variety of users and as many as you can
10.3.3. Test with new users
10.3.4. Test with users who have set their personal profile
10.3.5. Test in stages
10.3.6. Test in different display modes
10.4. Summary
Part 5 Advanced Rift Integrations
11. Using the Rift with Java and Python
11.1. Using the Java bindings
11.1.1. Meet our Java binding: JOVR
11.1.2. The Jocular-examples project
11.1.3. The RiftApp class
11.1.4. The RiftDemo class
11.2. Using the Python bindings
11.2.1. Meet our Python binding: PyOVR
11.2.2. Development environment
11.2.3. The pyovr-examples project
11.2.4. The RiftApp class
11.2.5. The RiftDemo class
11.3. Working with other languages
11.4. Summary
12. Case Study: A VR Shader Editor
12.1. The starting point: Shadertoy
12.2. The destination: ShadertoyVR
12.3. Making the jump from 2D to 3D
12.3.1. UI layout
12.3.2. User inputs
12.3.3. Project planning
12.3.4. Picking our feature set
12.3.5. UI design
12.3.6. Windowing and UI libraries
12.4. Implementation
12.4.1. Supporting the Rift in Qt
12.4.2. Offscreen rendering and input processing
12.5. Dealing with performance issues
12.6. Building virtual worlds on the GPU
12.6.1. Raycasting: building 3D scenes one pixel at a time
12.6.2. Finding the ray direction in 2D
12.6.3. Finding the ray direction in VR
12.6.4. Handling the ray origin: stereopsis and head tracking
12.6.5. Adapting an existing Shadertoy shader to run in ShadertoyVR
12.7. Summary
13. Augmenting Virtual Reality
13.1. Real-world images for VR: panoramic photography
13.1.1. Panorama photos
13.1.2. Photo spheres
13.1.3. Photo spheres… in space!
13.2. Using live webcam video in the Rift
13.2.1. Threaded frame capture from a live image feed
13.2.2. Image enhancement
13.2.3. Proper scaling: webcam aspect ratio
13.2.4. Proper ranging: field of view
13.2.5. Image stabilization
13.3. Stereo vision
13.3.1. Stereo vision in our example code
13.3.2. Quirks of stereo video from inside the Rift
13.4. The Leap Motion hand sensor
13.4.1. Developing software for the Leap Motion and the Rift
13.4.2. The Leap, the Rift, and their respective coordinate systems
13.4.3. Demo: integrating Leap and Rift
13.5. Summary
Appendix A: Setting up the Rift in a development environment
A.1. Selecting a display mode: Direct HMD Access or Extended Desktop mode
A.2. Configuring the displays in your OS for Extended Desktop mode
A.2.1. Extending or cloning (mirroring): which should you choose?
A.3. Improving your development environment
A.3.1. Fix it
A.3.2. Fix it cheaply
A.3.3. Clone it with a gadget
A.3.4. Remote development
A.4. Configuring the Rift for your use
A.4.1. Create a user profile
A.5. Verifying your setup and troubleshooting
A.6. Developing without a Rift
Appendix B: Mathematics and software patterns for 3D graphics
B.1. Coordinate systems
B.2. Introduction to matrices
B.3. Matrix transforms
B.4. Representing rotation
B.4.1. Euler angles
B.4.2. Quaternions
B.4.3. Spherical linear interpolation ("slerp")
B.5. The scene graph software design pattern
B.6. The matrix stack software design pattern
B.7. The modelview software design pattern
Appendix C: Suggested books and resources
Appendix D: Glossary
index

"A complete and grounded overview."
From the Foreword by Philip Rosedale, Creator of Second Life

"Not just a must-read, it's a must-use! You'll want it constantly at hand when working with the Oculus Rift."
Jose San Leandro, ACM S.L.

"The best way to dive deeper into VR."
George Freeman, Founder, Mazepuncher LLC

"Excellent style, clear and understandable examples."
Dr. Cagatay Catal, Istanbul Kultur University

Publication date (per publisher) 3 September 2015
Place of publication New York
Language English
Dimensions 189 x 234 mm
Weight 758 g
Binding Softcover (paperback)
Subject areas Computer Science > Graphics / Design > Digital Image Processing
Computer Science > Graphics / Design > Film / Video Editing
Mathematics / Computer Science > Computer Science > Programming Languages / Tools
Computer Science > Software Development > Game Programming
Computer Science > Software Development > User Interfaces (HCI)
Computer Science > Web / Internet > Web Design / Usability
Computer Science > Other Topics > Hardware
Keywords Oculus Rift • Virtual Reality • Virtual Reality (VR) • Virtuelle Realität
ISBN-10 1-61729-219-2 / 1617292192
ISBN-13 978-1-61729-219-4 / 9781617292194
Condition New