Human Walking in Virtual Environments (eBook)
X, 402 pages
Springer New York (publisher)
978-1-4419-8432-6 (ISBN)
Frank Steinicke is a professor of computer science in media at the Department of Computer Science and the Department of Human-Computer-Media at the University of Würzburg. He received his Ph.D. in computer science from the University of Münster.
Yon Visell is an assistant professor in the Department of Electrical and Computer Engineering at Drexel University in Philadelphia. His research concerns engineering and scientific aspects of haptic and multisensory interaction in virtual and augmented reality environments.
Jennifer Campos is a scientist at Toronto Rehab, where her research focuses on multisensory integration, perception-action coupling, and visuomotor control.
Anatole Lécuyer is a senior researcher at Inria in Rennes, France. His research concerns Virtual Reality, 3D User Interfaces, Haptic Feedback and Brain-Computer Interfaces.
This book presents a survey of past and recent developments on human walking in virtual environments, with an emphasis on human self-motion perception, the multisensory nature of walking experiences, conceptual design approaches, current technologies, and applications. The use of Virtual Reality and movement simulation systems is becoming increasingly popular and more accessible to a wide variety of research fields and applications. While simulation technologies have in the past focused on developing realistic, interactive visual environments, it is becoming increasingly clear that our everyday interactions are highly multisensory. Investigators are therefore beginning to recognize the critical importance of developing and validating locomotor interfaces that allow for realistic, natural behaviours. The book aims to present an overview of what is currently understood about human perception and performance when moving in virtual environments, and to situate it relative to the broader scientific and engineering literature on human locomotion and locomotion interfaces. The contents include scientific background and recent empirical findings related to biomechanics, self-motion perception, and physical interactions. The book also discusses conceptual approaches to multimodal sensing, display systems, and interaction for walking in real and virtual environments. Finally, it presents current and emerging applications in areas such as gait and posture rehabilitation, gaming, sports, and architectural design.
Foreword 5
Contents 9
Part I Perception 11
1 Sensory Contributions to Spatial Knowledge of Real and Virtual Environments 12
1.1 External Sensory Information 14
1.2 Internal Sensory Information 16
1.3 Efferent Sources of Information 17
1.4 Relative Influence of External and Internal Sensory Information 19
1.4.1 Sensory Contributions in the Real World 19
1.4.2 Sensory Contributions in Virtual Environments 21
1.5 Conclusion 30
References 31
2 Perceptual and Cognitive Factors for Self-Motion Simulation in Virtual Environments: How Can Self-Motion Illusions ("Vection") Be Utilized? 36
2.1 Introduction: The Challenge of Walking in VR 36
2.2 Visually Induced Self-Motion Illusions 38
2.2.1 Circular Vection 39
2.2.2 Linear Vection 40
2.3 Self-Motion Sensation from Walking 41
2.4 Interaction of Walking and Other Modalities for Vection 42
2.4.1 Walking and Auditory Cues 42
2.4.2 Walking and Visual Cues 42
2.5 Further Cross-Modal Effects on Self-Motion Perception in VR 45
2.6 Simulator Sickness and Vection in VR 47
2.7 Perceptual Versus Cognitive Contributions to Vection 47
2.7.1 Lower-Level and Bottom-Up Contributions to Vection 47
2.7.2 Cognitive and Top-Down Contributions to Vection 49
2.8 Does Vection Improve Spatial Updating and Perspective Switches? 52
2.9 Conclusions and Conceptual Framework 53
2.10 Outlook 56
References 57
3 Biomechanics of Walking in Real World: Naturalness we Wish to Reach in Virtual Reality 64
3.1 Introduction 64
3.2 Kinematics of Human Walking 65
3.2.1 Global Description 66
3.2.2 Joint Kinematics 71
3.3 Dynamics of Human Walking 73
3.3.1 Forces and Torques Description 74
3.3.2 Energetics of Human Walking 76
3.3.3 Balance 78
3.4 Comparison Between Ground and Treadmill Walking 80
3.5 Conclusion 81
References 82
4 Affordance Perception and the Visual Control of Locomotion 87
4.1 Introduction 87
4.2 Taking Body Dimensions and Movement Capabilities into Account 88
4.2.1 Theoretical Approach 88
4.2.2 Affordance Perception and the Control of Locomotion 89
4.2.3 Eyeheight-Scaled Information 89
4.3 Perceiving Body-Scaled Affordances 91
4.4 Perceiving Action-Scaled Affordances 94
4.4.1 The Information-Based Approach 94
4.5 Testing the Information-Based Approach 95
4.5.1 An Alternative Account 97
4.6 Testing the Affordance-Based Approach 99
4.7 Extensions of the Affordance-Based Approach 101
4.8 Affordance Perception and the Continuous Control of Locomotion 102
4.9 Conclusions 103
References 105
5 The Effect of Translational and Rotational Body-Based Information on Navigation 107
5.1 Introduction 107
5.2 Applications of Virtual Environments 108
5.3 Ecological Validity 109
5.4 The Effect of Body-Based Information 110
5.4.1 Review Framework 111
5.4.2 Studies Investigating the Effect of Body-Based Information 113
5.5 Summary and Conclusions for VE Applications 116
5.5.1 Model-Scale Environments 116
5.5.2 Small-Scale Environments 117
5.5.3 Large-Scale Environments 118
5.5.4 Further Research 118
References 119
6 Enabling Unconstrained Omnidirectional Walking Through Virtual Environments: An Overview of the CyberWalk Project 121
6.1 Introduction 122
6.2 Gait and Biomechanics 124
6.2.1 Natural Unconstrained Walking 124
6.2.2 Overground Versus Treadmill Walking 128
6.2.3 Potential Implications for CyberWalk 132
6.3 Multisensory Self-Motion Perception 132
6.3.1 Multisensory Nature of Walking 133
6.3.2 Integration of Vestibular and Proprioceptive Information in Human Locomotion 135
6.3.3 "Vection" from Walking 139
6.3.4 Potential Implications for CyberWalk 140
6.4 Large Scale Navigation 141
6.4.1 Potential Implications for CyberWalk 144
6.5 Putting it All Together: The CyberWalk Platform 144
References 147
Part II Technologies 153
7 Displays and Interaction for Virtual Travel 154
7.1 Introduction 154
7.2 Display Systems 156
7.3 Interaction Devices 161
7.4 Travel Techniques 173
7.4.1 Travel as a Control Task 173
7.4.2 Direct Self Motion Control Techniques 177
7.4.3 Indirect Self Motion Control Techniques 178
7.4.4 Scene Motion Techniques 178
7.4.5 Other Control Inputs 179
7.5 Conclusion 179
References 180
8 Sensing Human Walking: Algorithms and Techniques for Extracting and Modeling Locomotion 183
8.1 Introduction 183
8.2 Sensing and Interpreting Global Gait Parameters 184
8.2.1 Step Length and Frequency 184
8.2.2 Curvature and Non-linear Walking 185
8.2.3 Gait Asymmetry and Regularity 189
8.3 Joint Angles, Torques and Muscle Activity 189
8.3.1 Measuring Joint Displacements 189
8.3.2 Measuring Joint Angles 192
8.3.3 Estimating Joint Torques with Inverse Dynamics 194
8.4 Isolated Segments 194
8.5 Global System and Controllers 195
8.6 Conclusion About Inverse Dynamic Approaches 196
8.6.1 Measuring or Estimating Muscle Activities 197
8.7 Conclusion 201
References 201
9 Locomotion Interfaces 204
9.1 Introduction 204
9.2 Sliding Shoes 206
9.2.1 Virtual Perambulator 206
9.2.2 Powered Shoes 207
9.2.3 String Walker 208
9.2.4 Evacuation Simulator Using the Virtual Perambulator 209
9.3 Treadmills 210
9.3.1 Related Works in Treadmill-Based Locomotion Interface 210
9.3.2 Torus Treadmill 211
9.3.3 Control Algorithm of the Torus Treadmill 213
9.3.4 Effects of Walking on the Torus Treadmill 214
9.3.5 Limitation of Torus Treadmill 214
9.4 Foot Pad 215
9.4.1 Related Works in Foot-Pad-Based Locomotion Interface 215
9.4.2 GaitMaster 215
9.4.3 Control Algorithm of the GaitMaster 218
9.4.4 GaitMaster for Walking Rehabilitation 219
9.5 Robotic Tiles 220
9.5.1 The CirculaFloor 220
9.5.2 User Study of the Robot Tile Approach 220
9.6 Conclusion 223
References 223
10 Implementing Walking in Virtual Environments 225
10.1 Introduction 225
10.2 Virtual Reality Workspaces 227
10.3 Isometric Virtual Walking 229
10.3.1 One-to-One Mappings 229
10.3.2 Reference Coordinates 230
10.3.3 Virtual Traveling 231
10.4 Nonisometric Virtual Walking 231
10.4.1 User-Centric Coordinates 232
10.4.2 Scaling Self-Motions 234
10.4.3 Redirected Walking 237
10.5 Conclusion 241
References 242
11 Stepping-Driven Locomotion Interfaces 245
11.1 Designing Stepping-Driven Locomotion for Virtual Environment Systems 245
11.2 Walking-in-Place Interfaces 248
11.2.1 Setting Speed: Interpreting Stepping Gestures 248
11.2.2 Setting Direction for Walking-in-Place 253
11.2.3 The Future for Walking-in-Place Interfaces 255
11.3 Real-Walking Interfaces 256
11.3.1 Manipulating Speed 256
11.3.2 Manipulating Direction 258
11.3.3 Reorientation Techniques 262
11.3.4 The Future for Real-Walking Interfaces for IVE Systems 264
References 264
12 Multimodal Rendering of Walking Over Virtual Grounds 267
12.1 Introduction 268
12.2 Auditory Rendering 269
12.2.1 Introduction 269
12.2.2 Footstep Sound Synthesis 271
12.2.3 Walking Sounds and Soundscape Reproduction 276
12.2.4 Footstep Sound Design Toolkits 278
12.3 From Haptic to Multimodal Rendering 279
12.3.1 Introduction 279
12.3.2 Touch Sensation in the Feet 282
12.3.3 Multimodal Displays 285
12.3.4 Display Configurations 286
12.3.5 Interactive Scenarios 290
12.4 Conclusion 294
References 294
Part III Applications and Interactive Techniques 300
13 Displacements in Virtual Reality for Sports Performance Analysis 301
13.1 Introduction 301
13.1.1 Why Virtual Reality for Sports? 302
13.1.2 Requirements for Using Virtual Reality for Sports 306
13.1.3 Some Applications of Virtual Reality for Sports 307
13.2 Case Study 1: Deceptive Movements in Rugby 308
13.2.1 Setup 308
13.2.2 Method 309
13.2.3 Results 310
13.2.4 Discussion 310
13.3 Case Study 2: Wall Configuration for Soccer Free Kicks 312
13.3.1 Setup 313
13.3.2 Methods 314
13.3.3 Results 315
13.3.4 Discussion 316
13.4 Conclusion 316
References 317
14 Redirected Walking in Mixed Reality Training Applications 321
14.1 Locomotion in Virtual Environments 322
14.2 Redirected Walking 323
14.3 Practical Considerations for Training Environments 324
14.3.1 Impact of Redirection on Spatial Orientation 324
14.3.2 Augmenting Effectiveness of Redirected Walking 325
14.3.3 Designing Experiences for Redirected Walking 327
14.4 Redirection in Mixed Reality Environments 328
14.5 Challenges and Future Directions 330
References 331
15 VR-Based Assessment and Rehabilitation of Functional Mobility 334
15.1 VR-Based Assessment and Rehabilitation to Promote Functional Mobility 337
15.1.1 VR-Based Assessment and Rehabilitation Following Motor Dysfunction 337
15.1.2 VR-Based Assessment and Rehabilitation Following Visual Dysfunction 340
15.2 Dynamical Disease and VR-Based Assessment 342
15.2.1 Dynamic Measures for Assessing Local Functional Mobility Using VR 343
15.2.2 Dynamic Measures for Assessing Global Functional Mobility Using VR 345
15.3 Conclusion 347
References 348
16 Full Body Locomotion with Video Game Motion Controllers 352
16.1 Introduction 352
16.2 Video Game Motion Controllers 353
16.2.1 Wiimote 354
16.2.2 Playstation Move 358
16.2.3 Microsoft Kinect 360
16.3 Dealing with the Data 363
16.3.1 Understanding the Data Coming from the Device 364
16.3.2 Research the Algorithm Options Suited for the Data 365
16.3.3 Modifying the Models to Address Error and Uncertainty 369
16.3.4 Applying All the Data Toward a Solution 370
16.4 Creating an Interface 371
16.4.1 Challenges 371
16.4.2 Controlling Travel 372
16.4.3 Understand Your Design Tradeoffs and Users 373
16.4.4 Find How People Want to Interact 374
16.4.5 Compensate For Technology Limitations 374
16.5 Conclusion 376
References 376
17 Interacting with Augmented Floor Surfaces 378
17.1 Introduction 378
17.2 Background 379
17.2.1 Input from the Foot in Human-Computer Interaction 381
17.2.2 Relevance to Virtual Reality 382
17.3 Techniques and Technologies 383
17.3.1 Indirect Optical Sensing 383
17.3.2 Contact Sensing 384
17.3.3 Usability 385
17.4 Case Study: A Distributed, Multimodal Floor Interface 388
17.4.1 Contact Localization 388
17.4.2 Virtual Walking on Natural Materials 391
17.4.3 Floor Touch-Surface Interaction Techniques 391
17.4.4 Usability of Foot-Floor Touch-Surface Interfaces 392
17.4.5 Application: Geospatial Data Navigation 394
17.4.6 Foot-Based Gestures for Geospatial Navigation 394
17.5 Conclusions 398
References 398
Index 401
| Publication date (per publisher) | 15.5.2013 |
|---|---|
| Additional information | X, 402 p. |
| Place of publication | New York |
| Language | English |
| Subject area | Mathematics / Computer Science ► Computer Science |
| | Technology ► Civil Engineering |
| | Technology ► Electrical Engineering / Energy Technology |
| Keywords | augmented reality • haptic perception • human-computer interaction • human locomotion • multimodal interface • multisensory integration • self-motion perception • tactile sensing • virtual reality |
| ISBN-10 | 1-4419-8432-1 / 1441984321 |
| ISBN-13 | 978-1-4419-8432-6 / 9781441984326 |
Size: 11.5 MB
DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized for you. If the eBook is improperly passed on to third parties, it can be traced back to its source.
File format: PDF (Portable Document Format)
With its fixed page layout, the PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost any device, but is only of limited use on small screens (smartphone, eReader).
System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/tablet: Whether Apple or Android, you can read this eBook. You need a PDF viewer, e.g. the free Adobe Digital Editions app.
Additional feature: online reading
In addition to downloading it, you can also read this eBook online in your web browser.