Abstract

Background: Motion sickness is common across most forms of transport, and most of the population experience its varied symptoms at some stage in their lives. To date, there has been no specific method for quantifying the predicted level of motion sickness for a given vehicle design, task and route.

Objective: To develop a virtual motion sickness prediction tool whose inputs include human motion, vision, vehicle motion, occupant task and vehicle design.

Method: A time-domain analysis using a multi-body systems approach was developed to provide the raw data for post-processing of vehicle motion, occupant motion and vision, based on a virtual route designed to provoke motion sickness while the digital occupant undertakes a specific non-driving-related task.

Results: Predicted motion sickness levels are presented for a simple positional sweep of a vehicle cabin under a prescribed motion and task. Two further examples are included in this study: first, the model predicts the difference observed between sitting forwards and backwards in an autonomous vehicle; second, analysis of a respected, independent study of auxiliary display height shows that the model predicts both the relative and absolute levels for the two display heights, congruent with the original physical experiment.

Conclusion: The tool has been shown to predict motion sickness in autonomous vehicles successfully and is therefore of considerable value in guiding future mobility solutions, enabling vehicle dynamics and control to be tuned alongside vision and design attributes.
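The abstract does not specify how the post-processed motion data are converted into a sickness estimate, so as a rough, generic illustration only, the sketch below computes an ISO 2631-1 style motion sickness dose value (MSDV) from a simulated vertical acceleration trace. The band-pass filter is a simplified stand-in for the standard W_f frequency weighting, the function name `msdv_vertical` and the signal values are hypothetical, and this is not the vision- and task-aware model described in the paper.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def msdv_vertical(accel_z, fs):
    """Motion sickness dose value (ISO 2631-1 style) from a vertical
    acceleration trace sampled at fs Hz.

    A 4th-order Butterworth band-pass (0.08-0.5 Hz) is used here as a
    simplified stand-in for the standard W_f frequency weighting.
    """
    sos = butter(4, [0.08, 0.5], btype="bandpass", fs=fs, output="sos")
    a_w = sosfiltfilt(sos, accel_z)            # frequency-weighted acceleration
    t = np.arange(len(accel_z)) / fs
    return np.sqrt(np.trapz(a_w ** 2, t))      # MSDV in m/s^1.5

if __name__ == "__main__":
    fs = 100.0                                 # sample rate, Hz
    t = np.arange(0, 1800, 1 / fs)             # 30-minute simulated route
    accel_z = 0.3 * np.sin(2 * np.pi * 0.2 * t)  # low-frequency heave, m/s^2
    msdv = msdv_vertical(accel_z, fs)
    # ISO 2631-1 suggests roughly MSDV/3 as the percentage of occupants who may vomit
    print(f"MSDV = {msdv:.2f} m/s^1.5, illness rating ~ {msdv / 3:.1f} %")
```

In a full tool such as the one described, this kind of dose metric would be only one ingredient: the multi-body simulation also supplies occupant head motion and visual cues, which modulate the predicted sickness level for a given seating position, task and display placement.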
