Abstract
To develop products that better fit the target user population, physical workloads such as reach and visibility are evaluated using digital human simulation in the early stages of product development; however, ergonomic workload assessment has mainly relied on visual observation of reach envelopes and view cones generated in a 3D graphic environment. The present study developed a quantitative assessment method of physical workload in a digital environment and applied it to the evaluation of a Korean utility helicopter (KUH) cockpit design. The proposed assessment method quantified physical workloads for the target user population through a 3-step process and identified design features requiring improvement based on the quantified workload evaluation. The scores of physical workload were quantified in terms of posture, reach, visibility, and clearance, and 5-point scales were defined for the evaluation measures by referring to existing studies. The posture of a digital humanoid for a given task was estimated as the one with the minimal postural workload score among all feasible postures that satisfy task constraints, such as contact between the tip of the index finger and a target point. The proposed assessment method was applied to evaluate the KUH cockpit design in the preliminary design stage and identified design features requiring improvement. The proposed assessment method can be utilized for the ergonomic evaluation of product designs using digital human simulation.
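The posture-estimation step described above can be sketched as a simple constrained search: generate candidate postures, keep only those satisfying the task constraint (e.g., fingertip contact with a target point), and select the one with the minimal postural workload score. The following Python sketch illustrates this idea under stated assumptions; all names, tolerances, and score values are illustrative and not the study's actual implementation.

```python
# Illustrative sketch of minimal-workload posture selection.
# Assumptions (not from the paper): a Posture record with a precomputed
# 5-point workload score, and a contact tolerance defining feasibility.

from dataclasses import dataclass

@dataclass
class Posture:
    joint_angles: tuple          # simplified joint configuration
    fingertip_error_mm: float    # distance from index fingertip to target
    workload_score: int          # 1 (low workload) .. 5 (high workload)

def select_posture(candidates, contact_tolerance_mm=5.0):
    """Return the feasible posture with the minimal workload score,
    or None if no candidate satisfies the contact constraint."""
    feasible = [p for p in candidates
                if p.fingertip_error_mm <= contact_tolerance_mm]
    if not feasible:
        return None  # target point unreachable for this humanoid
    return min(feasible, key=lambda p: p.workload_score)

candidates = [
    Posture((10, 45, 30), fingertip_error_mm=2.0, workload_score=4),
    Posture((15, 40, 25), fingertip_error_mm=1.0, workload_score=2),
    Posture((20, 60, 10), fingertip_error_mm=30.0, workload_score=1),  # no contact
]
best = select_posture(candidates)  # picks the score-2 posture
```

In practice the candidate set would come from an inverse-kinematics sweep over the humanoid's joint ranges, and the score would be computed from posture, reach, visibility, and clearance measures on the 5-point scales the study defines.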