Abstract

The qualification of components in terms of durability and reliability is based on the analysis of large volumes of sensor, CAN, and IIoT data, which requires a data management infrastructure to understand customer usage and its variability. Such a big-data infrastructure is often called a "data lake" and may store huge amounts of data. This infrastructure must be generic yet test-data-oriented, so that the data structure and the analyses it must support are understood and the system can be optimized for this application. The data may come from connected equipment, instrumented fleets, test lab or proving ground measurements, digital twins, and multi-body dynamics simulations. Further, the data must be managed in terms of quality and traceability, and indexed so that it can be retrieved through searches (customer, vehicle, measurement site, engine specification, road condition, usage conditions).

Once this step is achieved, the product development team gains a better understanding of customer usage and input variability in different environments and conditions, which must be taken into account in the product's mission profile or duty cycle. This ad-hoc mission profile enables the creation of realistic, meaningful design and test specifications.

Understanding customer usage and input variability enables a probabilistic approach to fatigue life prediction. The uncertainties on the inputs (geometry, material, and loading) may be propagated through the life-prediction process, knowing each input's probability distribution function, using a Monte Carlo analysis. The infrastructure enables the life analysis to be executed as multiple runs on cloud-oriented servers, which allows the whole process to be automated and streamlined. A use case will be presented to illustrate the approach and its benefits.
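The Monte Carlo propagation described above can be sketched in a few lines. The sketch below is illustrative only: it assumes a simple Basquin-type S-N model, N = C · S⁻ᵐ, and all distribution parameters are hypothetical placeholders, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo samples

# Hypothetical input distributions (illustrative values only):
# loading: stress amplitude [MPa], e.g. derived from measured customer usage
stress = rng.normal(180.0, 20.0, n)
# geometry: stress concentration factor
kt = rng.normal(1.5, 0.05, n)
# material: Basquin coefficient C (scatter on the S-N curve N = C * S**(-m))
C = rng.lognormal(mean=np.log(1e18), sigma=0.3, size=n)
m = 5.0  # Basquin exponent, assumed deterministic here

# Propagate each sampled input combination through the life model
local_stress = kt * stress
life = C * local_stress ** (-m)  # fatigue life in cycles

# Summarize the resulting life distribution, e.g. the B10 life
# (life at which 10% of the population is predicted to have failed)
b10 = np.percentile(life, 10)
median = np.percentile(life, 50)
print(f"B10 life: {b10:.3e} cycles, median life: {median:.3e} cycles")
```

In practice each sample would invoke a full life-prediction run, which is why the abstract's cloud infrastructure matters: the independent samples can be dispatched as parallel jobs.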
