Abstract
A car that will not start on a cold winter day and one that will not start on a hot summer day usually indicate two very different situations. When pressed to explain the difference, we would give a winter account ('Oil is more viscous in cold conditions, and that causes . . .') and a summer story ('Vapor lock is a possibility in hot weather and is usually caused by . . .'). How do we build such explanations? One possibility is that understanding how the car works as a device gives us a basis for generating the explanations. But that raises another question: how do people understand devices? Model-based reasoning is a subfield of artificial intelligence focusing on device understanding issues. In any model-based-reasoning approach, the goal is to 'model' a device in the world as a computer program. Unfortunately, 'model' is a loaded term: different listeners understand the word to mean very different concepts. By extrapolation, 'model-based reasoning' can suggest several different approaches, depending on the embedded meaning of 'model.'