Abstract

Scientific evidence is easily misunderstood. One of the most insidious misunderstandings arises when scientific experts and those receiving their evidence assign different meanings to the same words. We expect scientific evidence to be difficult to understand. What is unexpected, and often far harder to detect, is the incorrect understanding of terms and phrases that appear familiar. In these circumstances, misunderstandings easily escape notice. We applied an evidence-based approach to investigating this phenomenon, asking two groups, one with legal education and one with scientific education, to define five commonly used phrases that carry both lay and scientific connotations. We hypothesised that the groups would diverge significantly in the definitions they provided. Employing a machine learning algorithm and the ratings of trained coders, we found that lawyers and scientists indeed disagreed over the meanings of certain terms. Notably, we trained a machine learning algorithm to reliably classify the authorship of the definitions as scientific or legal, demonstrating that these groups rely on predictably different lexicons. Our findings support avoiding some of these words and phrases in favour of terminology that promotes common understanding. Methodologically, we suggest a new way for governmental and quasi-governmental bodies to study, and thereby prevent, misunderstandings between the legal and scientific communities.
