Abstract

Dental education in the United States has changed considerably over the last 30 years, and it could be argued that not all of the changes have been positive. The number of dental schools and the number of graduates have both increased, yet the level of dental care in the country as a whole has not improved. The majority of US dental schools are now private and profit-making, and even the state schools need to generate income. The curriculum has also changed, at the expense of the basic sciences.
