Majority of U.S. Doctors No Longer White, Male

[Image: doctor patient handshake illustration]

A new study finds that the U.S. medical field is less dominated by white men than it once was, but Black and Hispanic people remain underrepresented among doctors, dentists, and pharmacists.



from WebMD Health https://ift.tt/3irXjFP
