This was the question addressed by the landmark 1910 Flexner Report from the Carnegie Foundation for the Advancement of Teaching. In the early 1900s, residency training did not yet exist, and students entered clinical practice immediately after graduating from medical school. The quality of medical training varied widely, with alarming deficiencies at many medical schools. An independent, nonprofessional organization was commissioned to report on the situation, with the goal of using public pressure to reform medical school education.
In an upcoming issue of the Academic Emergency Medicine journal, there is a glowing review of a collaborative project that I was involved in. If you are a medical student about to do an EM rotation, or you serve as a faculty advisor for EM medical students, feel free to distribute this EM Clerkship Primer (FREE book!) for them to read. [Update 11/21/13: New link for free download PDF] This was the first official project to come out of the Clerkship Directors in Emergency Medicine (CDEM). It was written by 22 established medical educators in EM, led by our fearless leader/editor-in-chief, Dave Wald. Go, Dave!
Thanks to Dr. Rob Rogers’ podcast in the EM-RAP Educator’s Edition series, I learned of one of the funniest publications EVER in a medical journal. It was published on April 1, 2009 in JAMA. The article focuses on teaching medical students an essential skill: how to survive “pimping”.
Pimping traditionally occurs when an attending physician poses a difficult question to a learner in a public forum, such as board rounds or in the operating room. As a student or resident, you know that this will happen during your training, and you should be prepared. If you think of pimping as a form of battle, you will need a good defense, and you should mix it up to be successful.
I am developing a new microsimulation module to help EM clerkship students gain a more realistic exposure to high-acuity patients. Emergent conditions, such as ectopic pregnancy, acute tricyclic overdose, and ST elevation MI, are usually cared for by senior residents and attendings. Rarely are students primarily involved in these cases.
A man recently presented with knee pain after pivoting and torquing his knee during a fall. He complained of concurrent mild ankle pain. He presented with this tibia-fibula x-ray. Realizing that a proximal fibular fracture can occur concurrently with a medial malleolus fracture or deltoid ligament rupture, we obtained x-rays of the ankle. We were looking for a Maisonneuve fracture.
Do you see an ankle injury in these four images?
A moderately intoxicated patient presents with a facial or scalp laceration. S/he adamantly refuses to have it repaired in the ED, not believing that there is indeed a laceration. You want to show the patient, using a mirror, but you don’t have one.
Imagine this. You are about to conduct an innovative educational project and want to get a research publication out of it. What are considered strong methodological qualities of an educational research study? What can you do to improve your chances for publication?
What is the MERSQI instrument?
The authors of this study developed and used an instrument to measure the methodological quality of quantitative studies in medical education. This instrument, the Medical Education Research Study Quality Instrument (MERSQI), was shown to produce scores predictive of manuscript acceptance into the 2008 Journal of General Internal Medicine (JGIM) special issue on medical education.
The 10-item MERSQI instrument is divided into 6 domains, each with a maximum score of 3 points. Numbers in parentheses denote the number of items assessed in that domain. The maximum total score is 18.
- Study design
- Sampling (2)
- Type of data
- Validity of evaluation instrument (3)
- Data analysis (2)
- Outcomes
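The total-score arithmetic can be sketched in a few lines of Python. This is a hypothetical illustration only: the domain names and example point values below are my own shorthand, and the official instrument (Reed et al.) defines the exact item-level scoring.

```python
# Illustrative sketch of a MERSQI total-score calculation.
# Domain names and point values here are hypothetical examples,
# not the official item-level scoring from the instrument.

# Example scores for a hypothetical single-site, self-assessment survey study.
# Each of the 6 domains is capped at 3 points, so the maximum total is 18.
domain_scores = {
    "study_design": 1.0,        # e.g., single-group cross-sectional
    "sampling": 1.0 + 1.5,      # two items: institutions + response rate
    "type_of_data": 1.0,        # subjective (self-assessed) data
    "validity": 1.0,            # only partial validity evidence reported
    "data_analysis": 1.0 + 1.0, # two items: appropriateness + sophistication
    "outcomes": 1.5,            # falls short of patient/health care outcomes
}

MAX_PER_DOMAIN = 3.0

def mersqi_total(scores):
    """Sum domain scores, capping each domain at its 3-point maximum."""
    return sum(min(s, MAX_PER_DOMAIN) for s in scores.values())

print(f"MERSQI total: {mersqi_total(domain_scores)} / 18")  # → 9.0 / 18
```

Note how the example study lands near the mean score of 9.6 reported for the JGIM submissions, which is typical for single-institution survey-based work.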
The specific scoring criteria are shown in the table below, from an earlier publication on the MERSQI instrument (Reed DA, et al. Association between funding and quality of published medical education research. JAMA. 2007;298(9):1002-1009). That earlier publication showed that a higher MERSQI score correlated with successful funding of the study.
To earn the maximum score in each domain, a study would need:
- A randomized controlled trial design, conducted at more than 2 institutions
- A response rate of >75%
- Data collected using an objective measure (not self-assessment by study participants)
- When an evaluation instrument is used, internal structure validity, content validity, and criterion validity all reported
- Data analyzed appropriately, going beyond merely descriptive analysis
- An outcome measure that goes beyond simple survey measures and instead focuses on patient/health care outcomes
Getting a perfect score of 18 is extremely difficult, as evidenced in this article. Among the 100 manuscripts submitted to JGIM, the mean MERSQI score was 9.6 (range 5-15.5). Most manuscripts were single-group cross-sectional studies (54%) conducted at a single institution (78%). Only 36% reported validity evidence for their evaluation instruments.
The mean total MERSQI score of accepted manuscripts was significantly higher than rejected manuscripts (p=0.003).
- Mean score for ACCEPTED manuscript = 10.7
- Mean score for REJECTED manuscript = 9.0
I have found the MERSQI instrument extremely helpful in designing my own educational research studies. Methodological rigor is almost always the Achilles heel of rejected educational research submissions.
Reed DA, Beckman TJ, Wright SM, et al. Predictive validity evidence for medical education research study quality instrument scores: quality of submissions to JGIM’s Medical Education Special Issue. J Gen Intern Med. 2008 Jul;23(7):903-7.