This was the question addressed by the landmark 1910 Flexner Report from the Carnegie Foundation for the Advancement of Teaching. In the early 1900s, residency training did not yet exist, and students entered clinical practice immediately after graduating from medical school. The quality of medical training varied widely, with alarming deficiencies at many medical schools. An independent, nonprofessional organization was commissioned to report on the situation, using public pressure to drive reform of medical school education.
I am developing a new microsimulation module to give EM clerkship students more realistic exposure to high-acuity patients. Emergent conditions, such as ectopic pregnancy, acute tricyclic overdose, and ST-elevation MI, are usually managed by senior residents and attendings. Students are rarely primarily involved in these cases.
Imagine this. You are about to conduct an innovative educational project and want to get a research publication out of it. What are considered strong methodological qualities of an educational research study? What can you do to improve your chances for publication?
What is the MERSQI instrument?
The authors of this study developed and used an instrument to measure the methodological quality of quantitative studies in medical education. This instrument, the Medical Education Research Study Quality Instrument (MERSQI), was used to show that scores were predictive of manuscript acceptance into the 2008 Journal of General Internal Medicine (JGIM) special issue on medical education.
The 10-item MERSQI instrument can be divided into 6 domains, each with a maximum score of 3 points, for a maximum total score of 18. Numbers in parentheses denote the number of items assessed in that domain.
- Study design
- Sampling (2)
- Type of data
- Validity of evaluation instrument (3)
- Data analysis (2)
- Outcomes
The specific scoring criteria are shown in the table below, from an earlier publication on the MERSQI instrument (Reed DA, et al. Association between funding and quality of published medical education research. JAMA. 2007;298(9):1002-1009). That earlier study showed that higher MERSQI scores correlated with successful funding of the study.
- A randomized controlled trial conducted at more than 2 institutions
- A response rate of >75%
- Data assessment was done using an objective measure (not self-assessment by study participant)
- When using an evaluation instrument, internal structure validity, content validity, and criterion validity are reported.
- The data are analyzed appropriately, going beyond purely descriptive analysis.
- The outcome measure goes beyond simple survey measures and instead focuses on patient/health care outcomes.
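To make the arithmetic of the rubric concrete, here is a minimal Python sketch that tallies a total MERSQI score from the six domain scores. The domain names and example values are my own illustrative assumptions, not the official item-level rubric from Reed et al; the only facts taken from the instrument are that there are 6 domains, each capped at 3 points, for a maximum total of 18.

```python
# Minimal sketch: tally a MERSQI total from six domain scores.
# Domain names and example values are hypothetical, for illustration only.

MAX_PER_DOMAIN = 3.0  # each MERSQI domain contributes at most 3 points

def mersqi_total(domain_scores):
    """Sum six domain scores (each 0-3) into a total out of 18."""
    if len(domain_scores) != 6:
        raise ValueError("MERSQI has 6 domains")
    for name, score in domain_scores.items():
        if not 0 <= score <= MAX_PER_DOMAIN:
            raise ValueError(f"{name} must be between 0 and {MAX_PER_DOMAIN}")
    return sum(domain_scores.values())

# Hypothetical single-institution, cross-sectional survey study:
example = {
    "study_design": 1.0,   # single-group cross-sectional
    "sampling": 1.5,       # single institution
    "type_of_data": 1.0,   # self-reported assessment
    "validity": 1.0,       # limited validity evidence reported
    "data_analysis": 2.0,
    "outcomes": 1.5,
}
print(mersqi_total(example))  # 8.0
```

A study like this hypothetical one would land near the mean score reported for JGIM submissions, illustrating how quickly points are lost with a single-site, survey-based design.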
Achieving a perfect score of 18 is extremely difficult, as this article shows. Among the 100 manuscripts submitted to JGIM, the mean MERSQI score was 9.6 (range 5-15.5). Most manuscripts were single-group cross-sectional studies (54%) conducted at a single institution (78%). Only a minority (36%) reported validity evidence for their evaluation instruments.
The mean total MERSQI score of accepted manuscripts was significantly higher than that of rejected manuscripts (p=0.003).
- Mean score for ACCEPTED manuscript = 10.7
- Mean score for REJECTED manuscript = 9.0
I have found the MERSQI scoring instrument extremely helpful in designing my own educational research studies. Methodologic rigor is almost always the Achilles' heel of rejected educational research submissions.
Reed DA, Beckman TJ, Wright SM, et al. Predictive validity evidence for medical education research study quality instrument scores: quality of submissions to JGIM’s Medical Education Special Issue. J Gen Intern Med. 2008 Jul;23(7):903-7.
Bedside teaching is a unique educational skill that academic faculty are often assumed to simply know how to do. In the ED, it is especially difficult to do well: crowding and unexpected, time-sensitive clinical issues create distractions and general chaos. In my experience, these unpredictable clinical issues detract from bedside teaching, so faculty should be flexible and well-versed in the basic tenets of bedside teaching.
One shared experience among all emergency physicians is the "handoff" or "signout" of patients at the end of a shift to the oncoming physician. A recent article in Annals of Emergency Medicine explores how this process can lead to delays and errors in patient management. Just envision ED handoffs as a high-stakes version of the game of Telephone you played as a child.
U.S. medical students traditionally spend the first 3 years of training in a pre-determined curriculum. In their 4th year, however, students have significant flexibility in how they tailor their time. For this last year before residency, they shift from a learner-centered curriculum to a patient-centered curriculum. There is a shift in mentality from “I am here to learn as much as I can about medicine” to more of a “How do I best prepare myself for working in a hospital in my chosen specialty?”
The EM-RAP Educator's Edition podcast just released its 6th episode. Dr. Rob Rogers et al discuss practical tips and approaches to giving feedback on medical student presentations. Presentations in the ED are very different from those in other specialties, such as internal medicine and surgery. The discussants dissect and comment on each part of the presentation.