Article review: How do you assess the quality of educational research articles?

Imagine this. You are about to conduct an innovative educational project and want to get a research publication out of it. What are considered strong methodological qualities of an educational research study? What can you do to improve your chances for publication?

The authors of this study developed and used an instrument to measure the methodological quality of quantitative studies in medical education. This instrument, the Medical Education Research Study Quality Instrument (MERSQI), was used to show that scores were predictive of manuscript acceptance into the 2008 Journal of General Internal Medicine (JGIM) special issue on medical education.

What is the MERSQI instrument?
The 10-item MERSQI instrument covers 6 domains, each with a maximum score of 3 points, for a maximum total score of 18. Numbers within parentheses denote the number of items assessed in that domain.

  • Study design
  • Sampling (2)
  • Type of data
  • Validity of evaluation instrument (3)
  • Data analysis (2)
  • Outcomes

The specific scoring criteria are shown in the table below, from an earlier publication on the MERSQI instrument (Reed DA, et al. Association between funding and quality of published medical education research. JAMA. 2007;298(9):1002-1009). That earlier publication showed that higher MERSQI scores correlated with successful funding of the study.

MERSQI scoring criteria (3 points per domain)


So what garners a perfect MERSQI score of 18?

  • A randomized controlled trial conducted at more than 2 institutions
  • A response rate of >75%
  • Data assessed with an objective measure (not self-assessment by study participants)
  • When an evaluation instrument is used, internal structure validity, content validity, and criterion validity are all reported.
  • The data are analyzed appropriately, going beyond descriptive analysis alone.
  • The outcome measure goes beyond simple survey measures and focuses instead on patient/health care outcomes.
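The domain-based tally described above can be sketched in code. The snippet below is a minimal illustration only, assuming hypothetical domain scores; the actual instrument scores each domain from detailed per-item criteria not reproduced here.

```python
# Minimal sketch of totaling a MERSQI score from its six domain scores.
# Domain names follow the article; the example scores are hypothetical.
MAX_PER_DOMAIN = 3  # each domain contributes at most 3 points

def mersqi_total(domain_scores):
    """Sum the six domain scores (each 0-3), for a maximum total of 18."""
    if any(not 0 <= s <= MAX_PER_DOMAIN for s in domain_scores.values()):
        raise ValueError("each domain score must be between 0 and 3")
    return sum(domain_scores.values())

scores = {
    "study_design": 2,
    "sampling": 1.5,
    "type_of_data": 3,
    "validity_of_evaluation_instrument": 1,
    "data_analysis": 2,
    "outcomes": 1.5,
}
print(mersqi_total(scores))  # 11.0 for this hypothetical study
```

A perfect score of 18 requires the maximum of 3 in every domain, which is why so few submissions come close.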

Achieving a perfect score of 18 is extremely difficult, as this article shows. Among the 100 manuscripts submitted to JGIM, the mean MERSQI score was 9.6 (range 5-15.5). Most manuscripts were single-group cross-sectional studies (54%) conducted at a single institution (78%), and only a minority (36%) reported validity evidence for their evaluation instruments.

The mean total MERSQI score of accepted manuscripts was significantly higher than that of rejected manuscripts (p=0.003).

  • Mean score for ACCEPTED manuscript = 10.7
  • Mean score for REJECTED manuscript = 9.0

I have found the MERSQI instrument extremely helpful in designing my own educational research studies. Methodologic rigor is almost always the Achilles heel of rejected educational research submissions.

Reference

Reed DA, Beckman TJ, Wright SM, et al. Predictive validity evidence for medical education research study quality instrument scores: quality of submissions to JGIM’s Medical Education Special Issue. J Gen Intern Med. 2008 Jul;23(7):903-7.


Article review: Bedside teaching in the ED

Bedside teaching is a unique educational skill that academic faculty are often assumed to simply know how to do. In the ED it is especially difficult to do well, because crowding and unexpected, time-sensitive clinical issues create distractions and general chaos. In my experience, these unpredictable clinical issues negatively impact bedside teaching, so faculty should be flexible and well grounded in the basic tenets of bedside teaching.



Article review: Handoffs in the Emergency Department

One shared experience among all emergency physicians is the "handoff" or "signout" of patients at the end of a shift to the oncoming physician. A recent article in Annals of Emergency Medicine explores how this process can lead to delays and errors in patient management. Just envision ED handoffs as a high-stakes version of the game of Telephone you played as a child.



Article review: Optimal training during fourth year of medical school

U.S. medical students traditionally spend their first 3 years of training in a predetermined curriculum. In the 4th year, however, students have significant flexibility in how they tailor their time. In this last year before residency, they shift from a learner-centered curriculum to a patient-centered one, and their mentality shifts from "I am here to learn as much as I can about medicine" to "How do I best prepare myself for working in a hospital in my chosen specialty?"



Hot off the press: Improving medical student presentations in the ED

Website: www.emrapee.com

The EM-RAP Educator's Edition podcast just released its 6th episode. Dr. Rob Rogers et al discuss practical tips and approaches to giving feedback on medical student presentations. Presentations in the ED are very different from those in other specialties, such as internal medicine and surgery. The discussants dissect and comment on each part of the presentation.



What is a journal "impact factor"?

Journals use the numerical "impact factor" as an indirect quantitative measure of a journal's importance in the medical and scientific literature. Thomson Scientific calculates impact factor scores annually. The score provides journals with bragging rights, especially when it comes to marketing. Be aware, however, that there are ways to manipulate the numbers a little, which calls the true value of this score into question.

How is the impact factor calculated?

The impact factor measures how frequently a journal's articles are cited over a 2-year period. As an example, the 2009 impact factor for a journal would be:

Impact Factor = A / B

  • A = Number of times the journal's 2007-2008 articles were cited during 2009
  • B = Total number of "citable items" published in the journal during 2007-2008

The ambiguous issue is how the denominator of "citable items" is determined. Articles that qualify as potentially citable items include original research, reviews, proceedings, and notes; editorials, correspondence, and errata do not. Sometimes it is unclear which articles qualify. The more articles a journal excludes, the smaller its denominator, and thus the higher (and better) its impact factor.
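The arithmetic above is simple enough to sketch directly. The numbers below are made up purely for illustration; no real journal's counts are implied.

```python
# Illustration of the impact factor arithmetic described above.
def impact_factor(citations, citable_items):
    """e.g. 2009 IF = citations in 2009 to 2007-08 articles,
    divided by citable items published in 2007-08."""
    return citations / citable_items

# Hypothetical journal: 1,502 citations in 2009 to its 2007-08 articles,
# with 400 citable items published across those two years.
print(round(impact_factor(1502, 400), 3))  # 3.755

# Shrinking the denominator (excluding more items) inflates the score:
print(round(impact_factor(1502, 350), 3))  # 4.291
```

The second call shows why the denominator matters: excluding 50 items raises the score by more than half a point with no change in citations.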

Below are impact factors of several journals, relevant to those interested in publishing in EM and medical education. In addition to impact factors, you should also consider the journal’s general focus when deciding where to submit your manuscript. If you read through several back-issues, you will get a sense of each journal’s “flavor”:

Emergency Medicine journals

  • Annals of Emergency Medicine 3.755
  • Academic Emergency Medicine 2.46
  • Emergency Medicine Journal 1.347
  • American Journal of EM 1.188
  • Journal of Emergency Medicine 0.778


Education journals

  • Academic Medicine 2.57
  • Medical Education 2.181
  • Teaching and Learning in Medicine 0.83

Educator’s portfolio

Are you a medical educator who can't quite illustrate the importance and impact of your work in your CV?

I’ve always had this problem when compiling and updating my CV. The traditional CV format caters to academic physicians who are active in public service, traditional research, and leadership positions. What about the great procedural course you ran with stellar evaluations? What about the lecture you gave at a national conference?

