What the heck is a systematic review?
(And why should social workers care?)
“Any one study can be flawed,” says Joanne Yaffe, a professor at the University of Utah College of Social Work. “That’s why systematic reviews are becoming recognized as probably the best source of information about research, both for practitioners and for other researchers.”
Dr. Yaffe appreciates this rigorous, comprehensive type of research because it produces such a high-quality synthesis of information. “Systematic reviews are a way to sift through all the studies on a particular topic,” she says. “It’s not just the high-profile study sitting on your desk, but also the unpublished study done by a lowly social worker somewhere in England.”
Systematic reviews, Dr. Yaffe explains, start by developing a protocol around a carefully defined research question. “You explain everything you’re about to do: how you’ll find the studies, how you’ll select which studies to include, how you’ll pull the data from those selected studies, and how you’ll treat those extracted data.” The protocol is published ahead of time and you’re held to the standard of your protocol, explains Dr. Yaffe. “If you deviate from that standard, you have to explain why you deviated.” This carefully thought-through process diminishes the chances that the results will be influenced by a researcher’s own biases.
Once the protocol has been established, the search for studies begins with researchers reviewing the titles and abstracts of every available study that could be relevant. “What makes a systematic review good is that it looks at all the studies,” says Dr. Yaffe. And when she says, “all the studies,” she means it—published, unpublished, dissertations, and studies in foreign languages. “The reason that’s important is because of publication bias. Studies that are published tend to have larger, more positive effect sizes than studies that remain unpublished. If you skip unpublished research, you might be missing studies that give you a fuller, rounder picture of what’s happening.”
“If a study looks like it might be a fit, then you look at the full text of the study,” says Dr. Yaffe. As a measure to prevent the influence of bias, two people independently select the studies and, if there’s disagreement, there’s a negotiation process, which is specified ahead of time. There must be a consensus about a study for it to be included.
A systematic review not only includes a list of the studies that were selected for inclusion, but also a list of the studies that were excluded during the full-text stage, along with an explanation about their exclusion. Two of the strengths of a systematic review are its transparency and reproducibility.
With the studies selected, researchers then extract the previously specified data from those studies. What’s the population? How many people participated? What kind of intervention did they use? What were the results? As with selection, two people independently extract the data to minimize bias.
Then, two people do a systematic appraisal of the risk of bias within those studies. “Basically,” says Dr. Yaffe, “is the study well done and well reported, or is it junk?” Researchers also look across the studies and evaluate the risk of bias. Inclusion of these different examinations of risk of bias helps temper the conclusions of the systematic review.
Finally, researchers are ready to run the statistics. Dr. Yaffe notes that it is important to look at the type of data extracted when determining the best, most informative way to display the data.
Many systematic reviews are accompanied by meta-analysis, which is a statistical averaging of the effect size across a variety of studies that are essentially asking the same question. But meta-analysis isn’t always advisable, depending on the diversity or number of the studies included. Dr. Yaffe explains, “A meta-analysis conducted without a systematic review runs the risk of being biased, because you don’t know how those studies were collected.”
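The “statistical averaging” at the heart of a meta-analysis is usually an inverse-variance weighted mean: studies with more precise estimates count for more. Here is a minimal sketch in Python of one common approach, the fixed-effect model (random-effects models are also widely used); the study numbers are made up for illustration, not drawn from any real review.

```python
import math

def fixed_effect_pool(effects, variances):
    """Pool per-study effect sizes using inverse-variance weights
    (fixed-effect model). Returns the pooled effect and its standard error."""
    weights = [1.0 / v for v in variances]          # precise studies weigh more
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))              # SE of the pooled estimate
    return pooled, se

# Three hypothetical studies: standardized mean differences and their variances
effects = [0.30, 0.55, 0.42]
variances = [0.02, 0.05, 0.03]

pooled, se = fixed_effect_pool(effects, variances)
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se  # approximate 95% CI
```

Notice that the smallest-variance study (the first) pulls the pooled estimate toward its own result, which is exactly why publication bias matters: if the missing unpublished studies have smaller effects, the weighted average is inflated.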
“You want to push the field a little bit further by talking about where the gaps are in the research or the limitations of the research. But you also want to talk about what it means practically,” says Dr. Yaffe.
“We have more information coming out every single day in more journals than anybody could possibly keep track of,” says Dr. Yaffe. “We need a way to sift through that information. If we want high-quality synthesis of information, then we want to have high standards for the way the synthesis is conducted. That’s where systematic reviews come in.”