Check your Sources: Tips for evaluating clinical trial results
Clinical validation is a crucial part of developing and proving the value of new healthcare solutions, but not all studies are created equal.
Clinical trials sometimes get a bad rap, and for good reason: their nuances can be hard to understand. As a clinician, I’ve been trained to read and interpret clinical studies, which makes it easier to tell good ones from bad.
Here are a few tips to help you weigh the importance of a study in your decision-making.
Ask: ‘What is this study trying to prove?’
Every clinical study is designed to prove something. But studies take a long time to prepare and run, and they’re often expensive, so some researchers design them to guarantee results.
Make sure the goal of the study is clear, that it addresses the right questions, and that the outcomes make sense in the real world. If this is not clear, the study designer or the company funding the study should at least be able to tell you why they made those decisions and declare any biases.
As an example, in our clinical studies on hip and knee replacement, here is how we stated the aim of the study:
"This study was designed to compare the clinical outcomes of a [digital] home-based program against conventional home-based rehabilitation after total knee replacement (...). We hypothesize that the clinical outcomes of such a program will be at least similar to those of conventional rehabilitation".
This means that we had a hypothesis that Sword’s program would perform just as well as traditional PT. It ended up performing 30% better!
Look for randomized trials with a control group (RCTs)
The quality of a study depends on its design. In the digital health field, you’ll mainly find two types of studies:
- Observational studies: these follow real patients over time. Some have a control group, others don’t. While they are very useful for broad, directional feedback, many variables can confound the results (especially if there is no control group), so be wary of evidence coming only from these types of studies.
- Experimental studies: sometimes called Randomized Controlled Trials (RCTs), these take the top spot in the hierarchy of scientific evidence. They compare the effects of an intervention against a control group, with random allocation of participants to reduce the chance of bias.
Make sure the control group received active treatment
Not all RCTs are of the same quality. An active control group, where the new intervention is compared against a proven treatment (for example, in-person physical therapy), provides much stronger evidence than a comparison against a partial treatment (for example, education without exercise) or no treatment at all.
At Sword, our clinical studies are designed to show how our solution compares with the gold standard in physical therapy. That is why our clinical trials are always RCTs: the treatment group receives the full Sword treatment, and the control group performs intensive, one-on-one physical therapy - three one-hour sessions per week.
Make sure the study is published in a credible journal
Unfortunately, many journals have a fraudulent or non-existent peer-review process. Look for well-established journals with high relevance in the field, as these offer more guarantees of an independent and thorough peer review. It is not easy to check the credibility of a journal, but a relatively quick way is to use this site to see where the journal sits relative to other journals in its field. The simplest way to gauge a journal’s quality is to look at its quartile (indicated by a colored square and the letter Q). The highest-quality journals have a green square labeled Q1, meaning they’re in the top 25% of all publications in their space. For example, Nature Scientific Reports, where we published the results of our knee replacement study, is in the top quartile (Q1) in its category.
Do the results matter?
Back to square one.
Every clinical study sets out to prove something. But proving it does not mean that it matters clinically.
Many studies claim that a group of patients has improved from baseline by [x], or that there was [y] difference between the two groups. However, this difference may not actually mean anything in the real world.
Statistics can be deceptive. For example, in our knee replacement study, we found a 4.9-second difference in a functional test of the lower limbs (called Timed Up and Go) between the Sword group and the conventional PT group, favoring our group. Is this clinically meaningful? Yes, because the Minimal Clinically Important Difference (MCID) for this test is 2.27 seconds.
Providing context is crucial here - so make sure the authors made this clear, and if you’re unsure, check with them.
Need help deciphering a study?
Our clinical team is happy to help you understand the results of any clinical study. Email the study to clinical@swordhealth.com and we’ll help you get to the bottom of it.
Check out our clinical studies:
- Digital rehabilitation vs conventional PT after total knee replacement
- Digital rehabilitation vs conventional PT after total knee replacement: mid-term results
- Digital rehabilitation vs conventional PT after total hip replacement: short and mid-term results
About the author: Dr. Fernando Correia, M.D.
Dr. Fernando Correia is the Chief Medical Officer at SWORD Health, where he leads clinical validation and medical affairs. He is a physician with a specialty in Neurology, and also holds an Executive Master’s degree in Healthcare Management.
He co-founded Sword with the firm belief that technology can lead healthcare into a new era, one where high-quality, evidence-based medicine is available to everyone, not just a select few. He also believes that a more humanistic approach to healthcare is needed, and that technology and the human touch can go hand in hand and make each other better.
Fernando received his M.D. from the University of Coimbra and his Executive Master’s from Católica Porto Business School. He trained in Portugal and in the UK (at the National Hospital for Neurology and Neurosurgery and Great Ormond Street Hospital for Children). He lives in Porto, Portugal with his family, where he enjoys playing tennis, reading all kinds of books and savoring a good glass of wine.