So a lot of people equate meta-analysis with systematic reviews, but

it's actually different.

Not all systematic reviews will have a meta-analysis;

meta-analysis is really a component of a systematic review

where you have enough data to combine the results statistically.

And here are two classic definitions of meta-analysis.

It is the statistical analysis of a large collection of analysis results from

individual studies for the purpose of integrating the findings.

Or alternatively, a statistical analysis which combines the results of

several independent studies considered by the analyst to be combinable.

Personally, I like the second definition better, because you have to decide,

as a systematic reviewer, whether the studies are similar enough

that you can combine them in your meta-analysis.

Most meta-analyses are presented in a forest plot.

And here is one example of a forest plot.

Here we have five studies.

And each line, with the square at its center, represents the results from one study.

And the size of the square

is proportional to the weight that each study carries in the meta-analysis.

The larger the square, the more weight the study carries.

And we also have the two lines extending from the square, which show you

the confidence interval for each study.

If you look down at the bottom of the plot, you will see a diamond, a blue diamond.

That's where the meta-analytical effect lies: the center of the diamond is

the point estimate, and its width is the confidence interval for the meta-analysis.
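The weights and the diamond described above come from inverse-variance pooling. Here is a minimal sketch of a fixed-effect meta-analysis on the log risk ratio scale, using five made-up studies (the numbers are illustrative only, not the studies on the slide):

```python
import math

# Hypothetical log risk ratios and standard errors for five studies
# (illustrative values, not taken from the lecture's example).
log_rr = [-0.35, -0.10, -0.25, 0.05, -0.40]
se = [0.20, 0.15, 0.30, 0.25, 0.18]

# Inverse-variance weights: more precise studies (smaller SE) get
# bigger squares on the forest plot.
weights = [1 / s**2 for s in se]

# Fixed-effect pooled estimate: the center of the diamond.
pooled = sum(w * y for w, y in zip(weights, log_rr)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

# 95% confidence interval: the tips of the diamond (on the log scale).
lo = pooled - 1.96 * pooled_se
hi = pooled + 1.96 * pooled_se
print(f"Pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f} to {math.exp(hi):.2f})")
```

Note that the pooled confidence interval is narrower than any single study's interval, which is why the diamond is tighter than the individual lines above it.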

Depending on the measure of association you're going to use,

you may have a different scale on the x-axis.

For example, here we're using the risk ratio.

A risk ratio of one is the line of no effect.

And you can label your figure such that, if the diamond

lies on the left of the line of no effect, it favors the treatment.

Or if the diamond lies to the right-hand side of the line of

no effect, it favors the control.

So you can actually show the direction of effect on the same plot.
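To make the "line of no effect" concrete, here is a small sketch of how a risk ratio and its confidence interval might be computed for a single trial. The 2x2 counts are hypothetical, and the CI is built on the log scale, where RR = 1 maps to 0:

```python
import math

# Hypothetical counts for one trial (illustrative only):
# events / total in the treatment and control arms.
events_t, n_t = 12, 100
events_c, n_c = 20, 100

# Risk ratio: risk in treatment arm divided by risk in control arm.
rr = (events_t / n_t) / (events_c / n_c)

# Standard error of log(RR), the usual large-sample formula.
se_log_rr = math.sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)

# 95% CI computed on the log scale, then exponentiated back.
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

# RR < 1 plots to the left of the line of no effect (favoring treatment
# here); RR > 1 plots to the right (favoring control).
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

In this made-up example the point estimate sits left of 1, but the interval crosses the line of no effect, so on a forest plot this study's whiskers would straddle the vertical line.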

This is called a forest plot, and it shows you the meta-analysis results,

as well as the results from individual studies you put into your meta-analysis.

Meta-analysis provides us with statistical methods for

answering: what is the direction of the effect or association?

What is the size of effect?

And is the effect consistent across studies?

You may also want to ask the question, what is the strength of evidence for

the effect.

Assessment of the strength of evidence relies additionally on the judgement of

the study quality, study design, as well as the statistical measure of uncertainty.

Again, a general framework for synthesis or for your meta-analysis

is that you want to answer the question of what is the direction of effect?

What is the size of effect?

And whether the effects are consistent across studies.

What can meta-analysis help you to do?

If you have several studies included in your systematic review, and

they're similar enough, when you put them together in a meta-analysis,

you can determine whether an effect exists in a particular direction.

It helps you to combine the results quantitatively and

obtain a single summary result which is shown as a diamond on the forest plot.

You can also use meta-analysis to investigate heterogeneity,

to examine reasons for different results among studies.

Again, it is very unlikely you will get identical results from studies on

a research question.

But you will be able to look at why they are different using meta-analysis and

related methods.
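One standard way to quantify how different the study results are is Cochran's Q statistic and the I-squared measure derived from it. Here is a minimal sketch using the same kind of hypothetical per-study data as before (illustrative numbers only):

```python
import math

# Hypothetical per-study log risk ratios and standard errors.
log_rr = [-0.35, -0.10, -0.25, 0.05, -0.40]
se = [0.20, 0.15, 0.30, 0.25, 0.18]

w = [1 / s**2 for s in se]
pooled = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)

# Cochran's Q: weighted squared deviations from the pooled estimate.
Q = sum(wi * (yi - pooled)**2 for wi, yi in zip(w, log_rr))
df = len(log_rr) - 1

# I^2: the share of total variability beyond chance (floored at 0).
I2 = max(0.0, (Q - df) / Q) * 100
print(f"Q = {Q:.2f} on {df} df, I^2 = {I2:.0f}%")
```

When Q is no larger than its degrees of freedom, as in this made-up example, I-squared is zero and the studies look consistent; a large I-squared would prompt you to examine why the studies differ, for example with subgroup analyses.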

As I mentioned earlier, I really like the definition where meta-analysis is

defined by the analyst deciding whether the studies are combinable, and here is why.

These are the justifications for combining results.

You have to decide whether studies are estimating, in whole or

in part, a common effect.

This is very important because, as I said,

it is very unlikely you will have two identical studies.

They are similar in some way, and you have to decide if they are similar enough.