Some build the house. Others ensure the foundation will hold.

Have you ever wondered who checks the checkers?

Who ensures that the methods behind medical decisions are sound?

The Invisible Architects

The Tale of the Bridge Inspector:

A city built a magnificent bridge. Engineers celebrated. Citizens crossed daily. But one woman spent her days beneath, checking welds, measuring stress, documenting cracks.

"Why don't you build bridges?" they asked.

"Because," she said, "I ensure every bridge built will stand."

Methodologists are the inspectors of medical evidence. They don't conduct the trials—they ensure the trials tell us the truth.
$2.6B
Saved when RECOVERY trial methods proved dexamethasone works
15K+
Excess deaths from flawed aprotinin studies before methods review
85%
Of medical research has methodological limitations

Methods work isn't glamorous. But it's the difference between medicine that heals and medicine that harms.

Consider the story of Dr. Doug Altman...

Doug Altman (1948-2018): The Scandal of Poor Medical Research

In 1994, a statistician published an editorial in BMJ titled "The Scandal of Poor Medical Research." He wrote:

"We need less research, better research, and research done for the right reasons."

Doug Altman didn't discover new drugs. He didn't perform surgeries. He spent his career developing reporting guidelines (CONSORT, STROBE, PRISMA) that changed how the world presents evidence.

His citations: 500,000+. His tools are used in every major journal. He built no bridges—but every bridge now uses his inspection standards.
Altman DG. The scandal of poor medical research. BMJ 1994;308:283-4. Citations: 6,000+

What They Do

• Design rigorous study methods

• Develop reporting guidelines

• Create risk of bias tools

• Advance statistical techniques

• Train the next generation

What They Don't Do

• Usually not primary data collection

• Usually not patient care

• Rarely in the headlines

• Often not "first author"

• Not the face of breakthroughs

Methodologists are the architects of evidence, not the builders of individual studies.

The Story of John Snow's Cholera Map:

In 1854, London physicians believed cholera spread through "miasma" (bad air). Their maps were beautiful, showing wind patterns and elevation.

Dr. John Snow created a plain map—just dots showing deaths clustered around the Broad Street pump. His simple, methodologically sound map identified contaminated water as the source.

The miasma maps were elegant but wrong. Snow's plain map saved lives.

Methodology over presentation.

You are a London physician in 1854. Cholera is spreading. What do you choose?

Path A: Follow the miasma theory—create beautiful maps of wind patterns and air quality.
→ Patients die while you study the air. The disease continues unchecked.
Path B: Map the deaths systematically—plot cases and look for patterns in the data.
→ Find the Broad Street pump. Remove the handle. Save lives through methodological thinking.
+340%
Growth in HTA positions since 2015
$95K-$180K
Salary range for senior methodologists
72%
Report high job satisfaction

Demand is rising. Regulatory agencies, pharma, and health systems all need people who understand evidence quality.

The path is clear for those willing to walk it.

Module 1: What Methodologists Do

They do not build the ship. They ensure it will not sink.

Picture a day in the life...

Not one life—many lives, in many settings.

The Methodologist Who Changed Medicine: Archie Cochrane

In 1972, a physician named Archie Cochrane published Effectiveness and Efficiency, arguing that most medical decisions lacked evidence from controlled trials. He challenged the entire medical establishment to prove that treatments actually worked.

Before Cochrane's advocacy, the vast majority of medical treatments had never been tested in randomized controlled trials. By the time the Cochrane Collaboration was founded in 1993, systematic reviews had become the gold standard of evidence-based medicine.

One methodologist's insistence on rigour created the infrastructure that modern medicine now depends on.

You are a young researcher in 1975. A senior professor dismisses your suggestion to conduct a systematic review, saying: "We already know what works from clinical experience."

Path A: Defer to authority—accept that clinical experience is sufficient and abandon the idea of systematic evidence synthesis.
→ Untested treatments continue unchallenged. Patients are harmed by ineffective or dangerous care that "everyone knows works."
Path B: Follow Cochrane's vision—insist on systematically gathering and appraising the evidence, even when it is unpopular.
→ You help build the methodology that transforms medicine globally. Systematic reviews become the foundation of clinical guidelines worldwide.

Setting 1: Academic Medical Center

Dr. Sarah Chen, Associate Professor of Biostatistics
McMaster University, Hamilton, Canada

8:00 AM: Reviews a draft systematic review protocol from a PhD student. Catches a fatal flaw: single-reviewer screening with no duplicate checks. Recommends pilot calibration exercises.

10:00 AM: Teaches GRADE workshop to 40 clinicians. Half are confused by "indirectness." She uses the aspirin-for-headache-vs-heart-attack example. Lightbulbs go on.

2:00 PM: Co-author meeting for a methods paper on network meta-analysis inconsistency. Argues for a new visualization approach.

4:00 PM: Reviews grant application as methodologist. Flags sample size calculation error that would doom a $2M trial.

Impact: Prevented one flawed trial, trained 40 future evidence users, advanced the field.

Setting 2: Health Technology Assessment Agency

Ahmed Al-Rashid, Senior HTA Analyst
Saudi Food and Drug Authority, Riyadh

8:30 AM: Receives manufacturer's dossier for new diabetes drug. 847 pages. His job: assess the evidence quality in 3 weeks.

10:00 AM: Spots a problem: the pivotal trial used a surrogate endpoint (HbA1c) but claims mortality benefit. Surrogate ≠ patient-relevant outcome.

1:00 PM: Meets with clinical experts. "The drug works," they say. "But does it matter to patients?" he asks. Evidence gap identified.

3:00 PM: Drafts assessment report. Recommends conditional approval with real-world evidence collection requirement.

Impact: A nation's drug coverage decision now based on rigorous methods, not marketing claims.

Setting 3: Pharmaceutical Industry

Dr. Maria Santos, Director of Evidence Strategy
Global Pharmaceutical Company, Basel

9:00 AM: Designs Phase III trial protocol. Fights for pragmatic design over explanatory—harder to recruit, but results will apply to real patients.

11:00 AM: Reviews competitor's published trial. Notes: open-label, subjective outcomes, high attrition. Prepares competitive landscape briefing.

2:00 PM: Coaches clinical team on FDA Pre-Submission meeting. "They will ask about your primary endpoint. Have the sensitivity analyses ready."

4:30 PM: Joins global call on network meta-analysis for HTA submissions. Debates transitivity assumptions for 90 minutes.

Impact: Better-designed trials, cleaner regulatory path, evidence that serves patients and business.

Setting 4: WHO/Global Health

Dr. Kwame Asante, Technical Officer
World Health Organization, Geneva

8:00 AM: Morning call with AFRO region. Malaria guideline update needed. Reviews which rapid reviews can be adapted vs. need fresh evidence.

10:30 AM: Guideline Development Group meeting. 14 experts disagree on intervention. His role: ensure the evidence is presented objectively, regardless of who's loudest.

1:00 PM: Drafts GRADE evidence profile for HIV testing recommendation. "Very low certainty" doesn't mean "don't recommend"—he writes the plain-language summary.

4:00 PM: Reviews living systematic review dashboard. Three new RCTs since last month. Triggers update protocol.

Impact: Guidelines used by 194 member states. Methods decisions affect millions.

Design

Creating study protocols, sample size calculations, endpoint selection

Appraise

Risk of bias assessment, GRADE ratings, critical appraisal

Synthesize

Meta-analysis, network meta-analysis, qualitative synthesis

Develop

New tools, guidelines, reporting standards, software

Teach

Training researchers, clinicians, policymakers

Advise

Consultations on specific studies, grants, submissions

Consider what happens when methods fail...

The Aprotinin Disaster

1993-2007: A drug that should have been stopped years earlier

Aprotinin was used in cardiac surgery to reduce bleeding. Early studies suggested it worked. But methodologists noticed problems:

• Observational studies had severe confounding
• RCTs were too small for safety outcomes
• Meta-analyses pooled incompatible populations

2007: BART trial finally showed increased mortality. Drug withdrawn.

Estimate: 15,000-22,000 excess deaths while methodological concerns were ignored.
Fergusson et al. NEJM 2008. Shaw et al. BMJ 2014.

The RECOVERY Triumph

2020: Methods done right, lives saved

The RECOVERY trial used rigorous methods from day one:

• Pragmatic design embedded in routine care
• Adequate sample size (thousands, not dozens)
• Pre-specified endpoints and analysis plan
• Independent data monitoring committee

Result: Dexamethasone proven in 100 days. Estimated 1 million lives saved globally.

The methodologists who designed this protocol are among the most impactful scientists of our era—yet few know their names.
RECOVERY Collaborative Group. NEJM 2021. Horby P, Landray M (co-chief investigators).

A methodologist's primary role is to:

Scenario
The aprotinin case resulted in an estimated 15,000-22,000 excess deaths. The primary methodological failures were:

Select the most accurate answer:

Methods are not abstract. Methods are life and death.

Module 2: The Skill Stack

A methodologist is not one thing. A methodologist is many things woven together.

What skills separate the methodologist from the researcher?

Not depth in one area—but bridges between many.

1. Statistics & Quantitative Methods

• Regression, survival analysis

• Meta-analysis (pairwise, network)

• Bayesian methods (increasingly)

• Missing data, sensitivity analysis

2. Epidemiology & Study Design

• Bias recognition (selection, information, confounding)

• RCT design and CONSORT

• Observational study limitations

• Causal inference frameworks

3. Domain Expertise

• Clinical knowledge (enough to ask right questions)

• Understanding healthcare systems

• Policy and decision-making context

• Patient-relevant outcomes

4. Communication

• Translating complexity to clarity

• Writing for diverse audiences

• Teaching and mentoring

• Diplomatic disagreement

The Story of DNA's Discovery:

In 1953, discovering DNA's structure required four disciplines:

Rosalind Franklin's X-ray crystallography revealed the helix shape.
Erwin Chargaff's chemistry showed base-pairing rules.
Linus Pauling's molecular model-building suggested the helical form.
Watson and Crick's biological insight assembled the pieces.

No single specialist could have solved it alone. The double helix emerged only when disciplines converged.

A methodologist is the translator between specialists. Not the deepest in any one field, but fluent enough in all.

You have crucial X-ray data on DNA structure. What do you choose?

Path A: Work in isolation—perfect your analysis alone, guard your data carefully.
→ Others see your data through back channels, publish first. Your contribution is footnoted in history.
Path B: Collaborate across disciplines—share insights, engage with biologists and chemists.
→ Collective discovery. The breakthrough emerges from synthesis. Science advances together.

Core Competencies (Must Have)

Regression fundamentals: Linear, logistic, interpretation of coefficients

Survival analysis: Kaplan-Meier, Cox regression, hazard ratios

Meta-analysis: Fixed/random effects, heterogeneity (I², τ²), forest plots

Missing data: MCAR/MAR/MNAR, multiple imputation basics
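The fixed/random-effects machinery above can be sketched in a few lines. This is an illustrative Python implementation of inverse-variance pooling with the DerSimonian-Laird estimate of τ² and the I² statistic; the study effects and standard errors are made-up numbers, not real trial data (dedicated packages like metafor handle the real thing).

```python
import math

# Hypothetical per-study effects (log odds ratios) and standard errors.
effects = [-0.60, -0.05, -0.45, 0.20, -0.30]
ses     = [ 0.15,  0.20,  0.25, 0.18,  0.22]

# Fixed-effect (inverse-variance) pooling.
w_fixed = [1 / se**2 for se in ses]
pooled_fixed = sum(w * y for w, y in zip(w_fixed, effects)) / sum(w_fixed)

# DerSimonian-Laird estimate of between-study variance tau^2.
k = len(effects)
Q = sum(w * (y - pooled_fixed)**2 for w, y in zip(w_fixed, effects))
C = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (Q - (k - 1)) / C)

# I^2: share of total variability attributable to heterogeneity.
I2 = max(0.0, (Q - (k - 1)) / Q) * 100 if Q > 0 else 0.0

# Random-effects pooling: tau^2 is added to each study's variance,
# so extreme studies get relatively less weight than under fixed effect.
w_rand = [1 / (se**2 + tau2) for se in ses]
pooled_rand = sum(w * y for w, y in zip(w_rand, effects)) / sum(w_rand)
se_rand = math.sqrt(1 / sum(w_rand))
ci = (pooled_rand - 1.96 * se_rand, pooled_rand + 1.96 * se_rand)

print(f"Fixed effect:   {pooled_fixed:.3f}")
print(f"tau^2 = {tau2:.4f}, I^2 = {I2:.1f}%")
print(f"Random effects: {pooled_rand:.3f} (95% CI {ci[0]:.3f} to {ci[1]:.3f})")
```

Note how the random-effects interval is wider than the fixed-effect one whenever τ² > 0: heterogeneity buys you honesty at the price of precision.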

Growth Areas (Increasingly Valuable)

• Network meta-analysis and indirect comparisons

• Bayesian approaches (priors, credible intervals)

• Machine learning for prediction models

• Causal inference methods (IPW, G-methods)

The Story of John Ioannidis

From Clinician to Methods Revolutionary

John Ioannidis trained as an infectious disease physician. He could have spent his career treating patients.

Instead, in 2005, he published "Why Most Published Research Findings Are False"—a statistical argument that most claimed research results are wrong.

The paper has been cited over 15,000 times. It changed how journals, funders, and researchers think about evidence.

He didn't need to be the world's best statistician. He needed to understand statistics well enough to ask the question no one else was asking.
Ioannidis JPA. PLoS Medicine 2005;2(8):e124. Most-accessed article in journal history.

Selection Bias

Who enters the study ≠ target population

Example: Trials excluding elderly patients, then applied to nursing homes

Information Bias

Measurement error, recall bias, detection bias

Example: Unblinded outcome assessors rating subjective endpoints

Confounding

Third variable creates spurious association

Example: Vitamin D and COVID—sick people stay indoors

The Methodologist's Question

"Which biases could explain this finding—and how much would they need to explain?"

The Story of Coffee and Heart Disease:

For decades, observational studies showed coffee drinkers had more heart disease. Guidelines warned against coffee.

Then methodologists noticed: coffee drinkers also smoked more. When studies adjusted for smoking, the association vanished.

Later studies, with better methods, showed coffee might actually be protective.

The methodologist's job is not to declare truth—it is to ask: "What else could explain this?"
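The coffee-and-smoking pattern can be reproduced with a toy calculation. The cohort below is entirely invented: coffee drinkers smoke more, smoking drives the disease, and coffee itself does nothing. The crude comparison makes coffee look harmful; stratifying by smoking makes the association vanish.

```python
# Hypothetical cohort. Each row: (drinks_coffee, smokes, cases, n).
# Coffee drinkers are mostly smokers; disease risk depends only on smoking.
rows = [
    (True,  True,  240, 800),
    (True,  False,  10, 200),
    (False, True,   60, 200),
    (False, False,  40, 800),
]

def risk(subset):
    return sum(r[2] for r in subset) / sum(r[3] for r in subset)

# Crude (unadjusted) comparison: coffee appears to harm.
coffee    = [r for r in rows if r[0]]
no_coffee = [r for r in rows if not r[0]]
crude_rr = risk(coffee) / risk(no_coffee)
print(f"Crude risk ratio: {crude_rr:.2f}")          # prints 2.50

# Stratify by the confounder: the association disappears.
for smokes in (True, False):
    rr = (risk([r for r in rows if r[0] and r[1] == smokes])
          / risk([r for r in rows if not r[0] and r[1] == smokes]))
    print(f"Smoker={smokes}: stratum risk ratio {rr:.2f}")  # prints 1.00
```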

50%
Of methodologist time spent communicating with non-methodologists
1
Slide that explains GRADE better than 100 papers
Impact multiplier when clinicians understand methods
The test: Can you explain "very low certainty evidence" to a patient in one sentence?

"We're not confident in these results—the true effect could be quite different from what studies showed."
Decision tree: Can you explain a confidence interval to a clinician? If yes, can you identify 3 biases in an observational study? If yes to both, you're ready with foundation in place. If you can't identify biases, take a study design course. If you can't explain confidence intervals, take a biostatistics course first.

The four pillars of methodologist skills are:

Scenario
A study shows that people who eat organic food have lower cancer rates. The study is observational with 50,000 participants. What should a methodologist's first question be?

Select the best methodological question:

Skills are the foundation. Now—where will you build?

Module 3: Career Paths

Many roads lead to methods. Each road shapes what you become.

Where do methodologists work?

The answer may surprise you.

Academia

Universities, research institutes

$80K-$180K

HTA Agencies

NICE, CADTH, IQWIG, regional authorities

$70K-$140K

Cochrane/JBI

Evidence synthesis organizations

$65K-$120K

WHO/Global Health

International organizations, NGOs

$90K-$160K

Pharmaceutical

Industry evidence teams, CROs

$100K-$220K

Consulting

Independent or boutique firms

$120K-$300K+

Salary ranges approximate, USD, vary by region and seniority. Source: Industry surveys 2023-2024.

Path 1: Academia

The Profile

Day-to-day: Teaching, research, supervision, grant writing, committee work
Autonomy: High (you choose your questions)
Stability: Variable (tenure track vs. soft money)
Impact timeline: Long (publications take years)

Who thrives here: Those who love teaching, want research freedom, can tolerate slow pace and academic politics

Watch out for: Publish-or-perish pressure, grant uncertainty, limited practical application
Academic Methodologist, 15 years
"I've published 300 papers. Maybe 10 changed practice. But the students I've trained? They're changing practice every day."

Path 2: HTA Agencies

The Profile

Day-to-day: Reviewing manufacturer submissions, writing assessment reports, guideline development
Autonomy: Medium (structured processes)
Stability: High (government/quasi-government)
Impact timeline: Immediate (decisions affect coverage)

Who thrives here: Those who want direct policy impact, enjoy structured work, comfortable with accountability

Watch out for: Political pressure, tight deadlines, less research creativity
NICE Senior Analyst, UK
"Every report I write might determine whether 100,000 patients get a treatment. That focus keeps me sharp."

Path 3: WHO & Global Health

The Profile

Day-to-day: Guideline development, evidence reviews, technical assistance to countries
Autonomy: Medium (diplomatic constraints)
Stability: Medium (contract-based, but renewable)
Impact timeline: Variable (guidelines can take years; emergency response is immediate)

Who thrives here: Those who want global impact, enjoy cross-cultural work, can navigate bureaucracy

Watch out for: Slow decision-making, political considerations, frequent travel
WHO Technical Officer, Geneva
"Our malaria guideline is used by 40 countries. But getting 15 experts to agree on one sentence can take six months."

Path 4: Pharmaceutical Industry

The Profile

Day-to-day: Trial design, HTA submissions, competitive intelligence, medical affairs support
Autonomy: Medium (business objectives)
Stability: Medium (restructuring common)
Impact timeline: Medium (product lifecycle)

Who thrives here: Those who want high compensation, enjoy fast pace, comfortable with commercial context

Watch out for: Conflicts of interest perception, pressure to "find the right answer," less publication freedom
Director of Evidence Strategy, Global Pharma
"Yes, we want our drugs approved. But a bad trial design that fails Phase III costs $500M. Rigor is good business."
Decision tree: Is direct policy impact more important than research freedom? If impact, choose between global scope (WHO/NGO) or national focus (HTA agency). If freedom, choose between compensation priority (Pharma/Consulting) or autonomy priority (Academia).

The Story of Three Methodologists:

Doug Altman spent 40 years at Oxford, slowly building CONSORT, PRISMA, and reporting standards—his impact spans decades. David Sackett left academia to practice evidence-based medicine directly with patients in Hamilton—immediate clinical utility. Hans Rosling used his epidemiology training to create Gapminder—reaching millions with data visualization.

Same training, three paths.

Altman the canyon (deep, slow impact), Sackett the mill (daily clinical use), Rosling the delta (wide public reach).

You've developed expertise in evidence synthesis. Choose your path:

Path A (Academia): Like Altman—build standards, shape methodology over 40 years.
→ CONSORT, PRISMA, EQUATOR. Deep, slow, lasting impact. You change how science is reported.
Path B (Clinical Practice): Like Sackett—bring evidence directly to patient care.
→ Immediate clinical utility. Every patient benefits. You see the impact daily.
Path C (Public Communication): Like Rosling—translate data for millions.
→ Gapminder, TED talks, global reach. You change how the world sees data.

Years 1-5: Foundation

Graduate degree (MPH, MSc, PhD). First job often academic or junior HTA role.

Years 5-10: Specialization

Develop niche expertise. May move sectors. Build publication record and network.

Years 10-15: Leadership

Senior roles. Often consulting on the side. Invited expert for guidelines.

Years 15+: Influence

Shape the field. Write the textbooks. Train the next generation.

Most senior methodologists have worked in 2-3 sectors. Mobility is an asset.

Which career path typically offers the highest compensation but requires comfort with commercial context?

The path is chosen. Now—how do you become visible?

Module 4: Building Your Portfolio

In a field of invisible architects, how do you become known?

A methodologist without visibility is a methodologist without impact.

Your work must speak—but first, people must hear it.

The Tale of the Unknown Statistician

The Methodologist Who Changed Medicine—Anonymously:

In 1996, a statistician named David Moher led a group to create CONSORT—guidelines for reporting randomized trials.

He wasn't famous. He wasn't at Harvard. He was a methodologist at a Canadian research institute.

Today, CONSORT is required by 600+ journals. It's cited 35,000+ times. It's arguably saved more lives than most drugs—by ensuring we know which drugs actually work.

Moher built his reputation not through brilliance alone, but through relentless, visible contribution to a problem everyone faced.

1. Publications

• Methods papers in good journals

• Collaborative applied papers (methods role)

• Commentaries and letters

• Tutorials and how-to guides

2. Tools & Resources

• Software packages (R, Stata)

• Checklists and reporting guidelines

• Teaching materials (open access)

• Calculators and decision aids

3. Presence

• Conference presentations

• Workshops and training

• Social media (X/Twitter, LinkedIn)

• Blog posts and newsletters

4. Network

• Cochrane/Campbell involvement

• Guideline panel memberships

• Journal editorial boards

• Mentorship relationships

The Power of Tools

metafor
R package by Wolfgang Viechtbauer. 10,000+ citations.
RoB 2
Risk of bias tool. Standard for Cochrane reviews worldwide.
GRADE
Framework by Guyatt et al. Used by 100+ organizations.
Tools scale. A paper is read once. A tool is used thousands of times.

If you can solve a recurring problem and package it well, your impact multiplies.

The Methods Paper Formula

Problem: Identify a recurring methodological challenge

Solution: Propose a clear, implementable approach

Demonstration: Apply it to real data or simulation

Accessibility: Provide code, tools, or templates

Validation: Show it works better than status quo

Target journals: BMJ, JAMA, Annals of Internal Medicine (methods sections), Statistics in Medicine, Research Synthesis Methods, Journal of Clinical Epidemiology

The Collaboration Multiplier

How Georgia Salanti Built a Methods Empire:

Georgia Salanti didn't just write about network meta-analysis—she collaborated with clinical teams who needed it.

Each collaboration produced: (1) an applied paper solving a clinical question, (2) methods insights for her next methods paper, (3) a grateful colleague who recommended her to others.

Her network grew exponentially. Now she leads one of the most productive evidence synthesis groups in Europe.

Lesson: Don't wait to be asked. Offer your methods expertise to teams with important clinical questions.
Salanti G, University of Bern. 500+ publications. Developed CINeMA framework for NMA confidence.

What Works

• Sharing useful content (tutorials, explainers)

• Engaging with methods debates respectfully

• Celebrating others' work

• Making complex ideas accessible

• Showing your learning journey

What Doesn't

• Self-promotion without value

• Attacking others' methods publicly

• Jargon-heavy posts

• Inconsistent presence

• Ignoring engagement

Best methodologist accounts: @EpiEllie, @statsepi, @CochraneCollab, @GRADEWorkingGrp

The Story of Rosalind Franklin:

Rosalind Franklin's X-ray photograph (Photo 51) was crucial to discovering DNA's structure. But she published cautiously and didn't self-promote.

Watson and Crick saw her data, published first, and won the Nobel Prize. Franklin died at 37, largely uncredited.

Decades later, her notebooks revealed she was weeks from solving the structure herself. Brilliant work, shared too quietly, was overshadowed.

Impact requires not just discovery, but communication.

You've made a crucial discovery. What do you choose?

Path A: Publish cautiously—wait for perfect data, avoid self-promotion.
→ Others see your work, publish first. You're remembered as a footnote, vindicated only decades later.
Path B: Share findings actively—engage the community, present at conferences, collaborate openly.
→ Risk being scooped, but ensure your contribution is visible. Credit follows communication.

The most scalable way for a methodologist to increase impact is:

Visibility without substance is noise. Substance without visibility is silence.

Module 5: The Toolkit

A craftsman is only as good as their tools—and their mastery of them.

What software do methodologists use?

What resources shape their thinking?

R (Essential)

metafor: Gold standard for meta-analysis

netmeta: Network meta-analysis

robvis: Risk of bias visualization

tidyverse: Data manipulation

Free, open source, reproducible

Stata (Common)

metan/admetan: Meta-analysis

network: NMA

• Strong in survival analysis

• Preferred by some HTA agencies

Commercial license required

RevMan (Cochrane)

• Official Cochrane tool

• User-friendly interface

• Limited customization

• Good for standard reviews

Free for Cochrane reviews

Python (Growing)

PyMeta: Meta-analysis

• ML integration for screening

• Text mining capabilities

• Automation scripts

Free, good for AI integration

Covidence

Screening, extraction, collaboration

$$

Rayyan

AI-assisted screening, free tier

Free

ASReview

Active learning screening

Free

EPPI-Reviewer

Full workflow, text mining

$$

DistillerSR

Enterprise, AI features

$$$

SysRev

Open source option

Free

Essential Resources

The Methodologist's Bookshelf:

Cochrane Handbook: The gold standard reference. Free online. Read chapters 6-10 closely.

Cochrane Handbook editions (Higgins & Green, now Higgins et al.): Updated regularly. Your reference for any SR question.

Borenstein et al. "Introduction to Meta-Analysis": Best statistical foundation.

Guyatt et al. JAMA Users' Guides: Clinical epidemiology essentials.

IOM "Finding What Works in Health Care": Standards for systematic reviews.

Read one chapter per week. In two years, you'll have a graduate education in methods.

Cochrane

Largest SR producer. Training. Methods groups. Volunteer opportunities.

Campbell Collaboration

Social science SRs. Growing. Less competitive to join.

JBI (Joanna Briggs)

Nursing/allied health focus. Strong training program.

GRADE Working Group

Certainty assessment. Influential. Membership by invitation.

EQUATOR Network

Reporting guidelines hub. Guideline development opportunities.

HTAi

HTA professional society. Annual conference. Interest groups.

The Story of Florence Nightingale's Graphics:

Florence Nightingale had basic statistical training—far less than university mathematicians of her era. But she mastered one tool: the polar area diagram.

Her "coxcomb" charts showed that soldiers died more from infection than battle wounds. Parliament, which ignored tables of numbers, understood her graphics instantly.

She didn't have the best tools. She mastered what she had, and changed military medicine forever.

You have data showing soldiers die from infection, not battle wounds. Parliament must act. What do you choose?

Path A: Present tables of numbers—the data speaks for itself, show the raw statistics.
→ Parliament ignores you. Numbers without visualization are invisible to busy politicians.
Path B: Create visual diagrams—transform data into pictures that tell the story at a glance.
→ Parliament understands instantly. Policy changes. Military hospitals are reformed. Lives are saved.

The most widely used R package for meta-analysis is:

Tools are learned. Now—where will you specialize?

Module 6: Finding Your Niche

The generalist knows something about everything. The specialist knows everything about something. The methodologist knows which questions to ask about both.

You cannot master all methods.

But you can become the world expert in one.

Network Meta-Analysis

Indirect comparisons, complex interventions

High demand

Diagnostic Test Accuracy

Sensitivity/specificity, SROC curves

Specialized

IPD Meta-Analysis

Individual patient data, advanced stats

Growing

Living Systematic Reviews

Continuous updating, automation

Emerging

Qualitative Synthesis

Meta-ethnography, framework synthesis

Underserved

Risk of Bias Tools

Tool development, validation

Influential

Reporting Guidelines

CONSORT, PRISMA extensions

High impact

AI/ML in Evidence Synthesis

Automation, screening, extraction

Frontier

Deep Dive: Network Meta-Analysis

Why This Niche?

The problem: Policymakers need to compare 10 drugs. We have trials of A vs B, B vs C, C vs D... but not A vs D directly.

NMA solves: Borrows strength across the network to estimate all comparisons.

The opportunity: HTA agencies require NMA for reimbursement decisions. Pharma needs NMA experts. Few people truly understand it.

The barrier: Requires Bayesian statistics, graph theory, and deep understanding of transitivity. 2-3 year learning curve.

Key names: Salanti, Caldwell, Dias, Welton. Key resource: NICE DSU Technical Support Documents.
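The simplest form of "borrowing strength" is the Bucher adjusted indirect comparison: estimate A vs C by chaining A vs B and B vs C through the common comparator. The numbers below are invented for illustration; the arithmetic (add the effects, add the variances) is standard.

```python
import math

# Hypothetical trial results on the log odds ratio scale.
d_ab, se_ab = -0.40, 0.15   # A vs B trial
d_bc, se_bc = -0.20, 0.18   # B vs C trial

# Bucher adjusted indirect comparison of A vs C via common comparator B.
# Valid only under transitivity: the A-B and B-C trials must be similar
# enough that "B" means the same thing in both.
d_ac = d_ab + d_bc
se_ac = math.sqrt(se_ab**2 + se_bc**2)
lo, hi = d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac

print(f"Indirect A vs C: {d_ac:.2f} (95% CI {lo:.2f} to {hi:.2f})")
# The price of indirectness: se_ac exceeds either direct standard error.
```

Full network meta-analysis generalizes this chaining across an entire network of comparisons, which is why transitivity assumptions get debated for 90 minutes at a time.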

Deep Dive: AI/ML in Evidence Synthesis

Why This Niche?

The problem: Screening 50,000 abstracts by hand takes months. Extracting data from 200 papers is tedious and error-prone.

AI/ML offers: Active learning for screening (50-90% workload reduction). NLP for data extraction. GPT for protocol drafting.

The opportunity: Field is new. Standards don't exist yet. Whoever writes the validation frameworks shapes the field.

The barrier: Requires both ML knowledge AND deep SR methods understanding. Most ML people don't know SR; most SR people don't know ML.

Key names: Marshall, Wallace, Thomas. Key tools: ASReview, RobotReviewer, Trialstreamer.
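The active-learning idea behind these tools can be shown with a toy sketch: rank unscreened abstracts by similarity to those already judged relevant, screen the top-ranked one, update, repeat. Everything here is invented (five fake abstracts, word-overlap as the "model"); real tools such as ASReview use trained classifiers, but the prioritization loop is the same.

```python
# Toy prioritized screening via uncertainty-free "relevance feedback":
# screen the unscreened abstract most similar to the known-relevant set.
abstracts = {
    1: "randomized trial of dexamethasone in severe covid pneumonia",
    2: "steroid therapy randomized controlled trial respiratory infection",
    3: "qualitative study of nurse experiences in oncology wards",
    4: "corticosteroids versus placebo randomized trial in sepsis",
    5: "survey of hospital food preferences among visitors",
}
truly_relevant = {1, 2, 4}        # hidden ground truth the screener reveals

def score(text, relevant_texts):
    # Crude similarity: total shared words with the relevant set.
    words = set(text.split())
    return sum(len(words & set(t.split())) for t in relevant_texts)

labeled_relevant = [abstracts[1]]  # seed study already known to be relevant
screened = {1}
order = []                         # order in which the rest get screened

while len(screened) < len(abstracts):
    pick = max((i for i in abstracts if i not in screened),
               key=lambda i: score(abstracts[i], labeled_relevant))
    screened.add(pick)
    order.append(pick)
    if pick in truly_relevant:     # the human screener labels it relevant
        labeled_relevant.append(abstracts[pick])

print("Screening order:", order)   # the two relevant studies surface first
```

In this toy run both remaining relevant abstracts are screened before either irrelevant one, which is the "workload reduction" claim in miniature: stop early once the relevant yield dries up.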
Decision tree: Are you drawn to statistical complexity or process improvement? If statistics, are you comfortable with Bayesian methods? If yes, consider NMA or IPD meta-analysis. If not yet, consider DTA or prediction models. If process-focused, do you have tech/coding interest? If yes, consider AI/ML or living systematic reviews. If no, consider guidelines or risk of bias tools.

The Story of Archie Cochrane:

Archie Cochrane could have remained a general physician. Instead, he spent decades asking one question: "Where is the evidence?"

He championed randomized trials when colleagues trusted clinical intuition. He was mocked, ignored, called obsessive. By the time he died in 1988, his "obsession" had become the foundation of evidence-based medicine.

The Cochrane Collaboration bears his name. Deep expertise in one question changed healthcare forever.

Your colleagues trust clinical intuition over trials. You believe evidence should guide practice. What do you choose?

Path A: Accept the status quo—clinical experience is valued, go along with established practice.
→ Medicine remains opinion-based. Treatments continue without evidence. Patients receive care based on tradition, not proof.
Path B: Persist in demanding evidence—keep asking "Where is the proof?" despite mockery.
→ Get called obsessive for decades. Eventually transform medicine. A global collaboration bears your name.

Network meta-analysis is particularly valuable when:

The niche is chosen. Now—learn from those who walked before.

Module 7: Real Journeys

Every methodologist was once a beginner who didn't quit.

How did they actually get here?

The paths are rarely straight.

Journey 1: The Clinician Who Asked Why

Gordon Guyatt: From Physician to GRADE Creator

Gordon Guyatt was a practicing internist in the 1980s. He noticed that senior physicians made confident pronouncements—but their evidence was weak.

The turning point: He coined "Evidence-Based Medicine" in 1991. But he didn't stop there. He realized clinicians needed frameworks to assess evidence quality.

The path: MD → Clinical practice → Frustration with dogma → Methodological training → GRADE working group leadership

Today: GRADE is the global standard for certainty assessment. Used by WHO, Cochrane, 100+ organizations.

Lesson: Clinical frustration can become methodological innovation.

Journey 2: The Statistician Who Saw Chaos

Doug Altman: From Numbers to Standards

Doug Altman trained as a statistician, not a clinician. But when he looked at medical journals, he saw statistical chaos—wrong tests, misreported results, impossible numbers.

The turning point: His 1994 "Scandal of Poor Medical Research" editorial in BMJ. He could have complained. Instead, he built solutions.

The path: Statistics degree → Medical statistics unit → Frustration with reporting → CONSORT, STROBE, PRISMA, EQUATOR Network

Today: His reporting guidelines changed scientific communication. Cited 500,000+ times across his work.

Lesson: Identify a widespread problem, then systematically solve it.

Journey 3: The Outsider Who Found a Gap

From PhD Student to NMA Pioneer

Georgia Salanti was a PhD student in Greece when network meta-analysis (NMA) was a niche curiosity. Few people understood it. Fewer could teach it.

The turning point: She recognized that NMA would become essential for health technology assessment (HTA). She positioned herself at the frontier.

The path: Statistics PhD → Postdoc at Cambridge → Deep NMA focus → Own research group at Bern → CINeMA framework developer

Today: One of the most cited NMA methodologists globally. Consulted by WHO, NICE, and pharmaceutical companies.

Lesson: Find the emerging method that everyone will need—and master it before they do.

Journey 4: The Regional Pioneer

Building Methods Capacity in the Middle East

In 2010, systematic review methods training in the Middle East and North Africa (MENA) region was scarce. Most methodologists were trained in Europe or North America.

The pioneers: Clinicians and researchers who sought training abroad, then returned to build local capacity.

The path: Local medical degree → International methods fellowship → Return home → Establish training programs → Become regional experts

Today: Jordan, Lebanon, Iran, Saudi Arabia, and Oman have growing evidence synthesis capacity. Regional Cochrane branches are emerging.

Lesson: Methods expertise is portable. Bringing it home multiplies your impact.

They All Had...

• A frustration that became a mission

• Willingness to learn continuously

• Collaborators who believed in them

• Persistence through rejection

• A focus that deepened over time

They Didn't Have...

• Perfect pedigrees

• Clear paths from the start

• Instant recognition

• Freedom from setbacks

• All skills from day one

The Story of Ignaz Semmelweis:

In 1847, Ignaz Semmelweis discovered that handwashing could reduce maternal mortality from 18% to 2%. The medical establishment rejected him.

He was fired, mocked, and eventually committed to an asylum, where he died at 47. It took 20 more years for germ theory to vindicate him.

Semmelweis didn't see his impact. But every surgeon who scrubs today follows his path. The river reached the sea—after his death.

You've discovered handwashing saves lives, but colleagues reject it. What do you choose?

Path A: Give up—accept rejection, stop fighting the establishment.
→ Die forgotten and heartbroken. Be vindicated posthumously, but never see your impact.
Path B: Document everything—persist despite rejection, ensure the evidence survives.
→ Even if rejected in your lifetime, your documented evidence survives. Future generations inherit the truth.

Best Entry Points

• Research Assistant on systematic review teams

• Junior HTA Analyst at national/regional agencies

• Cochrane/JBI Fellow (competitive but career-defining)

• Postdoc with an established methodologist

What Employers Want

• Evidence of SR/MA experience (even one published)

• R or Stata proficiency

• GRADE or risk of bias training certificates

• Strong writing samples

Pro tip: Volunteer as a screener on a Cochrane review. It's free training and can earn you a publication credit.

The Reading List: Priority Order

Start Here (Months 1-3):
1. Cochrane Handbook — Chapters 6-10 (free online)
2. Borenstein "Introduction to Meta-Analysis" — Statistical foundation

Then (Months 4-6):
3. Guyatt's JAMA Users' Guides — Clinical epidemiology
4. GRADE Handbook — Certainty assessment (free online)

Specialize (Months 7-12):
5. NICE DSU Technical Support Documents — For NMA
6. Ioannidis papers collection — Meta-research classics

One chapter per week = graduate-level education in 2 years.

Regional Pathway: MENA & Global South

Building Methods Capacity in the Gulf & Middle East:

The Challenge: Limited local training programs. Most methodologists are trained abroad.

The Opportunity: Growing demand. The Saudi FDA, Qatar's PHCC, and the UAE DOH are all building HTA capacity. First-mover advantage is real.

Pathway:
• Complete international training (McMaster, Oxford, JBI)
• Return with certification and network
• Partner with regional institutions
• Become the local expert others consult

Regional conferences: ISPE Middle East, Dubai Health Forum, Gulf HTA Network meetings.

Training Programs

• McMaster GRADE/EBM (Essential): Gold standard. Online + in-person options.

• Cochrane Training (Free): Free online modules. Certificate tracks available.

• JBI Programs (Respected): Comprehensive SR training. Strong in qualitative methods.

• Oxford CEBM (Prestigious): Short courses and MSc options.

• Coursera/edX (Affordable): Epidemiology and biostatistics foundations.

• AHRQ EPC Fellowships (Elite): US-based. Competitive but career-defining.

You have seen the journeys. Now—are you ready to begin yours?

Final Assessment

You have journeyed through the world of methods.

The Five Principles of the Methodologist

1. Rigor is not optional—it is the foundation of trust.

2. Invisible work has visible consequences.

3. Depth in one area enables breadth in impact.

4. Tools serve methods; methods serve truth.

5. The methodologist asks: "What else could explain this?"

Doug Altman's CONSORT guidelines have been cited over:

The aprotinin case (15,000-22,000 excess deaths) illustrates:

When evaluating an observational study showing coffee drinkers have more heart disease, a methodologist's first question should be:

The most scalable way for a methodologist to multiply their impact is:

The career path typically offering highest compensation but requiring comfort with commercial context is:

Commit to your development. Check off each milestone as you complete it:

You have completed the journey.

Go forth and build the foundations others will stand upon.

Remember:
Methods are invisible. Impact is not.
Tools are learned. Judgment is earned.
The path less traveled is less crowded for a reason. Walk it anyway.