Some build the house. Others ensure the foundation will hold.
Have you ever wondered who checks the checkers?
Who ensures that the methods behind medical decisions are sound?
The Invisible Architects
A city built a magnificent bridge. Engineers celebrated. Citizens crossed daily. But one woman spent her days beneath, checking welds, measuring stress, documenting cracks.
"Why don't you build bridges?" they asked.
"Because," she said, "I ensure every bridge built will stand."
Methodologists are the inspectors of medical evidence. They don't conduct the trials—they ensure the trials tell us truth.
The Hidden Impact
Methods work isn't glamorous. But it's the difference between medicine that heals and medicine that harms.
Consider the story of Dr. Doug Altman...
In 1994, a statistician published an editorial in BMJ titled "The Scandal of Poor Medical Research." He wrote:
"We need less research, better research, and research done for the right reasons."
Doug Altman didn't discover new drugs. He didn't perform surgeries. He spent his career developing reporting guidelines (CONSORT, STROBE, PRISMA) that changed how the world presents evidence.
His citations: 500,000+. His tools are used in every major journal. He built no bridges—but every bridge now uses his inspection standards.
What Is a Methodologist?
What They Do
• Design rigorous study methods
• Develop reporting guidelines
• Create risk of bias tools
• Advance statistical techniques
• Train the next generation
What They Don't Do
• Rarely collect primary data themselves
• Rarely provide direct patient care
• Rarely appear in the headlines
• Often not listed as "first author"
• Rarely the public face of breakthroughs
Methodologists are the architects of evidence, not the builders of individual studies.
The Story of John Snow's Cholera Map:
In 1854, London physicians believed cholera spread through "miasma" (bad air). Their maps were beautiful, showing wind patterns and elevation.
Dr. John Snow created a plain map—just dots showing deaths clustered around the Broad Street pump. His simple, methodologically sound map identified contaminated water as the source.
The miasma maps were elegant but wrong. Snow's plain map saved lives.
Methodology over presentation.
You are a London physician in 1854. Cholera is spreading. What do you choose?
The Market Reality
Demand is rising. Regulatory agencies, pharma, and health systems all need people who understand evidence quality.
The path is clear for those willing to walk it.
Module 1: What Methodologists Do
They do not build the ship. They ensure it will not sink.
Picture a day in the life...
Not one life—many lives, in many settings.
The Methodologist Who Changed Medicine: Archie Cochrane
In 1972, a physician named Archie Cochrane published Effectiveness and Efficiency, arguing that most medical decisions lacked evidence from controlled trials. He challenged the entire medical establishment to prove that treatments actually worked.
Before Cochrane's advocacy, the vast majority of medical treatments had never been tested in randomized controlled trials. The Cochrane Collaboration, founded in his honour in 1993, helped establish systematic reviews as the gold standard of evidence-based medicine.
One methodologist's insistence on rigour created the infrastructure that modern medicine now depends on.
You are a young researcher in 1975. A senior professor dismisses your suggestion to conduct a systematic review, saying: "We already know what works from clinical experience."
Setting 1: Academic Medical Center
McMaster University, Hamilton, Canada
8:00 AM: Reviews a draft systematic review protocol from a PhD student. Catches a fatal flaw: they planned single-reviewer screening. Recommends pilot calibration exercises.
10:00 AM: Teaches GRADE workshop to 40 clinicians. Half are confused by "indirectness." She uses the aspirin-for-headache-vs-heart-attack example. Lightbulbs go on.
2:00 PM: Co-author meeting for a methods paper on network meta-analysis inconsistency. Argues for a new visualization approach.
4:00 PM: Reviews grant application as methodologist. Flags sample size calculation error that would doom a $2M trial.
Impact: Prevented one flawed trial, trained 40 future evidence users, advanced the field.
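The kind of check behind that 4:00 PM grant review can be sketched in a few lines. Below is a minimal version of the standard two-proportion sample size calculation (normal approximation, two-sided test); the event rates and settings are hypothetical, chosen only to illustrate the arithmetic.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate sample size per arm for comparing two proportions
    (normal approximation, two-sided test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # 1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)           # 0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Detecting a drop in event rate from 20% to 15%:
n = n_per_group(0.20, 0.15)
print(n)  # 903 patients per arm
```

Getting p1, p2, or the power wrong here is exactly the kind of error that dooms an expensive trial: an underpowered study can neither confirm nor rule out the effect it was built to detect.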
Setting 2: Health Technology Assessment Agency
Saudi Food and Drug Authority, Riyadh
8:30 AM: Receives manufacturer's dossier for new diabetes drug. 847 pages. His job: assess the evidence quality in 3 weeks.
10:00 AM: Spots a problem: the pivotal trial used a surrogate endpoint (HbA1c) but claims mortality benefit. Surrogate ≠ patient-relevant outcome.
1:00 PM: Meets with clinical experts. "The drug works," they say. "But does it matter to patients?" he asks. Evidence gap identified.
3:00 PM: Drafts assessment report. Recommends conditional approval with real-world evidence collection requirement.
Impact: A nation's drug coverage decision now based on rigorous methods, not marketing claims.
Setting 3: Pharmaceutical Industry
Global Pharmaceutical Company, Basel
9:00 AM: Designs Phase III trial protocol. Fights for pragmatic design over explanatory—harder to recruit, but results will apply to real patients.
11:00 AM: Reviews competitor's published trial. Notes: open-label, subjective outcomes, high attrition. Prepares competitive landscape briefing.
2:00 PM: Coaches clinical team on FDA Pre-Submission meeting. "They will ask about your primary endpoint. Have the sensitivity analyses ready."
4:30 PM: Joins global call on network meta-analysis for HTA submissions. Debates transitivity assumptions for 90 minutes.
Impact: Better-designed trials, cleaner regulatory path, evidence that serves patients and business.
Setting 4: WHO/Global Health
World Health Organization, Geneva
8:00 AM: Morning call with AFRO region. Malaria guideline update needed. Reviews which rapid reviews can be adapted vs. need fresh evidence.
10:30 AM: Guideline Development Group meeting. 14 experts disagree on intervention. His role: ensure the evidence is presented objectively, regardless of who's loudest.
1:00 PM: Drafts GRADE evidence profile for HIV testing recommendation. "Very low certainty" doesn't mean "don't recommend"—he writes the plain-language summary.
4:00 PM: Reviews living systematic review dashboard. Three new RCTs since last month. Triggers update protocol.
Impact: Guidelines used by 194 member states. Methods decisions affect millions.
The Core Activities
Design
Creating study protocols, sample size calculations, endpoint selection
Appraise
Risk of bias assessment, GRADE ratings, critical appraisal
Synthesize
Meta-analysis, network meta-analysis, qualitative synthesis
Develop
New tools, guidelines, reporting standards, software
Teach
Training researchers, clinicians, policymakers
Advise
Consultations on specific studies, grants, submissions
Consider what happens when methods fail...
The Aprotinin Disaster
Aprotinin was used in cardiac surgery to reduce bleeding. Early studies suggested it worked. But methodologists noticed problems:
• Observational studies had severe confounding
• RCTs were too small for safety outcomes
• Meta-analyses pooled incompatible populations
2007: The BART trial was halted after showing increased mortality. The drug was withdrawn from the market.
Estimate: 15,000-22,000 excess deaths while methodological concerns were ignored.
The RECOVERY Triumph
The RECOVERY trial used rigorous methods from day one:
• Pragmatic design embedded in routine care
• Adequate sample size (thousands, not dozens)
• Pre-specified endpoints and analysis plan
• Independent data monitoring committee
Result: Dexamethasone proven in 100 days. Estimated 1 million lives saved globally.
The methodologists who designed this protocol are among the most impactful scientists of our era—yet few know their names.
Module 1 Quiz
A methodologist's primary role is to:
Module 1 Quiz (2)
Select the most accurate answer:
Methods are not abstract. Methods are life and death.
Module 2: The Skill Stack
A methodologist is not one thing. A methodologist is many things woven together.
What skills separate the methodologist from the researcher?
Not depth in one area—but bridges between many.
The Four Pillars
1. Statistics & Quantitative Methods
• Regression, survival analysis
• Meta-analysis (pairwise, network)
• Bayesian methods (increasingly)
• Missing data, sensitivity analysis
2. Epidemiology & Study Design
• Bias recognition (selection, information, confounding)
• RCT design and CONSORT
• Observational study limitations
• Causal inference frameworks
3. Domain Expertise
• Clinical knowledge (enough to ask right questions)
• Understanding healthcare systems
• Policy and decision-making context
• Patient-relevant outcomes
4. Communication
• Translating complexity to clarity
• Writing for diverse audiences
• Teaching and mentoring
• Diplomatic disagreement
The Story of DNA's Discovery:
In 1953, discovering DNA's structure required four disciplines:
Rosalind Franklin's X-ray crystallography revealed the helix shape.
Erwin Chargaff's chemistry revealed the base ratios (A matching T, G matching C).
Linus Pauling's model-building approach showed how a structure could be proposed and tested.
Watson and Crick's biological insight assembled the pieces.
No single specialist could have solved it alone. The double helix emerged only when disciplines converged.
A methodologist is the translator between specialists. Not the deepest in any one field, but fluent enough in all.
You have crucial X-ray data on DNA structure. What do you choose?
Statistics: What You Need
Core Competencies (Must Have)
• Regression fundamentals: Linear, logistic, interpretation of coefficients
• Survival analysis: Kaplan-Meier, Cox regression, hazard ratios
• Meta-analysis: Fixed/random effects, heterogeneity (I², τ²), forest plots
• Missing data: MCAR/MAR/MNAR, multiple imputation basics
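The meta-analysis concepts in the "must have" list above fit in a few lines of arithmetic. Here is a minimal sketch of inverse-variance pooling with heterogeneity statistics, using made-up log odds ratios from three hypothetical trials; a real analysis would use a dedicated package such as metafor.

```python
from math import sqrt

def pool(effects, variances):
    """Inverse-variance pooled estimate plus heterogeneity statistics
    (fixed effect, Cochran's Q, I-squared, DerSimonian-Laird tau-squared)."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights add tau-squared to each study's variance
    w_re = [1 / (v + tau2) for v in variances]
    random = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_re = sqrt(1 / sum(w_re))
    return fixed, random, (random - 1.96 * se_re, random + 1.96 * se_re), i2, tau2

# Three hypothetical trials reporting log odds ratios and their variances:
fixed, random, ci, i2, tau2 = pool([-0.6, -0.1, 0.2], [0.04, 0.02, 0.05])
```

With these inputs Q exceeds its degrees of freedom and I² lands near 74%, so a methodologist would favour the random-effects estimate and, more importantly, ask why the trials disagree.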
Growth Areas (Increasingly Valuable)
• Network meta-analysis and indirect comparisons
• Bayesian approaches (priors, credible intervals)
• Machine learning for prediction models
• Causal inference methods (IPW, G-methods)
The Story of John Ioannidis
John Ioannidis trained as an infectious disease physician. He could have spent his career treating patients.
Instead, in 2005, he published "Why Most Published Research Findings Are False"—a statistical argument that most claimed research results are wrong.
The paper has been cited over 15,000 times. It changed how journals, funders, and researchers think about evidence.
He didn't need to be the world's best statistician. He needed to understand statistics well enough to ask the question no one else was asking.
Epidemiology: The Bias Hunter's Toolkit
Selection Bias
Who enters the study ≠ target population
Example: Trials that exclude elderly patients, with results then applied in nursing homes
Information Bias
Measurement error, recall bias, detection bias
Example: Unblinded outcome assessors rating subjective endpoints
Confounding
Third variable creates spurious association
Example: Vitamin D and COVID—sick people stay indoors
The Methodologist's Question
"Which biases could explain this finding—and how much would they need to explain?"
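One way to put a number on "how much would they need to explain" is the E-value of VanderWeele and Ding, a standard sensitivity analysis not covered above: the minimum strength of association an unmeasured confounder would need with both exposure and outcome to fully explain an observed risk ratio.

```python
from math import sqrt

def e_value(rr: float) -> float:
    """E-value (VanderWeele & Ding, 2017): minimum risk-ratio-scale
    association an unmeasured confounder would need with both exposure
    and outcome to fully explain an observed risk ratio."""
    if rr < 1:          # for protective effects, invert first
        rr = 1 / rr
    return rr + sqrt(rr * (rr - 1))

# An observed RR of 1.5 could be explained away only by a confounder
# associated with both exposure and outcome at RR of about 2.37:
print(round(e_value(1.5), 2))  # 2.37
```

An E-value near 1 means trivial confounding could account for the finding; a large E-value means only a very strong confounder could.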
The Story of Coffee and Heart Disease:
For decades, observational studies showed coffee drinkers had more heart disease. Guidelines warned against coffee.
Then methodologists noticed: coffee drinkers also smoked more. When studies adjusted for smoking, the association vanished.
Later studies, with better methods, showed coffee might actually be protective.
The methodologist's job is not to declare truth—it is to ask: "What else could explain this?"
Communication: The Underrated Skill
Plain language in action: "We're not confident in these results—the true effect could be quite different from what studies showed."
Decision Tree: Do You Have the Foundation?
Foundation in place
Study design course needed
Biostatistics course first
Module 2 Quiz
The four pillars of methodologist skills are:
Module 2 Quiz (2)
Select the best methodological question:
Skills are the foundation. Now—where will you build?
Module 3: Career Paths
Many roads lead to methods. Each road shapes what you become.
Where do methodologists work?
The answer may surprise you.
The Six Career Paths
Academia
Universities, research institutes
HTA Agencies
NICE, CADTH, IQWIG, regional authorities
Cochrane/JBI
Evidence synthesis organizations
WHO/Global Health
International organizations, NGOs
Pharmaceutical
Industry evidence teams, CROs
Consulting
Independent or boutique firms
Salary ranges approximate, USD, vary by region and seniority. Source: Industry surveys 2023-2024.
Path 1: Academia
Day-to-day: Teaching, research, supervision, grant writing, committee work
Autonomy: High (you choose your questions)
Stability: Variable (tenure track vs. soft money)
Impact timeline: Long (publications take years)
Who thrives here: Those who love teaching, want research freedom, can tolerate slow pace and academic politics
Watch out for: Publish-or-perish pressure, grant uncertainty, limited practical application
Path 2: HTA Agencies
Day-to-day: Reviewing manufacturer submissions, writing assessment reports, guideline development
Autonomy: Medium (structured processes)
Stability: High (government/quasi-government)
Impact timeline: Immediate (decisions affect coverage)
Who thrives here: Those who want direct policy impact, enjoy structured work, comfortable with accountability
Watch out for: Political pressure, tight deadlines, less research creativity
Path 3: WHO & Global Health
Day-to-day: Guideline development, evidence reviews, technical assistance to countries
Autonomy: Medium (diplomatic constraints)
Stability: Medium (contract-based, but renewable)
Impact timeline: Variable (guidelines can take years; emergency response is immediate)
Who thrives here: Those who want global impact, enjoy cross-cultural work, can navigate bureaucracy
Watch out for: Slow decision-making, political considerations, frequent travel
Path 4: Pharmaceutical Industry
Day-to-day: Trial design, HTA submissions, competitive intelligence, medical affairs support
Autonomy: Medium (business objectives)
Stability: Medium (restructuring common)
Impact timeline: Medium (product lifecycle)
Who thrives here: Those who want high compensation, enjoy fast pace, comfortable with commercial context
Watch out for: Conflicts of interest perception, pressure to "find the right answer," less publication freedom
Decision Tree: Which Path Fits You?
The Story of Three Methodologists:
Doug Altman spent 40 years at Oxford, slowly building CONSORT, PRISMA, and reporting standards—his impact spans decades. David Sackett left academia to practice evidence-based medicine directly with patients in Hamilton—immediate clinical utility. Hans Rosling used his epidemiology training to create Gapminder—reaching millions with data visualization.
Same training, three paths.
Altman the canyon (deep, slow impact), Sackett the mill (daily clinical use), Rosling the delta (wide public reach).
You've developed expertise in evidence synthesis. Choose your path:
The Hybrid Path: Most Common
Years 1-5: Foundation
Graduate degree (MPH, MSc, PhD). First job often academic or junior HTA role.
Years 5-10: Specialization
Develop niche expertise. May move sectors. Build publication record and network.
Years 10-15: Leadership
Senior roles. Often consulting on the side. Invited expert for guidelines.
Years 15+: Influence
Shape the field. Write the textbooks. Train the next generation.
Most senior methodologists have worked in 2-3 sectors. Mobility is an asset.
Module 3 Quiz
Which career path typically offers the highest compensation but requires comfort with commercial context?
The path is chosen. Now—how do you become visible?
Module 4: Building Your Portfolio
In a field of invisible architects, how do you become known?
A methodologist without visibility is a methodologist without impact.
Your work must speak—but first, people must hear it.
The Tale of the Unknown Statistician
In 1996, a statistician named David Moher led a group to create CONSORT—guidelines for reporting randomized trials.
He wasn't famous. He wasn't at Harvard. He was a methodologist at a Canadian research institute.
Today, CONSORT is required by 600+ journals. It's cited 35,000+ times. It's arguably saved more lives than most drugs—by ensuring we know which drugs actually work.
Moher built his reputation not through brilliance alone, but through relentless, visible contribution to a problem everyone faced.
The Visibility Stack
1. Publications
• Methods papers in good journals
• Collaborative applied papers (methods role)
• Commentaries and letters
• Tutorials and how-to guides
2. Tools & Resources
• Software packages (R, Stata)
• Checklists and reporting guidelines
• Teaching materials (open access)
• Calculators and decision aids
3. Presence
• Conference presentations
• Workshops and training
• Social media (X/Twitter, LinkedIn)
• Blog posts and newsletters
4. Network
• Cochrane/Campbell involvement
• Guideline panel memberships
• Journal editorial boards
• Mentorship relationships
The Power of Tools
If you can solve a recurring problem and package it well, your impact multiplies.
Publication Strategy: Quality Over Quantity
The Methods Paper Formula
Problem: Identify a recurring methodological challenge
Solution: Propose a clear, implementable approach
Demonstration: Apply it to real data or simulation
Accessibility: Provide code, tools, or templates
Validation: Show it works better than status quo
Target journals: BMJ, JAMA, Annals of Internal Medicine (methods sections), Statistics in Medicine, Research Synthesis Methods, Journal of Clinical Epidemiology
The Collaboration Multiplier
Georgia Salanti didn't just write about network meta-analysis—she collaborated with clinical teams who needed it.
Each collaboration produced: (1) an applied paper solving a clinical question, (2) methods insights for her next methods paper, (3) a grateful colleague who recommended her to others.
Her network grew exponentially. Now she leads one of the most productive evidence synthesis groups in Europe.
Lesson: Don't wait to be asked. Offer your methods expertise to teams with important clinical questions.
Building Your Online Presence
What Works
• Sharing useful content (tutorials, explainers)
• Engaging with methods debates respectfully
• Celebrating others' work
• Making complex ideas accessible
• Showing your learning journey
What Doesn't
• Self-promotion without value
• Attacking others' methods publicly
• Jargon-heavy posts
• Inconsistent presence
• Ignoring engagement
Best methodologist accounts: @EpiEllie, @statsepi, @CochraneCollab, @GRADEWorkingGrp
The Story of Rosalind Franklin:
Rosalind Franklin's X-ray photograph (Photo 51) was crucial to discovering DNA's structure. But she published cautiously and didn't self-promote.
Watson and Crick saw her data, published first, and won the Nobel Prize. Franklin died at 37, largely uncredited.
Decades later, her notebooks revealed she was weeks from solving the structure herself. Brilliant work, shared too quietly, was overshadowed.
Impact requires not just discovery, but communication.
You've made a crucial discovery. What do you choose?
Module 4 Quiz
The most scalable way for a methodologist to increase impact is:
Visibility without substance is noise. Substance without visibility is silence.
Module 5: The Toolkit
A craftsman is only as good as their tools—and their mastery of them.
What software do methodologists use?
What resources shape their thinking?
Statistical Software
R (Essential)
• metafor: Gold standard for meta-analysis
• netmeta: Network meta-analysis
• robvis: Risk of bias visualization
• tidyverse: Data manipulation
Free, open source, reproducible
Stata (Common)
• metan/admetan: Meta-analysis
• network: NMA
• Strong in survival analysis
• Preferred by some HTA agencies
Commercial license required
RevMan (Cochrane)
• Official Cochrane tool
• User-friendly interface
• Limited customization
• Good for standard reviews
Free for Cochrane reviews
Python (Growing)
• PyMeta: Meta-analysis
• ML integration for screening
• Text mining capabilities
• Automation scripts
Free, good for AI integration
Systematic Review Tools
Covidence
Screening, extraction, collaboration
Rayyan
AI-assisted screening, free tier
ASReview
Active learning screening
EPPI-Reviewer
Full workflow, text mining
DistillerSR
Enterprise, AI features
SysRev
Open source option
Essential Resources
Cochrane Handbook (Higgins et al.; earlier editions by Higgins & Green): The gold standard reference, updated regularly and free online. Read chapters 6-10 closely; your reference for any SR question.
Borenstein et al. "Introduction to Meta-Analysis": Best statistical foundation.
Guyatt et al. JAMA Users' Guides: Clinical epidemiology essentials.
IOM "Finding What Works in Health Care": Standards for systematic reviews.
Read one chapter per week. In two years, you'll have a graduate education in methods.
Key Organizations & Networks
Cochrane
Largest SR producer. Training. Methods groups. Volunteer opportunities.
Campbell Collaboration
Social science SRs. Growing. Less competitive to join.
JBI (Joanna Briggs)
Nursing/allied health focus. Strong training program.
GRADE Working Group
Certainty assessment. Influential. Membership by invitation.
EQUATOR Network
Reporting guidelines hub. Guideline development opportunities.
HTAi
HTA professional society. Annual conference. Interest groups.
The Story of Florence Nightingale's Graphics:
Florence Nightingale had basic statistical training—far less than university mathematicians of her era. But she mastered one tool: the polar area diagram.
Her "coxcomb" charts showed that soldiers died more from infection than battle wounds. Parliament, which ignored tables of numbers, understood her graphics instantly.
She didn't have the best tools. She mastered what she had, and changed military medicine forever.
You have data showing soldiers die from infection, not battle wounds. Parliament must act. What do you choose?
Module 5 Quiz
The most widely used R package for meta-analysis is:
Tools are learned. Now—where will you specialize?
Module 6: Finding Your Niche
The generalist knows something about everything. The specialist knows everything about something. The methodologist knows which questions to ask about both.
You cannot master all methods.
But you can become the world expert in one.
The Major Niches
Network Meta-Analysis
Indirect comparisons, complex interventions
Diagnostic Test Accuracy
Sensitivity/specificity, SROC curves
IPD Meta-Analysis
Individual patient data, advanced stats
Living Systematic Reviews
Continuous updating, automation
Qualitative Synthesis
Meta-ethnography, framework synthesis
Risk of Bias Tools
Tool development, validation
Reporting Guidelines
CONSORT, PRISMA extensions
AI/ML in Evidence Synthesis
Automation, screening, extraction
Deep Dive: Network Meta-Analysis
The problem: Policymakers need to compare 10 drugs. We have trials of A vs B, B vs C, C vs D... but not A vs D directly.
NMA solves: Borrows strength across the network to estimate all comparisons.
The opportunity: HTA agencies require NMA for reimbursement decisions. Pharma needs NMA experts. Few people truly understand it.
The barrier: Requires Bayesian statistics, graph theory, and deep understanding of transitivity. 2-3 year learning curve.
Key names: Salanti, Caldwell, Dias, Welton. Key resource: NICE DSU Technical Support Documents.
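The simplest version of the idea, the Bucher adjusted indirect comparison, fits in a few lines: under the transitivity assumption, A vs C is estimated from A-vs-B and C-vs-B results. All effect sizes and standard errors below are hypothetical.

```python
from math import sqrt

def indirect(d_ab, se_ab, d_cb, se_cb):
    """Bucher adjusted indirect comparison: estimate A vs C from A-vs-B
    and C-vs-B effects (e.g. log odds ratios), assuming transitivity
    across the trial populations."""
    d_ac = d_ab - d_cb                     # both expressed against common comparator B
    se_ac = sqrt(se_ab ** 2 + se_cb ** 2)  # variances add: precision is lost
    return d_ac, (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac)

# Hypothetical: A vs B log-OR -0.40 (SE 0.15); C vs B log-OR -0.10 (SE 0.20)
d, ci = indirect(-0.40, 0.15, -0.10, 0.20)
```

Note the cost: the variances add, so the indirect estimate is less precise than either direct comparison. That is why full NMA borrows strength across the whole network rather than chaining single comparisons.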
Deep Dive: AI/ML in Evidence Synthesis
The problem: Screening 50,000 abstracts by hand takes months. Extracting data from 200 papers is tedious and error-prone.
AI/ML offers: Active learning for screening (50-90% workload reduction). NLP for data extraction. GPT for protocol drafting.
The opportunity: Field is new. Standards don't exist yet. Whoever writes the validation frameworks shapes the field.
The barrier: Requires both ML knowledge AND deep SR methods understanding. Most ML people don't know SR; most SR people don't know ML.
Key names: Marshall, Wallace, Thomas. Key tools: ASReview, RobotReviewer, Trialstreamer.
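The screening-prioritization idea can be illustrated with a toy ranker. Real tools such as ASReview train a classifier and re-rank after every labelling decision; this keyword-overlap version, with invented abstracts, is only a sketch of the principle.

```python
from collections import Counter
from math import sqrt

def vectorize(text):
    """Bag-of-words term counts for one abstract."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def prioritize(unscreened, included):
    """Rank unscreened abstracts by similarity to already-included studies,
    so likely-relevant records surface first (toy stand-in for active learning)."""
    centroid = Counter()
    for text in included:
        centroid.update(vectorize(text))
    return sorted(unscreened, key=lambda t: cosine(vectorize(t), centroid), reverse=True)

included = ["randomized trial of dexamethasone in covid pneumonia"]
pool = [
    "qualitative study of nurse burnout",
    "randomized controlled trial of corticosteroids in covid",
    "cost analysis of hospital parking",
]
ranked = prioritize(pool, included)  # the corticosteroid RCT ranks first
```

In a real active-learning loop, the ranking would be recomputed after each screening decision, and screening would stop once newly surfaced records stop being relevant.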
Decision Tree: Finding Your Niche
The Story of Archie Cochrane:
Archie Cochrane could have remained a general physician. Instead, he spent decades asking one question: "Where is the evidence?"
He championed randomized trials when colleagues trusted clinical intuition. He was mocked, ignored, called obsessive. By the time he died in 1988, his "obsession" had become the foundation of evidence-based medicine.
The Cochrane Collaboration bears his name. Deep expertise in one question changed healthcare forever.
Your colleagues trust clinical intuition over trials. You believe evidence should guide practice. What do you choose?
Module 6 Quiz
Network meta-analysis is particularly valuable when:
The niche is chosen. Now—learn from those who walked before.
Module 7: Real Journeys
Every methodologist was once a beginner who didn't quit.
How did they actually get here?
The paths are rarely straight.
Journey 1: The Clinician Who Asked Why
Gordon Guyatt was a practicing internist in the 1980s. He noticed that senior physicians made confident pronouncements—but their evidence was weak.
The turning point: He coined "Evidence-Based Medicine" in 1991. But he didn't stop there. He realized clinicians needed frameworks to assess evidence quality.
The path: MD → Clinical practice → Frustration with dogma → Methodological training → GRADE working group leadership
Today: GRADE is the global standard for certainty assessment. Used by WHO, Cochrane, 100+ organizations.
Lesson: Clinical frustration can become methodological innovation.
Journey 2: The Statistician Who Saw Chaos
Doug Altman trained as a statistician, not a clinician. But when he looked at medical journals, he saw statistical chaos—wrong tests, misreported results, impossible numbers.
The turning point: His 1994 "Scandal of Poor Medical Research" editorial in BMJ. He could have complained. Instead, he built solutions.
The path: Statistics degree → Medical statistics unit → Frustration with reporting → CONSORT, STROBE, PRISMA, EQUATOR Network
Today: His reporting guidelines changed scientific communication. Cited 500,000+ times across his work.
Lesson: Identify a widespread problem, then systematically solve it.
Journey 3: The Outsider Who Found a Gap
Georgia Salanti was a PhD student in Greece when network meta-analysis was a niche curiosity. Few people understood it. Fewer could teach it.
The turning point: She recognized that NMA would become essential for HTA. She positioned herself at the frontier.
The path: Statistics PhD → Postdoc at Cambridge → Deep NMA focus → Own research group at Bern → CINeMA framework developer
Today: One of the most cited NMA methodologists globally. Consulted by WHO, NICE, pharmaceutical companies.
Lesson: Find the emerging method that everyone will need—and master it before they do.
Journey 4: The Regional Pioneer
In 2010, systematic review methods training in the MENA region was scarce. Most methodologists were trained in Europe or North America.
The pioneers: Clinicians and researchers who sought training abroad, then returned to build local capacity.
The path: Local medical degree → International methods fellowship → Return home → Establish training programs → Become regional experts
Today: Jordan, Lebanon, Iran, Saudi Arabia, and Oman have growing evidence synthesis capacity. Regional Cochrane branches are emerging.
Lesson: Methods expertise is portable. Bringing it home multiplies your impact.
Common Patterns Across Journeys
They All Had...
• A frustration that became a mission
• Willingness to learn continuously
• Collaborators who believed in them
• Persistence through rejection
• A focus that deepened over time
They Didn't Have...
• Perfect pedigrees
• Clear paths from the start
• Instant recognition
• Freedom from setbacks
• All skills from day one
The Story of Ignaz Semmelweis:
In 1847, Ignaz Semmelweis discovered that handwashing could reduce maternal mortality from 18% to 2%. The medical establishment rejected him.
He was fired, mocked, and eventually committed to an asylum, where he died at 47. It took 20 more years for germ theory to vindicate him.
Semmelweis didn't see his impact. But every surgeon who scrubs today follows his path. The river reached the sea—after his death.
You've discovered handwashing saves lives, but colleagues reject it. What do you choose?
Module 7 Quiz
Gordon Guyatt's path to creating GRADE began with:
Your First Job: Where to Start
Best Entry Points
• Research Assistant on systematic review teams
• Junior HTA Analyst at national/regional agencies
• Cochrane/JBI Fellow (competitive but career-defining)
• Postdoc with established methodologist
What Employers Want
• Evidence of SR/MA experience (even one published)
• R or Stata proficiency
• GRADE or risk of bias training certificates
• Strong writing samples
Pro tip: Volunteer as a screener on a Cochrane review. It's free training and a publication credit.
The Reading List: Priority Order
First (Month 1-3):
1. Cochrane Handbook — Chapters 6-10 (free online)
2. Borenstein "Introduction to Meta-Analysis" — Statistical foundation
Then (Month 4-6):
3. Guyatt's JAMA Users' Guides — Clinical epidemiology
4. GRADE Handbook — Certainty assessment (free online)
Specialize (Month 7-12):
5. NICE DSU Technical Support Documents — For NMA
6. Ioannidis papers collection — Meta-research classics
One chapter per week = graduate-level education in 2 years.
Regional Pathway: MENA & Global South
The Challenge: Limited local training programs. Most methodologists trained abroad.
The Opportunity: Growing demand. Saudi FDA, Qatar PHCC, UAE DOH all building HTA capacity. First-mover advantage is real.
Pathway:
• Complete international training (McMaster, Oxford, JBI)
• Return with certification and network
• Partner with regional institutions
• Become the local expert others consult
Regional conferences: ISPE Middle East, Dubai Health Forum, Gulf HTA Network meetings.
Training Programs & Fellowships
McMaster GRADE/EBM
Gold standard. Online + in-person options.
Cochrane Training
Free online modules. Certificate tracks available.
JBI Programs
Comprehensive SR training. Strong in qualitative.
Oxford CEBM
Short courses and MSc options.
Coursera/edX
Epidemiology, Biostatistics foundations.
AHRQ EPC Fellowships
US-based. Competitive but career-defining.
You have seen the journeys. Now—are you ready to begin yours?
Final Assessment
You have journeyed through the world of methods.
The Five Principles of the Methodologist
1. Rigor is not optional—it is the foundation of trust.
2. Invisible work has visible consequences.
3. Depth in one area enables breadth in impact.
4. Tools serve methods; methods serve truth.
5. The methodologist asks: "What else could explain this?"
Final Quiz (1/5)
Doug Altman's CONSORT guidelines have been cited over:
Final Quiz (2/5)
The aprotinin case (15,000-22,000 excess deaths) illustrates:
Final Quiz (3/5)
When evaluating an observational study showing coffee drinkers have more heart disease, a methodologist's first question should be:
Final Quiz (4/5)
The most scalable way for a methodologist to multiply their impact is:
Final Quiz (5/5)
The career path typically offering highest compensation but requiring comfort with commercial context is:
Your 6-Month Action Plan
Commit to your development. Check off each milestone as you complete it:
You have completed the journey.
Go forth and build the foundations others will stand upon.
Remember:
Methods are invisible. Impact is not.
Tools are learned. Judgment is earned.
The path less traveled is less crowded for a reason. Walk it anyway.