Choosing the Right Literature Review: A Doctoral Training Guide

Choosing the appropriate literature review is not a stylistic choice but a methodological commitment that shapes how research is evaluated, reviewed, and interpreted. This post introduces the fourteen review types identified by Grant and Booth and situates them within doctoral training, transdisciplinary research, STEM education, and tabletop role-playing game studies. By clarifying the differences between narrative, systematic, bibliometric, and hybrid approaches, the article offers practical guidance and a decision framework to help researchers select, name, and justify their review strategy with precision.
[Infographic: a decision tree for choosing the right literature review, mapping research intent to review types such as narrative, systematic, bibliometric, and scoping reviews.]

By Cristo León, last reviewed January 3, 2026.

One of the earliest methodological decisions in doctoral research is also one of the most frequently misunderstood: What type of literature review am I conducting?

Doctoral students are often told to “write a comprehensive literature review” without being told what comprehensive means methodologically. As a result, reviews are misaligned with advisor expectations, reviewer standards, and journal criteria.

This guide reframes the problem for doctoral researchers by doing two things:

  1. Translating the 14 review types of Grant and Booth (2009) into doctoral decision logic, and
  2. Providing a decision tree that links research intent to review methodology.

1. Why do doctoral students struggle with review types?

Three structural issues recur in doctoral training:

  1. Terminological overload
    Many labels describe scope (comprehensive, focused), others describe method (systematic, bibliometric), and others describe writing mode (narrative, integrative). These are often conflated.
  2. Implicit expectations
    Advisors and reviewers often assume a shared understanding of review types, but rarely make that understanding explicit.
  3. Methodological overreach
    Doctoral students frequently attempt systematic or bibliometric reviews without the time, tooling, or epistemic need to do so.

The result is methodological drift: reviews that promise one thing and deliver another.

2. Reframing the 14 review types for doctoral work

For doctoral training purposes, the 14 review types can be grouped by research intent, rather than by library science taxonomy.

2.1. Group 1: Reviews that build theory

  • Literature review (traditional / narrative)
  • Critical review
  • State-of-the-art review

These reviews prioritize conceptual synthesis, positioning, and argumentation. They are the most common and most appropriate for theory-driven dissertations and conceptual papers.

2.2. Group 2: Reviews that map a field

  • Overview (including bibliometric reviews)
  • Mapping review / systematic map
  • Scoping review

These reviews describe structure, volume, and distribution of research. They answer “what exists” rather than “what does it mean.”

2.3. Group 3: Reviews that aggregate evidence

  • Systematic review
  • Meta-analysis
  • Qualitative systematic review
  • Mixed studies review
  • Umbrella review

These reviews answer evidentiary questions and require strict protocols, explicit appraisal criteria, and high methodological discipline.

2.4. Group 4: Reviews constrained by time or training context

  • Rapid review
  • Systematic search and review
  • Systematized review

These are pragmatic compromises, common in policy contexts or graduate research, but should be labeled carefully to avoid overclaiming rigor.
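
For readers who keep notes computationally, the grouping above can be captured as a simple data structure. Here is a minimal sketch in Python; the group labels are this guide's shorthand, while the type names follow Grant and Booth (2009):

```python
# A sketch of the grouping above: Grant and Booth's (2009) 14 review types
# organized by research intent, as this guide proposes. The group labels are
# this guide's shorthand, not part of the original typology.
REVIEW_GROUPS = {
    "build theory": [
        "literature review (traditional / narrative)",
        "critical review",
        "state-of-the-art review",
    ],
    "map a field": [
        "overview (including bibliometric reviews)",
        "mapping review / systematic map",
        "scoping review",
    ],
    "aggregate evidence": [
        "systematic review",
        "meta-analysis",
        "qualitative systematic review",
        "mixed studies review",
        "umbrella review",
    ],
    "time- or training-constrained": [
        "rapid review",
        "systematic search and review",
        "systematized review",
    ],
}

# Sanity check: the four groups cover all 14 types.
assert sum(len(types) for types in REVIEW_GROUPS.values()) == 14
```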

3. A doctoral decision tree for selecting a review type

Use the following decision tree before you start writing, not after.

3.1. Step 1: What is the primary purpose of your review?

A. To develop, refine, or position a theory or conceptual framework
→ Go to Step 2A

B. To describe the structure, scope, or evolution of a research field
→ Go to Step 2B

C. To answer an evidence-based question about effects, outcomes, or practices
→ Go to Step 2C

3.2. Step 2A: Theory-driven intent

Ask yourself:

  • Am I synthesizing ideas rather than counting studies?
  • Is my contribution conceptual, interpretive, or argumentative?

If yes:
→ Choose a Narrative Literature Review or Critical Review
Optional refinement:

  • If emphasizing recent developments → State-of-the-art review

Doctoral default:

Comprehensive narrative literature review anchored in foundational works

3.3. Step 2B: Field-mapping intent

Ask yourself:

  • Do I want to show how much research exists, where it clusters, or how it has evolved?
  • Am I more interested in structure than interpretation?

If yes:
→ Choose an Overview, Mapping Review, or Scoping Review

Further distinction:

  • Using citation data, networks, keywords → Bibliometric review (Overview subtype)
  • Clarifying boundaries and concepts → Scoping review

Doctoral caution:
Mapping is descriptive. Do not claim theoretical synthesis unless you actually do it.

3.4. Step 2C: Evidence aggregation intent

Ask yourself:

  • Am I answering a narrowly defined research question?
  • Am I prepared to document databases, search strings, inclusion criteria, and appraisal methods?

If yes:
→ Choose a Systematic Review
Optional extensions:

  • Quantitative effects → Meta-analysis
  • Qualitative evidence → Qualitative systematic review
  • Mixed evidence → Mixed studies review
  • Reviews of reviews → Umbrella review

Doctoral warning:
If you cannot fully execute these protocols, do not label your review as systematic.
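
To make that warning concrete, the sketch below shows the protocol elements Step 2C asks you to document, expressed as a checklist in Python. Every field name and value is hypothetical, illustrating the kind of record a systematic review requires rather than any standard schema:

```python
# A minimal, hypothetical sketch of the protocol record Step 2C asks for.
# Every field name and value below is illustrative, not a standard schema;
# if you cannot fill in each field honestly, reconsider the "systematic" label.
protocol = {
    "research_question": "Do TTRPG-based interventions improve outcome X in population Y?",
    "databases": ["Scopus", "Web of Science", "ERIC"],
    "search_string": '("tabletop role-playing game*" OR TTRPG) AND (intervention OR education)',
    "inclusion_criteria": [
        "peer-reviewed journal article, 2015-2025",
        "empirical study reporting outcomes",
    ],
    "exclusion_criteria": ["opinion pieces", "studies without full text"],
    "appraisal_method": "a published critical-appraisal checklist, applied by two reviewers",
}
```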

4. Where bibliometric reviews belong (and where they do not)

For doctoral researchers, bibliometric reviews should be understood clearly:

  • They map relationships, not meanings.
  • They analyze metadata, not arguments.
  • They belong under Overview-type reviews, not under narrative or critical reviews.

Bibliometric analysis is powerful, but it does not replace conceptual synthesis. Using it does not automatically make a review “more rigorous.”
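
To illustrate what “analyzing metadata, not arguments” means operationally, here is a minimal sketch of keyword co-occurrence counting, one of the simplest bibliometric operations. The records and field names are invented for illustration; a real analysis would export metadata from a database such as Scopus or Web of Science:

```python
from collections import Counter
from itertools import combinations

# A minimal sketch of what "analyzing metadata, not arguments" looks like:
# counting keyword co-occurrence across bibliographic records. The records
# below are invented, not drawn from a real dataset.
records = [
    {"title": "Game-based learning in STEM", "keywords": ["GBL", "STEM", "motivation"]},
    {"title": "TTRPGs and reasoning", "keywords": ["TTRPG", "reasoning", "GBL"]},
    {"title": "Gamified STEM classrooms", "keywords": ["GBL", "STEM", "assessment"]},
]

pair_counts = Counter()
for record in records:
    # Each unordered keyword pair within a record counts as one co-occurrence.
    for pair in combinations(sorted(set(record["keywords"])), 2):
        pair_counts[pair] += 1

print(pair_counts.most_common(3))  # e.g., [(('GBL', 'STEM'), 2), ...]
```

Note what this sketch does and does not deliver: it reveals which topics cluster together, but says nothing about what any of those papers argue.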

5. A practical doctoral rule of thumb

If your dissertation or paper:

  • argues → narrative or critical review
  • maps → overview, scoping, or bibliometric review
  • tests → systematic or meta-analytic review

Misalignment between the verb and the review type is the fastest path to reviewer rejection.
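
The rule of thumb is compact enough to state as code. A minimal sketch, assuming the three verbs above as the only inputs; the mapping mirrors this guide's groupings, not an official taxonomy:

```python
# A minimal sketch of the rule of thumb above: align the dissertation's
# central verb with a review family. Labels follow this guide's groupings.
REVIEW_FAMILY = {
    "argues": "narrative or critical review",
    "maps": "overview, scoping, or bibliometric review",
    "tests": "systematic review or meta-analysis",
}

def suggest_review(central_verb: str) -> str:
    """Return the review family aligned with the dissertation's central verb."""
    return REVIEW_FAMILY.get(central_verb, "unclear intent: revisit Step 1 of the decision tree")

print(suggest_review("maps"))  # -> overview, scoping, or bibliometric review
```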

6. Final takeaway for doctoral researchers

Choosing a review type is not a formatting decision. It is an epistemic commitment.

Name your review type early, justify it explicitly, and write in a way that fulfills the expectations of that choice.

7. Fourteen examples by review type (2020–2026)

7.1. Critical review

Example (STEM, game-based learning): Sailer et al. (2024) use a review-of-meta-analyses approach plus second-order meta-analysis to critically re-interpret technology-enhanced learning effects through the lens of learning activity, illustrating how “critical” work reframes a field’s dominant explanatory assumptions rather than merely summarizing studies.

Example (transdisciplinary research): León et al. (2025) explore three epistemological approaches in preparing a critical or conceptual review that incorporates a PRISMA protocol.

7.2. Literature review (traditional narrative)

Example (TTRPG, education adjacent): Rorem Colquhoun et al. (2026) frame TTRPGs in relation to structurally marginalized youth and synthesize relevant work in a narrative form consistent with a traditional literature review (broad coverage, argument-led synthesis).

7.3. Mapping review / systematic map

Example (transdisciplinary education context): Wang and Kajfez (2025) survey transdisciplinarity in engineering education and explicitly reference mapping-review logic in relation to “wicked problems,” aligning with the “map the territory, identify gaps” function of mapping reviews.

7.4. Meta-analysis

Example (STEM learning, game-based): Tsai and Tsai (2020) provide a meta-analysis of digital game-based learning, illustrating effect-size aggregation logic that can be repurposed for STEM education questions about learning outcomes.
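
To make the effect-size aggregation logic concrete, here is a minimal fixed-effect, inverse-variance pooling sketch in Python. The effect sizes and variances are invented for illustration and are not drawn from Tsai and Tsai (2020):

```python
import math

# A minimal sketch of fixed-effect, inverse-variance pooling — the basic
# effect-size aggregation logic behind meta-analyses. All numbers below
# are invented for illustration.
effects = [0.42, 0.31, 0.55]    # per-study standardized effect sizes (d)
variances = [0.02, 0.04, 0.03]  # per-study sampling variances

weights = [1.0 / v for v in variances]  # w_i = 1 / v_i
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled effect

print(f"pooled d = {pooled:.3f} (SE = {pooled_se:.3f})")  # pooled d ≈ 0.435
```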

7.5. Mixed studies review / mixed methods review

Example (STEM education): Khushk et al. (2023) explicitly describe a convergent mixed-methods review that combines scientometric mapping with narrative synthesis, which is methodologically characteristic of mixed studies reviews.

My own dissertation uses a mixed-methods review (León, 2024).

7.6. Overview (including bibliometric reviews)

Example (STEM-adjacent education, bibliometric): Akhmetova et al. (2025) conduct a cross-database bibliometric analysis and visualization of game-based learning research, exemplifying an overview whose primary contribution is field structure, growth, and clustering rather than interpretive synthesis.

Example (STEM learning): Lipuma and León (2024) provide an overview of the literature around equity and the digital divide.

7.7. Qualitative systematic review

Example (STEM-adjacent, games in math education): Yusri and Zainal (2025) is explicitly framed as a qualitative systematic review on how digital games in mathematics education may support mathematical reasoning, illustrating a systematic search paired with qualitative synthesis.

7.8. Rapid review

Example (transdisciplinary methods): The “How to conduct a transdisciplinary rapid review” working paper provides an applied, time-compressed approach designed for end-user decision-making, which is the defining intent of rapid reviews (Beynon & Straker, 2022).

Example (interdisciplinary methods): Bunn and Dobson (2024), Exploring Researchers’ Perspectives and Experiences of Digital Childhoods Research in Schools.

7.9. Scoping review

Example (TTRPG): Yuliawati et al. (2024) is a scoping review of TTRPGs as a psychological intervention, structured around mapping empirical extent and future directions, consistent with scoping-review goals.

Additional TTRPG scoping exemplar: Garcia-Soriano et al. (2025) map innovative applications of tabletop role-playing games and justify scoping methodology for a fragmented interdisciplinary landscape.

Arenas et al. (2022) perform a scoping review of the literature on RPGs as a therapeutic tool or prevention strategy in psychotherapy and mental health, highlighting study populations, forms of RPG, and interventions used.

7.10. State-of-the-art review

Example (STEM education, emerging tech): Otto et al. (2025) is explicitly labeled a systematic state-of-the-art review of GenAI within active learning in STEM education, exemplifying the “current frontier” orientation.

7.11. Systematic review

Example (tabletop/analog games in learning): Sousa et al. (2023) is a systematic literature review mapping analog tabletop games in learning processes, illustrating protocol-oriented synthesis that is directly relevant to TTRPG-adjacent educational scholarship.

Example (TTRPG): León et al. (2025) is a systematic literature review of the constitutive factors of campaigns in TTRPGs.

7.12. Systematic search and review

Example (STEM-adjacent engineering education): Jamison et al. (2022) is explicitly labeled “a systematic search and review” in engineering education, illustrating the genre where searching is systematic but synthesis may be more flexible than a full systematic review.

7.13. Systematized review

Example (STEM-adjacent engineering/design education): Kotecha et al. (2025) present a conference paper on digital twins in engineering and design education that describes itself as a systematized review, aligning with the graduate-friendly pattern of adopting systematic elements without the full rigor of a complete systematic review.

7.14. Umbrella review

Example (education technology broadly, applicable to STEM): Huang et al. (2026) conduct an umbrella review synthesizing many systematic reviews on AI in K–12 education, illustrating the “review of reviews” logic that can be mirrored for STEM or game-based learning literatures once enough reviews exist.

GBL-specific umbrella chapter: Vallejo-Imbaquingo et al. (2025), a chapter in Learning Technology for Education Challenges, present an umbrella review on game-based learning (GBL) and mobile learning that provides a closer thematic bridge to STEM game-based learning contexts.

Spanish-language example (review-methods contribution): León et al. (2025) discuss the GPS model, which can illustrate systematized-review practice or the design logic behind systematic-review protocols.

8. Sources

Akhmetova, A. I., Seitenova, S. S., Khodjaev, B. K., Jamoldinova, O. R., Yerkebaeva, S. Z., & Kazybayeva, K. U. (2025). Evolution of game-based learning research: A cross-database bibliometric analysis and visualization study (2015-2024). Contemporary Educational Technology, 17(3), ep585. https://doi.org/10.30935/cedtech/16451

Arenas, D. L., Viduani, A., & Araujo, R. B. (2022). Therapeutic Use of Role-Playing Game (RPG) in Mental Health: A Scoping Review. Simulation & Gaming, 53(3), 285–311. https://doi.org/10.1177/10468781211073720

Beynon, A., & Straker, L. (2022, August 3). Topaz Project: How to Conduct a Transdisciplinary Rapid Review. Digital Child. https://digitalchild.org.au/research/publications/working-paper/topaz-project-how-to-conduct-a-transdisciplinary-rapid-review/

Bunn, A., & Dobson, M. (2024). Exploring researchers’ perspectives and experiences of digital childhoods research in schools. Computers and Education Open, 6, 100186. https://doi.org/10.1016/j.caeo.2024.100186

Garcia-Soriano, F., Fabregat-Cabrera, M. E., & Ruiz-Callado, R. (2025). Beyond Play: A Scoping Review of Innovative Applications of Tabletop Role-Playing Games. European Public & Social Innovation Review, 11, 1–28. https://doi.org/10.31637/epsir-2026-2097

Grant, M. J., & Booth, A. (2009). A typology of reviews: An analysis of 14 review types and associated methodologies. Health Information & Libraries Journal, 26(2), 91–108. https://doi.org/10.1111/j.1471-1842.2009.00848.x

Huang, R., Yin, Y., Zhou, N., & Lang, F. (2026). Artificial intelligence in K-12 education: An umbrella review. Computers and Education: Artificial Intelligence, 10, 100519. https://doi.org/10.1016/j.caeai.2025.100519

Jamison, C. S. E., Fuher, J., Wang, A., & Huang-Saad, A. (2022). Experiential Learning Implementation in Undergraduate Engineering Education: A Systematic Search and Review. European Journal of Engineering Education, 47(6), 1356–1379. https://doi.org/10.1080/03043797.2022.2031895

Khushk, A., Zhiying, L., Yi, X., & Zengtian, Z. (2023). Technology Innovation in STEM Education: A Review and Analysis. IJERI: International Journal of Educational Research and Innovation, 19, 29–51. https://doi.org/10.46661/ijeri.7883

Kotecha, M., Dandridge, T., & Reid Smith, T. (2025). Digital Twins in Engineering and Design Education: A Systematized Review. 2025 ASEE Annual Conference & Exposition Proceedings, 56296. https://doi.org/10.18260/1-2--56296

León, C. (2024). Colaboración Interdisciplinaria: Tablero de Control para una Institución Politécnica R01 en los EE. UU. (CLDM_LDO) [Interdisciplinary collaboration: A control dashboard for an R01 polytechnic institution in the U.S.] [PhD thesis, IEU Universidad].

León, C., Arroyo, A., & Lipuma, J. (2025). Playing by Feel: Gender, Emotion, and Social Norms in Overwatch Role Choice. Journal of Systemics, Cybernetics and Informatics, 23(7), 152–163. https://doi.org/10.54808/JSCI.23.07.152

Lipuma, J., & León, C. (2024). Minorities and the AI Revolution: Examining the Literature on Equity and the Digital Divide. Academia Journals Puebla, 16, 4.107–4.115. https://digitalcommons.njit.edu/stemresources/89/

Otto, S., Lavi, R., & Brogaard Bertel, L. (2025). Human-GenAI interaction for active learning in STEM education: State-of-the-art and future directions. Computers & Education, 239, 105444. https://doi.org/10.1016/j.compedu.2025.105444

Rorem Colquhoun, D., Melenberg, M., & Pei, J. (2026). Tabletop Role Playing Games as a Way Forward With Structurally Marginalized Youth: A Narrative Review. Children and Youth Services Review, 181, 108739. https://doi.org/10.1016/j.childyouth.2025.108739

Sailer, M., Maier, R., Berger, S., Kastorff, T., & Stegmann, K. (2024). Learning Activities in Technology-Enhanced Learning: A Systematic Review of Meta-Analyses and Second-Order Meta-Analysis in Higher Education. Learning and Individual Differences, 112, 102446. https://doi.org/10.1016/j.lindif.2024.102446

Sousa, C., Rye, S., Sousa, M., Torres, P. J., Perim, C., Mansuklal, S. A., & Ennami, F. (2023). Playing at the School Table: Systematic Literature Review of Board, Tabletop, and Other Analog Game-Based Learning Approaches. Frontiers in Psychology, 14, 1160591. https://doi.org/10.3389/fpsyg.2023.1160591

Tsai, Y.-L., & Tsai, C.-C. (2020). A Meta-Analysis of Research on Digital Game-Based Science Learning. Journal of Computer Assisted Learning, 36(3), 280–294. https://doi.org/10.1111/jcal.12430

Vallejo-Imbaquingo, R., Villacrés, R., & Torres-Toukoumidis, A. (2025). Game-Based Learning and Mobile Learning: An Umbrella Review on Pedagogical Foundations and Outcomes. In L. Uden & D. Liberona (Eds.), Learning Technology for Education Challenges (Vol. 2551, pp. 196–207). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-98003-9_16

Wang, S., & Kajfez, R. (2025). A Literature Review of Transdisciplinarity in Engineering Education. 2025 ASEE North Central Section (NCS) Annual Conference Proceedings, 54645. https://doi.org/10.18260/1-2--54645

Yuliawati, L., Puspieta-Wardhani, P. A., & Ng, J. H. (2024). A Scoping Review of Tabletop Role-Playing Game (TTRPG) as Psychological Intervention: Potential Benefits and Future Directions. Psychology Research and Behavior Management, 17, 2885–2903. https://doi.org/10.2147/PRBM.S466664

Yusri, A. A., & Zainal, M. Z. (2025). Unleashing Gamification: A Systematic Review in Primary Schools. Journal of Education and Learning (EduLearn), 19(4), 2313–2321. https://doi.org/10.11591/edulearn.v19i4.22009