Posted on 2013-04-16, 10:03. Authored by Carl R. May, Tracy Finch, Frances Mair, Luciana Ballini, Christopher Dowrick, Martin Eccles, Linda Gask, Anne E. MacFarlane, Elizabeth Murray, Tim Rapley, Anne Rogers, Shaun Treweek, Paul Wallace, George Anderson, Jo Burns, Ben Heaven
Background: The Normalization Process Model is a theoretical model that helps to explain the processes by which complex interventions become routinely embedded in health care practice. It offers a framework for process evaluation and for comparative studies of complex interventions, focusing on the factors that promote or inhibit the routine embedding of such interventions in health care practice.
Methods: A formal theory structure is used to define the model and its internal causal relations and mechanisms. The model is broken down to show that it is consistent and adequate for generating accurate description, systematic explanation, and rational knowledge claims about the workability and integration of complex interventions.
Results: The model explains the normalization of complex interventions by reference to four factors
demonstrated to promote or inhibit the operationalization and embedding of complex interventions
(interactional workability, relational integration, skill-set workability, and contextual integration).
Conclusion: The model is consistent and adequate. Repeated calls for theoretically sound process evaluations in randomized controlled trials of complex interventions, together with policy-makers' calls for a proper understanding of implementation processes, emphasize the value of conceptual tools such as the Normalization Process Model.