“Real” process tracing: Part 3 — epistemology

Thomas Aston
Dec 28, 2020 · 6 min read


[Slide: Jagosh, 2019]

“Reality can only be grasped indirectly — seen reflected in a mirror, staged in the theatre of the mind” (Susan Sontag, 1979).

Bad boys; good science?

Realistic Evaluation emerged as a critique of established social science methods in the late 1990s, and this inspired something of a pugilistic posture which has lasted for decades. Co-author Nick Tilley, for instance, describes himself and Ray Pawson as the ‘bad boys of methodology.’ As one reads through the recently published book Doing Realist Research, there is an unmistakably jubilant tone of the underdog that somehow succeeded against the odds. Pawson himself refers to Realist Evaluation (RE) as surpassing the 25-year mark of paradigmatic entropy from which other methods apparently suffer. “Realism is here to stay,” he asserts, “because of its scientific credentials” and due to what Popper terms an “evolutionary epistemology” whereby “knowledge itself evolves by natural selection.”

This Darwinian narrative of other methodologies collapsing while Realist Evaluation flourishes is perplexing because, as Westhorp notes, RE should view itself as a member of a “family of theory-based evaluation approaches.” However, Realist Evaluation is different in the sense that it’s chiefly a “school of philosophy” rather than a methodology. It’s “trans-disciplinary,” and in principle it’s method neutral (provided that the methods chosen fit with realist philosophy), though RE does offer quite stringent guidance, with explicit reporting standards.

Supposed “outsiders to the evaluation establishment” challenging established orthodoxies, RE expends considerable effort in defining itself by what it is not. As Westhorp notes, it’s not positivist and it’s not constructivist. It’s neither deductive nor inductive. It’s neither experimental nor interpretive. RE sits somewhere in between all of these, as the slide from Jagosh (2019) above shows.

For realists, according to Westhorp, both the material and social worlds are real, but there’s no such thing as ultimate “truth,” and social systems are open rather than closed. Realist Evaluation also has a particular view of how we should think about context and of how causation works, as I discussed in the first two blogs in this series (context & causation).

Fake news and alternative facts

Realist Evaluation has an unfortunate habit of presenting itself as if it were David fighting Goliath in single combat. For me, this is not only misleading but potentially a hamartia (a hero’s fatal flaw leading to their downfall). In a world of “fake news” and “alternative facts,” few of us would subscribe to a radically positivist view of the social world. Even if claims of objectivity may be an attempt to conceal the values underpinning those claims, this doesn’t mean that those who make such claims actually hold radically positivist views. Nor do they necessarily believe that only a regularity view of causality is legitimate (as Maxwell asserts in his canonical text A Realist Approach for Qualitative Research). On the other hand, few of us would endorse radical forms of constructivism or deconstruction either (as Maxwell, 2012 accepts), even if it makes us sound deep.

Beyond this, it’s clear that a systems view of social interaction has been relatively popular for decades, as has the acknowledgement that political context matters, and theory-based methods have long been accepted. So, this is all pretty mainstream.

Most such realisations happened independently of Realist Evaluation. For instance, in Outcome Mapping’s facilitation manual we see a critique of ‘linear, “cause and effect” thinking [which] contradicts the understanding of development as a complex process that occurs in open systems.’ Outcome Harvesting also refers to itself as being ‘suitable for complex programming contexts where relations of cause and effect are not fully understood.’ So, neither method accepts an acontextual, closed, and certain view of the world. Most “complexity-aware” methods subscribe to a worldview broadly consistent with realist philosophy, and Process Tracing (PT) is no exception.

Neo-positivist and constructivist Process Tracing

Process Tracing theorists Derek Beach and Rasmus Brun Pedersen (2019), for example, self-identify as “neopositivist” or “actualist.” But Alexander George and Andrew Bennett argue that Process Tracing “finds a place also in the constructivist approach.” Causal mechanisms may not (necessarily) be observed, but they can be studied anyway (George and Bennett, 2005). For psychological processes, mechanisms can be assessed indirectly through proxies for their observable implications (Janis, 1982 in Beach and Pedersen, 2019). That is to say, the observable evidence provides only a partial view of the “real phenomena.” In this sense, PT refers to observable evidence of traces left by each part of the mechanism in between cause (e.g. a meteor collision) and outcome (e.g. extinction). Maxwell’s (2012) explanation of a “process or realist approach to causality,” which cites George and Bennett (2005) and Falleti and Lynch (2009), nearly gets us there, but realist evaluators appear not to have made the link.

A recent research programme at the University of Manchester on the politics of social protection in Eastern and Southern Africa drives home a constructivist epistemological message. Tom Lavers and Sam Hickey (2015: 24), who led the programme, note that ‘all data constitute imperfect representations of social reality.’ They cite critical realist Colin Hay as justification:

‘[Political] actors “rely on perceptions of … [a] context that are, at best, incomplete and that might often prove to have been inaccurate after the event” (Hay, 2011: 67). As such, interests, rather than being defined by the material context in which actors find themselves, are “irredeemably ideational, reflecting a normative … orientation toward the context in which they will have to be realized” (Hay, in Lavers and Hickey, 2015: 12).’

Taken together, then, Process Tracing is clearly reconcilable with a realist position: there is a real world that exists independently of our beliefs and constructions, but our knowledge of that world is inevitably our own construction (see Maxwell, 2012).

Truthiness and causal explanations

Another apparent misinterpretation of PT is that it can reach ultimate “truth.” Beach and Pedersen (2016) and Befani and Stedman-Bryce (2016) both argue that we will never be 100% certain of our conclusions. Even if we may reach conclusions “beyond reasonable doubt,” in practice evidence is irredeemably fallible. In the book that defined PT’s evidence tests, Van Evera (1997) refers to a bank security camera recording the faces of bank robbers, proving some suspects guilty and others innocent. He calls this “doubly decisive” evidence. However, even security camera evidence can be faulty, especially with the recent rise of deepfake technology. So, in my view, there is always room for some doubt, as I explain in an earlier blog on Process Tracing in a murder case.
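To see why, consider the Bayesian updating logic that Befani and Stedman-Bryce (2016) apply to such evidence tests. The sketch below is illustrative only: the prior, sensitivity, and false positive values are hypothetical, chosen simply to show that even a near-perfect “doubly decisive” test leaves the posterior probability short of 1.0.

```python
def posterior(prior: float, sensitivity: float, false_positive: float) -> float:
    """Bayes' rule: probability the hypothesis is true, given the evidence was found.

    sensitivity    = P(evidence | hypothesis true)
    false_positive = P(evidence | hypothesis false)
    """
    return (sensitivity * prior) / (
        sensitivity * prior + false_positive * (1 - prior)
    )


# Hypothetical values for a "doubly decisive" test, in the spirit of Van Evera's
# security-camera example: very likely to fire if the suspect is guilty, very
# unlikely to fire otherwise -- but never exactly 1.0 or 0.0, because cameras
# (and footage) can be faulty or faked.
print(posterior(prior=0.5, sensitivity=0.95, false_positive=0.02))  # ~0.979, not 1.0
```

Stacking further (fallible) pieces of evidence pushes the posterior higher, but as long as each test has some non-zero chance of misleading us, confidence never quite reaches certainty.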

One final question I was asked by a realist evaluator was whether PT is causal rather than merely descriptive. What they meant was whether Process Tracing simply describes a narrative or explains why behaviour changes. And yet, if only a minority of realist evaluations published in peer-reviewed journals subscribe to the deeper definition of mechanisms, then surely RE also has to answer the causal vs. descriptive question?

For Process Tracing, Beach and Pedersen wrote a whole book on Causal Case Study Methods: Foundations and Guidelines for Comparing, Matching, and Tracing. They argue that mechanisms are about ‘explaining why something occurred by analysing the actual causal process whereby an outcome was produced’ (Beach and Pedersen, 2016: 31). They critique other PT guidance for offering only what they call “minimalist mechanisms” (George and Bennett, 2005; Collier, 2011; Bennett and Checkel, 2014). These are Causal Process Observations (CPOs) or “diagnostic evidence” assumed to be linked to empirical fingerprints. “Plausibility probes” are employed to test plausibility rather than to “open up the black box” entirely. Somewhat unfairly, Beach and Pedersen (2016) refer to these as “grey boxes” or mere “congruence methods” rather than Process Tracing because, in their view, they do not explicitly unpack mechanisms.

Whatever the shade of PT, it seems clear that it’s important to go beyond the description of a chronological sequence of events. In my own experience, I’ve found that you often require more than material evidence; you almost always rely on some testimony. And both require significant interpretation. Indeed, when you interview people, they can provide you with either thin or thick information. If they don’t provide much detail or sense of their motivation, their testimony is useful for little more than basic plausibility. However, if they provide you with information on specific details about key events, the links between events, and about their motivations and the motivations of other actors, just as in RE, you can go far beyond mere description.

In the next blog, I will compare the approaches Realist Evaluation and Process Tracing take to sampling.


Written by Thomas Aston

I'm an independent consultant specialising in theory-based and participatory evaluation methods.
