Conflict Theory: A Leftist Mistake Theorist's Steelman
That title’s weaselwordy. What exactly do you mean?
First, you should probably read this. There, Scott Alexander provides a bunch of examples to help you understand a mental model that you can use to understand how people approach political issues. That model is patterned off of a reddit post, which he links. He summarizes the two groups like this:
Mistake theorists treat politics as science, engineering, or medicine. The State is diseased. We’re all doctors, standing around arguing over the best diagnosis and cure. Some of us have good ideas, others have bad ideas that wouldn’t help, or that would cause too many side effects.
Conflict theorists treat politics as war. Different blocs with different interests are forever fighting to determine whether the State exists to enrich the Elites or to help the People.
This is the main idea I wish to try to address. I’m not exactly going to go point-by-point down the list of his examples, but I’d encourage you to read them anyway.
Why would a reasonable leftist—maybe even a mistake theorist—act like a conflict theorist? I’m specifically focusing on a leftist perspective because it’s a perspective I can earnestly stand behind. I’m not interested in trying to directly justify the positions of neo-Nazis, nor anyone else who flirts too much with the authoritarian right, which are the main focal points when we’re talking about the conflict-right. My main goal here is to criticize the idea that this conflict/mistake divide is useful. I think that there’s some small kernel of truth or usefulness if you read deeply into it, but I think it ends up being too reductive to describe plenty of reasonable people. Specifically, after a bit of a preamble, I’m going to claim that you can construct a view of mistake theory that implies a view of conflict theory, given a few extra assumptions, which I’ll try to substantiate as much as I can.
I understand that Scott Alexander doesn’t really stand by this post as a fair characterization of the people he’s discussing, and this isn’t really supposed to be a direct attack on the guy, at least not about that post or in this one. (I met him at one of the meetups once! He was nice!) I have not read a large corpus of arguments against this model, so my comments here are more-or-less mine alone, and they fall into this general interest I have in contextualizing the actions and rhetoric of leftists to people who aren’t lifelong leftists.
A lot of interesting commentary has been generated about contextualizing the behavior of the far-right to a leftist audience. I think there’s only been a mediumish amount of effort in contextualizing some of the behaviors and rhetoric of the left to people who are more-or-less like me in most ways, but lean right. I can and will happily gesture towards The Alt-Right Playbook series as a good way to help understand leftists’ thought about the rhetoric of rightists on the ’Net, but it is generally addressed to leftists, and there are plenty of rightists out there who could reasonably react to large amounts of that series with “he’s not really talking about me at all right now.” I think it’s especially important to note these two things as we move forward:
I will talk about the use of rhetoric and cultural criticism on the right, and I’m not going to mince words about what parts of it I think are bad. I’m not saying that the only explanation for why anyone would hold rightist beliefs would stem from misinformation they heard in the media, but I am saying that this media is culturally influential.
The way “the left” approaches “the right” isn’t particularly good in my eyes, and I have no interest in pretending it is. I am speaking as a leftist, but not on behalf of leftism.
Ok, let’s get back to the background. We’ve got some ground I think we should cover before I bring up the main point.
Erode our ability to communicate and you erode the “free marketplace of ideas”
I’ve talked to a lot of people who seem bitterly offended by the concept of toxic masculinity, and none of them did a particularly good job of convincing me that they really knew what it meant. Given how the tactics of social commentators on the right who try to appeal to general audiences work, and given that I don’t think listening to rightist cultural commentary constitutes some sort of moral failure, I honestly do not feel like I can blame them for this. I can apply very similar reasoning to other terms, like “white fragility” or “non-binary gender” or even “feminism.” (I’m mentioning feminism-related topics right now merely because they’re especially hot-button: people tend to get more worked up about statements like ‘white fragility is a real phenomenon’ than ‘Fox News whitewashed this wartime atrocity,’ at least in my experience on the Internet. If you ask me, the latter is what we should all be fixating on more.)
State-of-the-art right-wing commentary is not especially focused on discussing what any of these things mean. They’re more interested in describing their worldviews and how they perceive that these things will affect their social bottom lines. This isn’t a bad thing either, so long as you make an effort to ensure that your audience continues to at least understand where your opposition is coming from.
General-audience right-wing media doesn’t consistently do this. Many of the best-known right-wing media outlets that make a show of describing their opponents also have an unfortunate habit of constructing strawmen and/or playing fast-and-loose with the truth. (I will simply gesture in the direction of PragerU and hold off on further comment on this for the moment.)
When it comes to lower-effort partisan media, well. I find it very difficult to imagine you’d happen into the rosiest view of feminism when the fairest representation you see from your friends or family looks more like something from PragerU than the speech of actual feminists, and the meat of what you see casually shared around online (and we’re not even going to touch the chans here quite yet) is supercuts (usually quotemined, may I add) of prickly-demeanored women yelling at people on the street. You’re not really getting a meaningful slice of what a typical leftist actually thinks.
You’re less likely to push back on a conservative take about, say, traditional family values if you already have a reflexive distrust towards feminism. Keeping you under- or misinformed about the actual beliefs of leftists will push you further to the right. Making it harder for you to have an actual conversation with people who disagree with you will keep you further to the right. Keeping you on the right means keeping you as an audience member. Keeping an audience means keeping the money they’re paid for your attention. Commentators on the right know this, fringe groups know this, internet trolls know this, and (regrettably) media-savvy leftists have had to catch on too.
To complicate all this, terms like “toxic masculinity” and “white fragility” have highly inflammatory-sounding names. (It’s a shame that those of us who care about extra-academic ethos didn’t pick the names, huh?) This is an image issue for the left. A typical feminist discussing toxic masculinity is not talking about it out of a deep hatred of you and your fellow men, and a typical intersectional feminist talking about white fragility doesn’t give a shit about the color of your skin. They’re usually talking about how a bunch of old, conservative social norms hurt you and everyone else around you. It is often deeply frustrating to discuss these topics with people on the right, and that isn’t because you need to be a leftist for these concepts to be platonically compatible with your worldview.
And then, of course, we have radicalization pipelines. This is where I should point to Ian Danskin’s series The Alt-Right Playbook again. If you’re not a fan of the video essay format, note that there’s a link to the transcript in the description of each of those videos. We’ll come back to this one later, and this one is specifically relevant to radicalization.
Alexander further clusters mistake theory and conflict theory into “easy” and “hard” variants:
Consider a further distinction between easy and hard mistake theorists. Easy mistake theorists think that all our problems come from very stupid people making very simple mistakes; dumb people deny the evidence about global warming; smart people don’t. Hard mistake theorists think that the questions involved are really complicated and require more evidence than we’ve been able to collect so far – the weird morass of conflicting minimum wage studies is a good example here. Obviously some questions are easier than others, but the disposition to view questions as hard or easy in general seems to separate into different people and schools of thought.
(Maybe there’s a further distinction between easy and hard conflict theorists. Easy conflict theorists think that all our problems come from cartoon-villain caricatures wanting very evil things; bad people want to kill brown people and steal their oil, good people want world peace and tolerance. Hard conflict theorists think that our problems come from clashes between differing but comprehensible worldviews – for example, people who want to lift people out of poverty through spreading modern efficient egalitarian industrial civilization, versus people who want to preserve traditional cultures with all their thorns and prickles. Obviously some moral conflicts are more black-and-white than others, but again, some people seem more inclined than others to use one of these models.)
I’m mostly interested in the hard-theory sides of those spectrums. A hard-mistake theorist will occasionally say that we need to perform some intervention that seems consistent with easy-mistake theory. Maybe sometimes people really are just making very stupid mistakes, and the solution is to directly deal with that. The same applies to hard-conflict theory. I totally am that person who says it’s a mistake to treat Nazis with too much charity, by the way, and it’s not because I think they’re Disney villains.
I’d say hard-mistake and hard-conflict are very similar once you assume the title of this section. We’ve got a couple more assumptions to take note of before we can tie that knot.
Different people have very different epistemologies
I want to take this opportunity to state that I’m not taking hard-conflict theory to mean “people’s worldviews are more-or-less immutable to well-reasoned argument.” My thoughts there are much more complicated, and they honestly deserve their own essay. But in brief, having one-on-one, private discussions with acquaintances, friends, or family with beliefs you’re worried about can often be helpful. So can public discussion and social media use, when you know what you’re talking about and use evidence-based strategies to do so. (Here’s a guide on that.) A good technique for combatting extremism would probably include a “vaccine”: some sort of tool that could allow someone to be more safely exposed to some virulent ideas. (Those-in-the-know, like social scholars and historians, really don’t seem to expect that a solution to extremism exists. Dealing with extremism takes constant, evolving effort from lots of people.)
danah boyd has written a piece that I think is highly relevant to this discussion. I don’t think I can do her piece justice with a short summary, so I’m going to strongly recommend you watch or read this one: her presentation at SXSW Edu 2018 and the written piece from which it was derived. In it, she reacts to the current push to get educators to teach their students about media literacy: she thinks that the designs she sees may not produce the results—in my words, a vaccine against antiprogressivism—their proponents are expecting. Furthermore, she argues that pushing that progressive agenda is a bad idea to begin with:
Most media literacy proponents tell me that media literacy doesn’t exist in schools. And it’s true that the ideal version that they’re aiming for definitely doesn’t. But I spent a decade in and out of all sorts of schools in the US, where I quickly learned that a perverted version of media literacy does already exist. Students are asked to distinguish between CNN and Fox. Or to identify bias in a news story. When tech is involved, it often comes in the form of “don’t trust Wikipedia; use Google.” We might collectively dismiss these practices as not-media-literacy, but these activities are often couched in those terms.
I’m painfully aware of this, in part because media literacy is regularly proposed as the “solution” to the so-called “fake news” problem. I hear this from funders and journalists, social media companies and elected officials. My colleagues Monica Bulger and Patrick Davison just released a report on media literacy in light of “fake news” given the gaps in current conversations. I don’t know what version of media literacy they’re imagining but I’m pretty certain it’s not the CNN vs Fox News version. Yet, when I drill in, they often argue for the need to combat propaganda, to get students to ask where the money is coming from, to ask who is writing the stories for what purposes, to know how to fact-check, etcetera. And when I push them further, I often hear decidedly liberal narratives. They talk about the Mercers or about InfoWars or about the Russians. They mock “alternative facts.” While I identify as a progressive, I am deeply concerned by how people understand these different conservative phenomena and what they see media literacy as solving.
The core of her argument stems from how students from different backgrounds will naturally hold different epistemologies, due to the usual network effects stemming from how people—consciously or not—tend to prefer to live near or interact with people who are politically, culturally, or racially similar to them. (How exactly we choose to deal with that is a matter of perspective, but I think a good socially-liberal take on that is this. Just be aware that other social effects like gentrification and the presence of extremists can make this all quite a bit more complicated in practice.) The sorts of media literacy education she’s talking about don’t do anything to account for these epistemological differences between people, or worse, they just try to unsubtly assert that their epistemology is better.
Those whose worldview is rooted in religious faith, particularly Abrahamic religions, draw on different types of information to construct knowledge. Resolving scientific knowledge and faith-based knowledge has never been easy; this tension has countless political and social ramifications. As a result, American society has long danced around this yawning gulf and tried to find solutions that can appease everyone. But you can’t resolve fundamental epistemological differences through compromise.
No matter what worldview or way of knowing someone holds dear, they always believe that they are engaging in critical thinking when developing a sense of what is right and wrong, true and false, honest and deceptive. But much of what they conclude may be more rooted in their way of knowing than any specific source of information.
If we’re not careful, “media literacy” and “critical thinking” will simply be deployed as an assertion of authority over epistemology.
I’d describe this second paragraph, itself in part an observation borrowed from Cory Doctorow, as a near-perfect nutshelled description of why culture-warring occurs. You might say “well, that’s just because she stated the core idea of conflict theory.” You’re not that far off; this might be the biggest assumption I’m going to make today. The trick here is that personal epistemology isn’t quite the same as personal worldview. Starting with an epistemology typical of, say, the post-Enlightenment, you can derive a ton of conflicting worldviews. Same with Scott Alexander’s own post-LessWrong rationalism: his readers have a ludicrously diverse set of worldviews, even if their working epistemologies are similar.
Propagandists already know how to make our communication and epistemic divides worse.
Here’s where that Alt-Right Playbook episode I mentioned earlier comes in. Postmodern conservatism—the formulation popularized by David Roberts, not Quillette’s, though that one may be worth a read—is less about a Quillette-PoMoCo-style rejection of rationalism (in the 1960s sense) and more about arguing from a sort of provisionally-held pseudobelief, because claiming to hold whichever belief makes your opponent wrong makes it easier to dominate an argument (especially a public one).
As Danskin clarifies, this isn’t necessarily capital-T True, but it may occasionally be a useful way to model the people you’re talking with. Sometimes, it’s a really, really good model. (If you can’t stand HuffPost, just bear in mind that you can read the full document at the bottom of the page.) I can’t read that article and honestly think “clearly, they’re just making a mistake tied to simple/complex social issues.” I think the only honest read you can make of an article like that while making all of the assumptions in these section titles is “this is how you radicalize people through use of performative irony.”
And, I’ll add, I don’t think the correct model to use for Daily Stormer contributors is “basically Cruella de Vil.” I’m totally willing to call their worldviews grossly self-consistent and stemming from completely comprehensible ontologies and epistemologies; that’s part of why they’re so dangerous. (Talking about why is probably a conversation for another time.)
Does all of this mean that I think this divide is completely useless? Is this a rock-hard fundamental rebuttal of this conflict/mistake divide? Am I done linking you lengthy things to watch/read?
I’d say that Conflict vs. Mistake should be called what it is: a weak heuristic or intuition pump. Furthermore, I’d say it’s more useful to use it to loosely cluster arguments, not arguers.
My personal experience is that if I’m talking with someone who proudly calls themself a mistake theorist (which I think the SSC piece is gently suggesting is what a typical reader should call themself), they’ll occasionally make a conflicty argument. I suspect that many of them read Scott’s description of mistake theory as “it’s worthwhile to assume that people’s beliefs can be changed by careful, well-reasoned argument over time, and these sneaky conflict theorists think they can’t.” If you think that way to some degree, consider trying to disentangle the two angles in your head, especially if you think that might be affecting how you’re responding to someone else’s arguments. I hope that one of your takeaways from this post is “it’s possible that they don’t think ‘as a rule, people can’t be convinced by fair argument,’ but they do think that fair argument alone isn’t likely to help at the scales of some societal issues.”
I want to give a quick shoutout to TheSkeward, who read over some of the working document that became this post. I didn’t find satisfying places to work these SSC posts that he suggested into the post, but I think they’re interesting enough to tuck links to them down here: In Favor Of Niceness, Community, and Civilization, The Toxoplasma Of Rage. If I find that Scott Alexander mentions this post publicly (no pressure), this text will be a link to that reply.
- 2020-03-07: Made some minor wording edits.
- 2021-10-13: Significantly reworked the beginning of the section “Different people have very different epistemologies.”
I invoked the story of Daryl Davis as an example of changing people’s minds through deliberate, sustained conversation. Davis certainly still has his collection of Klan robes, and that does make a pretty good story, but I’m personally questioning a lot of his methods at this point, especially how he’s doing his talking-to-extremists schtick a bit too publicly for my nerves. I don’t think his story is especially appropriate for this essay: I’ve begun to doubt his methods, his example is pointlessly extreme relative to the extremists most people might feel an urge to talk to, and I’d hate to uncritically point to him as an example of “doing things right.” I’d have to do a lot of new research (combing through Davis’ massive number of interviews, and watching his documentary, for starters) to do that “doing things right” idea much justice, and it’s not necessarily the sort of thing I feel I should write about.
I think I got a bit too in-the-weeds talking about “solutions” to extremism, and probably was too aggressively optimistic about the idea. I’ve significantly toned down a lot of my wording. Since I still have some bugbears with a piece I wrote so long ago, this might be an indication that I should revisit a bunch of these ideas in the newer voice I’ve been writing with. Food for thought?