About half-way through the Masters degree I’ve just finished I began to get interested in how and when I had got so stupid, and not just in a trivial sense that might be explicable by diminished memory function or a fight-or-flight response to the alarmingly patterned trousers that suddenly surrounded me as I went among The Young again. My self-image was that of a reasonably clever person who found studying easy. But writing my dissertation was like repeatedly pressing a switch and not understanding why nothing was happening. Somehow, the appropriate facts were not crashing into each other in the right way, in the way I was pretty sure they had last time I had attempted to do something like this. At one point a tutor told me I was good at spotting the flaws in other people’s arguments. This formed a poignant counterpoint in my violently over-inducting brain to what another tutor said to me about ten years ago, which was that I was good at seeing what really mattered. Interesting category difference there, I think.
So naturally as a fan of bullshit pop psych I first wondered about the whole 10,000 hours thing. Most people in postgrad study are building on the subject they chose when they were 18, which gives them an advantage in both data-set and mindset familiarity. Was it just too much of a stretch to acquire basic mastery of Near Eastern prehistory sufficient to enable me to write meaningfully about it? When I did my first postgraduate work I’d been studying for the previous sixteen years, and the particular subject of my postgrad work for the previous three. That has to make a difference.
And maybe there are other consequences to getting older that are more about changing your mental landscape than depleting it. Maybe I am epistemologically harder on myself these days. I know more in general, and I have a higher standard of what it means to have a sound understanding of something than I used to. Probably the late teens and early twenties are the optimal time for learning big difficult stuff, because you don’t yet comprehend the extent of your own ignorance and would have the crap quite terrified out of you if you did.
But I don’t think any of that fully explains what was going on, and nor did any of the chirpy “Seventy-two reasons why the internet is turning you into a hopeless moron” type posts I turned up on, uh, the internet in search of the answer (although I did come across a link to a finding that men get stupider just by being in a woman’s presence, which has the worrying implication that roughly 50% of the people I am using as a reference point for my own stupidity are actually even smarter than they appear to me).
But never fear. I still have a bullshit pop psych explanation, just a slightly more complicated and respectable one. One of the things I am finally getting around to reading is Daniel Kahneman’s Thinking, Fast and Slow. Kahneman’s system 1 and system 2 concepts are shorthand for, respectively, fast, intuitive, impressionistic thinking and slow, effortful, “rational” thinking. “Slow” system 2 thinking is hard and energy-expensive, which is why people have a natural resistance to it and are prone to over-rely on “fast” system 1 thinking (despite believing a lot of the time that they are using system 2, i.e. making rational judgements and decisions).
System 1 serves important purposes – impressionistic judgements enable accurate forecasting in many scenarios – but it is not good at handling certain types of problem, especially those with statistical and logical components. It is subject to various biases which can cause its conclusions and forecasts to be inaccurate, of which I think my favourite is attribute substitution (answering a different, easier question than the one actually posed, as if it were an answer to the posed question), because it explains about 80% of political commentary. Attribute substitution is built into the way people construct their political views – onlookers as well as politicians. Political problems are vast and complex, information is hard to come by and analyse, and yet people in public life and public house alike are culturally expected to take views on things they could not possibly carry out full system 2 analysis on. Attribute substitution is probably the single most useful mental tool a person commenting on a political problem has access to, if we define “useful” as “helps me avoid admitting that I do not have a solution to this problem and thereby losing status among my peers.”
However, Kahneman also refers to situations where system 1 thinking does provide reasonably accurate forecasts even where the material is logically or statistically complex, simply because some people in some situations have the system 2 knowledge database to support intuitive leaps. His illustrations are chess masters having an instant grasp of the possible moves on a board without having to reason them through, and a physician making an instant diagnosis – both specialists recognise familiar cues in the situation and are able to make leaps to judgement which are reasonably accurate. This is where I think the relevance to academia comes in. In these terms, by the time I got to postgrad level in medieval history, I had done all the slow, logical, effortful system 2 thinking required to fix the basic rules in my head. This meant I was able to do a whole lot of informed system 1 thinking – that is, the frequent employment of low-effort intuitive thinking to make leaps and solve problems. This in turn freed up my capacity to do dogged system 2 thinking that was genuinely meaningful.
This is basically like being on drugs – the satisfying animal hit of “stands to reason” system 1 thinking plus the rational knowledge that the system 2 slogging you’re doing is actually important – which is why I did the Masters in the first place and I guess why anyone sticks with academia at all. Finding thinking uniformly hard was something I had forgotten. I associated academic success with system 1 thinking, because that was the last state in which I had experienced it. I kept waiting for system 1 to kick in, and it didn’t: I didn’t have the database for it, and I was grasping for the intuitive before I had done the basic crunchy bit that makes the intuition work. My then-tutor was identifying my success in system 1 thinking – my now-tutor was describing a process he could observe in my tentative system 2 thinking (picking holes in other people’s arguments is a great way of kicking off the system 2 crunchy bit).
For me the only implication here is “If you do a PhD, be crunchier about it”, but I also think it has interesting implications for academic careers in general. If you’ve done all your basic system 2 thinking in the first years of your career, you are able to take effective shortcuts in problem-solving, and more of your expensive system 2 capacity is freed up for the boundary-pushing work which will move you forward as a researcher. But it has its downsides too, from the point of view of both research and pedagogy. You’re no longer well-equipped to describe to your students – mired as they are in system 2 – what it is you’re really doing. Your shortcuts, once laid down, are less likely to be truly re-examined, which may mean the perpetuation of specialised versions of heuristic biases in your work.
Perhaps this provides another perspective on why intellectual revolutions are as fraught as they are. It is not mere social and professional defensiveness at work when new paradigms are rejected – we can usually detect these kinds of bias. At a higher level of abstraction, a call to embrace a new paradigm is a call to put down the lovely, easy, satisfying system 1 toys and start again from scratch with the unpromising lego bricks of system 2, which is what I have just had to do.