It was a Tuesday afternoon, about 3:30pm. After weeks of preparation, months of discussions with SLT, and careful coordination with Heads of Year, we were finally ready to launch a new round of parent feedback sheets. Templates were in place, links had been generated, and the school secretary pressed send, pinging personalised links to every single parent.
Within three minutes, thirty emails arrived. Parents couldn’t access the links.
At first, I thought this was just the usual teething issues — parents not logged into their school accounts, or trying to open things on unsupported devices. But very quickly, it became clear: this was bigger. Something fundamental had gone wrong.
That sinking feeling will be familiar to anyone who’s led a large rollout. The realisation that you’ve missed something small but critical, and that the mistake has now gone out to the whole community. Embarrassment, frustration, self-recrimination — because you know that if you’d just taken that one extra step, none of it would have happened.
In my case, it was a misunderstanding about how Google Drive sharing interacted with Guardian summaries in Google Classroom. The details matter less than the lesson: this was an example of poor crew resource management.
Why I Read About Plane Crashes
One of my odd hobbies is reading about aviation accidents. If you’ve never come across Kyra Dempsey’s writing on Admiral Cloudberg, I highly recommend it. Her analysis of why planes crash is outstanding, and the parallels for education technology are striking.
Aviation is one of the safest industries in the world, precisely because it assumes that human mistakes will always happen. The Swiss Cheese model says accidents only occur when multiple holes line up across different layers of defence. In other words: it’s never just one mistake, it’s a chain of them.
The aviation industry’s response was Crew Resource Management (CRM) — training pilots and copilots to continuously check each other, question assumptions, and speak up, regardless of hierarchy. The key lesson: telling people to “just be more careful” doesn’t work. Humans will always make mistakes. The only effective safeguard is building systems that catch them before they line up.
Two Lessons for Schools
There are two big takeaways I think schools can borrow from CRM:
1. Recognise authority gradients.
An authority gradient exists whenever one person has more status than another. Authority gradients are inescapable, and they need to exist: without someone having the final say, it becomes impossible to achieve anything. However, if an authority gradient becomes too steep, it can prevent people from speaking up and voicing doubts.
Authority gradients don’t only arise from organisational hierarchy, as when a classroom teacher defers to a Deputy Head despite their misgivings; they also arise from perceived expertise, which produces “the tech lead must know best.”
Both extremes are risky:
- Junior colleagues may end up being forced to make unilateral ‘technical’ decisions, without the experience and wider organisational understanding to appreciate second- and third-order consequences.
- Senior leaders may push ahead with ideas that others know won’t work, because no one feels able to challenge them.
In this case, the member of staff asking us to hit publish felt that I was more technically skilled than they were, and so placed undue trust in me having thought through all the possible problems. He knows what he’s doing – it will be fine. We all have times when we defer to our colleagues, but it’s important never to assume they won’t make a human mistake. Double-checking someone’s work, especially when the stakes are high, isn’t demeaning or a sign of distrust – it’s ensuring that multiple sets of eyes have verified that nothing is missed.
The takeaway? If you’re the one with authority, actively lower the gradient. Ask: “I might be wrong here — what do you think?” Create explicit space for people to challenge. If you’re on the other side, be aware that silence can lead to system-level mistakes.
2. Guard against ‘get-there-itis’.
In aviation, this phrase describes the dangerous urge to push on and land, even when conditions aren’t safe. In schools, it’s the pressure to get something out the door because of deadlines, expectations, or personal circumstances.
In my case, it was the end of the day, my daughter was sick in hospital, and I knew I wouldn’t be in school the next day. The urgency to finish overrode my usual caution. I never double-checked the sharing permissions — something that would have been obvious if I’d just stopped and tested it with three staff parents first.
Get-there-itis narrows your focus to the task at hand and blinds you to the bigger picture. The antidote is to deliberately pause:
- Stop and ask, “Have we actually thought this through?”
- Force a mini-simulation: send to three people, check it works, then scale up.
- Encourage colleagues to call out when the team is rushing.
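The mini-simulation step can even be baked into the tooling itself. Here is a minimal sketch of a pilot-first rollout: send to a small pilot group, verify every link actually works, and only then scale up to everyone. The `send` and `verify` callbacks are hypothetical stand-ins for whatever your mail-merge or Classroom integration provides; this is an illustration of the pattern, not a real Google API.

```python
def staged_rollout(recipients, send, verify, pilot_size=3):
    """Send to a small pilot group first; scale up only if every pilot link verifies.

    `send` and `verify` are caller-supplied callbacks (hypothetical here):
    send(recipient) dispatches the personalised link, verify(recipient)
    returns True if that recipient can actually open it.
    """
    pilot, rest = recipients[:pilot_size], recipients[pilot_size:]

    for r in pilot:
        send(r)

    failures = [r for r in pilot if not verify(r)]
    if failures:
        # A permissions slip surfaces here, with 3 people instead of 300.
        return {"sent": pilot, "failed": failures, "aborted": True}

    for r in rest:
        send(r)
    return {"sent": recipients, "failed": [], "aborted": False}
```

The design choice is the same as aviation’s: don’t rely on the person hitting publish to remember the check, make the system refuse to scale up until the check has passed.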
The Takeaway
Technology rollouts in schools will always involve risk. Mistakes will happen. What matters is how we build systems and cultures that catch them before they reach parents and students.
For me, the lesson was clear:
- Flatten the authority gradient — create space for challenge, regardless of role.
- Beware of get-there-itis — pressure is real, but rushing is where errors compound.
If aviation can learn from its crashes to make flying safer, schools can do the same with their own failures. The key is not to avoid mistakes altogether, but to build resilient systems that stop small slips from becoming big ones.