The Urgency of AI: Democracy and Education in Australia

What I learned from another Churchill Fellow’s journey

Chris Bush is one of my 2024 Churchill buddies. He is the other AI Churchill Fellow from Melbourne in the Class of 2024. Chris and I bonded at the onboarding event last year in Canberra and have become good friends as we navigated our respective projects. He has also become something of an AI mentor to me, getting me started with AI tools and helping to demystify things when I was just beginning.

Something fascinating happened when I read Chris’s Churchill Fellowship report, which has just been published. Chris travelled the world investigating AI in education, focusing on how technology could either close or widen the equity gap in Australian schools. My research took me down a completely different path, exploring how AI-enabled disinformation threatens our elections and democracy.

Yet we ended up with remarkably similar conclusions.

Chris discovered that Australian schools are implementing AI in fragmented, inconsistent ways because our national framework offers only high-level principles without practical guidance. I found a similar problem in the electoral space. Australia lacks a coordinated national response to AI-enabled election disinformation, leaving each state and territory to figure things out independently.

The equity concerns hit home too. Chris’s research shows how AI could either help disadvantaged students catch up or leave them further behind. In my work, I saw the same dynamic in democracy. AI-enabled disinformation doesn’t affect everyone equally—it hits more vulnerable groups hardest, and undermines the equal access to reliable information that democracy requires.

Perhaps most striking was our shared sense that the window for action is closing. Chris warns that “the window for proactive action is narrowing rapidly,” noting that the education sector’s resistance to change risks students graduating with irrelevant skills. I found the same temporal urgency in the electoral space: Australia’s ponderous approach to legislative reform means responses to new threats typically take eight-plus years to develop after we first experience them, so AI-enabled threats visible globally in 2024 won’t be properly addressed here until 2032 at the earliest. Chris emphasises building teacher capacity before AI becomes more embedded in classrooms. I’m pushing for pre-planned rapid response mechanisms before the next election cycle. We both stress the same principle: capabilities must be built proactively, because by the time a crisis becomes visible, it’s already too late to prevent the damage or to start building what you need.

Reading Chris’s report reinforced something important: whether we’re talking about education or democracy, Australia needs coordinated national frameworks, human-centred implementation, and the courage to act now.

Different domains, same lessons. That feels important.
