Three Years in the Frontline
Three years. That's how long it's been since I last wrote here. My last post, AI in the Education Frontline, landed in February 2023. ChatGPT was barely two months old. I was running staff workshops, teaching a Year 9 AI course, and my daughter had just used ChatGPT to figure out box plots in maths class. I signed off saying I'd investigate Bing AI (now Microsoft Copilot) and report back.
I didn't report back. Not because I lost interest; quite the opposite. I fell so far down the rabbit hole that writing about AI gave way to building with AI. My ADHD brain grabbed hold and didn't let go. Three years of late nights, experiments, breakthroughs, dead ends, and more learning than I've done in any comparable stretch of my career.
So. Here's my field report as of February 2026...

The World Moved On
Cast your mind back to early 2023. The dominant reaction to AI was fear. In education, it was "Students will cheat." In workplaces, it was "It'll take our jobs." In creative fields, it was "It'll replace us." Schools were banning ChatGPT. The media ran wall-to-wall panic pieces. I pushed back on that at the time — deficit thinking, I called it — and I stand by that now.
Three years later, the conversation has shifted dramatically. We've moved from "should we allow this?" to "how do we use this well?" That's genuine progress. AI isn't a novelty any more. It's infrastructure. My students don't gasp when I mention AI — they ask which tool to use and the best way to use it. And that shift isn't limited to classrooms. Anyone learning a new skill, building a side project, or trying to make sense of complex information is navigating the same question.
The default is now what AI enables, not what could go wrong.
Don't get me wrong — the "unreliable narrator" concept I wrote about in 2023 still absolutely applies. AI still hallucinates. It still needs checking, but nowhere near as much as it did in 2023. The key shift is that healthy scepticism now sits alongside the benefits rather than overshadowing them.
What I've Been Building
Here's where it gets practical. Over the past year especially, I've been building learning tools — real, usable things, not prototypes gathering dust. My day job is teaching, so my examples come from the classroom. But the principles — and the tools — apply to anyone creating content for learners, whether that's a school class, an online course, a training programme, or your own self-directed study.
If you want more details on how any of these were built, let me know and I'll write dedicated posts.
Psychology NotebookLM Bot
2025 saw Google finally join the party, allowing schools to use Gemini and other AI tools without that data being used for model training. Up to that point we'd been using Microsoft Copilot and ChatGPT (with certain restrictions enabled), along with Claude. Once we got access to NotebookLM, I could create notebooks of specific content that students could then question to support their learning. This proved very useful for teaching students how to read a paper, as well as for preparing for internals and exams. I pay for Gemini, so I could share each notebook as chat-only rather than giving away the raw content.
The broader point here: NotebookLM lets you create an AI that only knows what you feed it. No internet hallucinations, no off-topic tangents. If you're a course creator, a trainer, or someone building learning resources, that's powerful. Your content, your sources, your quality control.

Psychology Revision Site
I built a complete NCEA Level 2 and Level 3 Psychology revision platform. It started simple — topic summaries for each of the six approaches (Biological, Behavioural, Cognitive, Humanistic, Psychodynamic, Sociocultural). Then it grew. And grew.
It now includes:
- Flashcards with spaced repetition — a simplified SM-2 algorithm that schedules reviews based on how well you know each card. Hard, Good, or Easy buttons after each card. It actually works.
- Multiple quiz types — multiple choice, true/false, pre-test mode (based on the pretesting effect research), and a "Compare Approaches" quiz where learners apply all six approaches to real scenarios like exam anxiety or addiction.
- Traffic light self-assessment — red, amber, green buttons on each topic. Simple, but learners love the visual feedback.
- A progress dashboard — study streaks, mastery breakdowns, quiz history with trends, forgetting curve visualisation showing predicted memory strength per topic.
- Class comparison — learners enter a group code and can see how their performance compares to the group average and national benchmarks. Powered by Firebase.
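For the curious, the spaced-repetition scheduling above can be sketched in a few lines. This is a generic simplified SM-2 — the standard published ease-factor formula and intervals, with the Hard/Good/Easy buttons mapped to SM-2 quality scores — not my exact implementation.

```typescript
// Simplified SM-2 scheduler (illustrative sketch, not the site's actual code).
interface CardState {
  ease: number;        // "easiness factor", starts at 2.5
  interval: number;    // days until the next review
  repetitions: number; // consecutive successful reviews
}

type Grade = "hard" | "good" | "easy";

// Map the three buttons onto SM-2 quality scores (3–5 all count as recalled)
const QUALITY: Record<Grade, number> = { hard: 3, good: 4, easy: 5 };

function review(state: CardState, grade: Grade): CardState {
  const q = QUALITY[grade];
  // Standard SM-2 ease adjustment, floored at 1.3
  const ease = Math.max(
    1.3,
    state.ease + (0.1 - (5 - q) * (0.08 + (5 - q) * 0.02))
  );
  const repetitions = state.repetitions + 1;
  // First review: 1 day; second: 6 days; after that, grow by the ease factor
  let interval: number;
  if (repetitions === 1) interval = 1;
  else if (repetitions === 2) interval = 6;
  else interval = Math.round(state.interval * ease);
  return { ease, interval, repetitions };
}
```

A brand-new card starts at `{ ease: 2.5, interval: 0, repetitions: 0 }`; pressing Hard shrinks the ease factor (so reviews come sooner), Easy grows it (so they stretch out). That one feedback loop is most of what "spaced repetition" means in practice.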
80 flashcards across both levels. 69 quiz questions. 32 compare-approaches questions. All grounded in actual course content — no fabricated studies, no hallucinated psychologists. I cross-referenced everything through NotebookLM, feeding in my actual course materials as sources.
The point isn't the Psychology content specifically — it's that one person, who isn't a software engineer, built a complete interactive learning platform using AI tools. That wasn't possible three years ago.

AI-Powered Feedback
This is the one that excites me most. Learners write practice responses — "Describe how the behavioural approach explains aggression" or "Discuss the sociocultural approach to understanding obesity" — and get instant formative feedback graded against specific criteria. With specific strengths, areas for improvement, and missing points identified.
The AI marker uses a serverless proxy (a Cloudflare Worker) that sends responses to Claude Haiku with the marking criteria baked into the system prompt. The cost? Fractions of a cent per submission.
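To make that architecture concrete, here's a rough sketch of the pattern: a Worker that bakes marking criteria into the system prompt and forwards the learner's response to the Anthropic Messages API. The criteria text, model choice, and request shape here are illustrative assumptions, not my production code.

```typescript
// Sketch of a Cloudflare Worker proxying marking requests to Claude Haiku.
// The criteria and prompt wording are placeholders, not the real rubric.
interface MarkRequest {
  question: string;
  answer: string;
}

function buildAnthropicBody(criteria: string, req: MarkRequest) {
  return {
    model: "claude-3-5-haiku-latest", // small, cheap model: fractions of a cent per call
    max_tokens: 1024,
    system:
      "You are a formative-feedback marker. Mark strictly against these criteria:\n" +
      criteria +
      "\nReturn specific strengths, areas for improvement, and missing points.",
    messages: [
      {
        role: "user",
        content: `Question: ${req.question}\n\nStudent response: ${req.answer}`,
      },
    ],
  };
}

// In a real Worker, this object is the module's default export.
const worker = {
  async fetch(request: any, env: { ANTHROPIC_API_KEY: string }) {
    const mark = (await request.json()) as MarkRequest;
    return fetch("https://api.anthropic.com/v1/messages", {
      method: "POST",
      headers: {
        "content-type": "application/json",
        "x-api-key": env.ANTHROPIC_API_KEY, // key stays server-side, never in the browser
        "anthropic-version": "2023-06-01",
      },
      body: JSON.stringify(buildAnthropicBody("…criteria here…", mark)),
    });
  },
};
```

The point of the proxy is that the API key lives in the Worker's environment, so the public site can call it without ever seeing the key.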
Let me put that in context. Commercial tools like Turnitin cost roughly $3–5 per user per year, and all they do is check for plagiarism. My AI marker provides detailed formative feedback — the kind that actually helps people improve — for a group of 30 learners at around $1 per year. For 1,500 users, you're looking at maybe $50–100 annually. That's not a typo.
Detailed formative feedback at a fraction of the cost of tools that only catch plagiarism.
Is it perfect? No. It's an unreliable narrator, remember. But as a formative tool — something you use to practise and get feedback before the real thing — it's genuinely transformative. And the rate limiting (10 submissions per day) means nobody's running up a bill.
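The rate limiting itself is simple. Here's an illustrative per-user daily cap — the names are my own invention, and in a real Worker the counter would live in KV or a Durable Object rather than in memory, since Workers don't keep state between requests.

```typescript
// Illustrative per-user daily submission cap (in-memory sketch only).
const DAILY_LIMIT = 10;
const counts = new Map<string, { day: string; used: number }>();

function allowSubmission(userId: string, now: Date = new Date()): boolean {
  const day = now.toISOString().slice(0, 10); // e.g. "2026-02-14"
  const entry = counts.get(userId);
  if (!entry || entry.day !== day) {
    // First submission of the day resets the counter
    counts.set(userId, { day, used: 1 });
    return true;
  }
  if (entry.used >= DAILY_LIMIT) return false; // over the daily cap
  entry.used += 1;
  return true;
}
```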
This pattern isn't limited to school exams. Anyone building a course — whether it's professional development, language learning, or technical training — could use the same architecture to give learners instant, specific feedback on their practice work.
Voice Cloning
This one feels like science fiction, but it's real and it's practical. I used ElevenLabs to clone my voice from about three hours of recordings. Learners can now listen to flashcard content narrated in my actual voice. It costs about $22/month.
The first time I heard it was genuinely unsettling. It sounds like me. The intonation and pacing are close enough that nobody questions it. For anyone who struggles with reading, or who just processes information better through audio, this is a game-changer. It's Universal Design for Learning in action — what benefits some, benefits all.
Think beyond education for a moment. If you're a course creator, a podcaster, or anyone who produces content — cloning your own voice means you can scale your audio output without recording every single word. The ethical guardrails matter (only clone your own voice), but the practical value is enormous.
Biology Course Platform
I've also started building a Level 3 Biology course platform using Astro, a modern static site framework. Week-by-week content, a dark-first theme with a light toggle, hosted free on GitHub Pages. It's earlier in development than the Psychology site, but the architecture is solid and new content lands each week.
The Tools Behind the Tools
A few things made all of this possible that wouldn't have existed — or wouldn't have been accessible to someone like me — even two years ago.
Claude Code is the big one. It's a command-line AI assistant that can read your codebase, write code, run tests, and iterate. I'm not exaggerating when I say it has changed what's possible for someone like me. I have a PhD in science, not computer science. I can code, but I'm not a software engineer. With Claude Code, I've built features in hours that would have taken me weeks, or that I simply wouldn't have attempted.
This is the bit that I think matters most for Pursue Learning's audience: the barrier between "having an idea" and "building the thing" has collapsed. You don't need to be a developer. You need to be curious, persistent, and willing to describe what you want clearly. The AI handles the rest — or at least, enough of the rest to get you moving.
NotebookLM has been invaluable for content creation. I feed in my course materials, textbook chapters, research papers, class notes, and use it as a grounded source for generating flashcards, quiz questions, and study content. Because it's working from my sources, not the open internet, the hallucination risk drops dramatically.
Tana is my second brain. It's an outliner where everything is a node in a connected graph. My daily planning, project tracking, research notes, even my Oura ring health data all flow in automatically each morning. When you have ADHD, having everything written down in one searchable place isn't a luxury; it's survival.

The Honest Cost
I need to be upfront about something. All of this has come at a personal cost.
I feel like a child in a toyshop who's been told he can have all the toys. And that feeling, for someone with ADHD, is dangerous. The hyperfocus kicks in. I'm building instead of sleeping. My Oura ring politely tells me my sleep score is terrible, and I ignore it because I've just had an idea for yet another new feature.
I know I've overcommitted. I've taken on too many projects simultaneously. I've let the excitement of what's possible run ahead of what's sustainable.
I share this not because I want sympathy, but because if you're experimenting with AI and feeling the same pull — whether you're a teacher, a creator, a self-taught developer, or just someone who can't stop tinkering — you're definitely not alone. The tools are genuinely exciting. The possibilities are genuinely real and attainable. But so is burnout. Pace yourself. I'm still learning that one.
What's Changed in My Thinking
In 2023, I framed AI primarily through caution. It's an unreliable narrator. Check everything. Don't trust it blindly. All still true. But my emphasis has shifted.
Now I lead with what AI enables:
- True personalisation — not the kind that gets written into corporate strategy documents, but actual personalised support at scale. Every learner gets feedback on their specific work. Every learner can replay audio explanations as many times as they need. That applies whether you're running a classroom of 30 or an online course of 3,000.
- Equity of access — this has always mattered to me. Free learning resources. AI feedback that costs pennies. Voice narration for every learner. No paywall, no login required. The best learning tools shouldn't only be available to those who can afford them.
- Creator, not just consumer — I've gone from consumer of technology to creator. And I'm not special in this regard. The tools are accessible enough now that anyone with curiosity and a bit of persistence can build things for their learners — or for themselves. All of us using AI coding tools are developers now.

What's Next
This post is a restart. Pursue Learning has been quiet for too long, and there's too much to share.
I'll be writing about the practical side of all this — how to set up AI-powered feedback for your own courses, how NotebookLM can generate study content from your source materials, how spaced repetition works and why it matters, how to go from idea to working prototype without a computer science degree. The kind of posts where you walk away with something you can actually use.
I'll also be honest about the things that don't work. The dead ends. The features I built that nobody used. The prompts that produce rubbish. Lead with the benefits, but don't gloss over the negatives.
Whether you're a teacher, a course creator, a lifelong learner, or someone who just wants to build things — this is for you. What I could do three years ago was interesting. What I can do now is amazing. And I want to bring you along for the ride.
Until next time...
Eliot