I wanted to test an idea: could you build a complete sales training package using only AI tools? Not just a document or a slide deck, but the full set: research, written content, an interactive quiz, a presentation, a podcast, and a video. All from a single topic brief.
The topic was penetration testing sales for enterprise security teams. It's a subject I know fairly well from my time at Zscaler and Cisco, so I could judge the quality of the output to a decent standard. The goal wasn't to replace a training team; it was to see how far AI tools could take a concept before a human needed to step in.
The workflow
Each AI platform handled a different part of the pipeline. The versions at the time of creation, approximately: Claude 4.0, Gemini 2.5, Gamma, NotebookLM 2.5, and Synthesia Studio 2.0. The aim was to use the right tool for each job, rather than trying to make one tool do everything.
It started with research. I used Claude and Gemini together, each with deep research enabled. Claude Opus produced the initial research document covering pen testing market data, buyer personas, compliance frameworks, and objection handling. Gemini then reviewed and expanded on it, and Claude did a final pass to merge the outputs. The cross-validation between the two models caught gaps that either one alone would have missed.
From that research base, each output was created with the tool best suited to the format: Claude built the interactive quiz and the cheat sheets as single-file HTML applications. Gamma turned the research into a professional slide deck. Google's NotebookLM generated a podcast-style discussion from the research document. Synthesia produced a video presentation with a virtual presenter.
The interactive quiz
The quiz ended up being an impressive piece of engineering. Claude built it as a single HTML file with no external dependencies: six modules covering business value, buyer personas, compliance, technical translation, objection handling, and competitive positioning.
It includes progress tracking with localStorage, an achievement system with unlock notifications, a statistics dashboard, and a glassmorphism UI. The whole thing is around 100KB of pure HTML, CSS, and JavaScript. No frameworks, no build step, no server required; it just runs in a browser.
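To make the localStorage approach concrete, here is a minimal sketch of how progress tracking in a quiz like this typically works. This is illustrative, not the actual generated code: the class name, storage key, and data shape are my own assumptions. The tracker takes any object with getItem/setItem so it can be exercised outside a browser; in the quiz itself this would be window.localStorage.

```javascript
// Illustrative sketch, not the quiz's actual code. `storage` is any
// object with getItem/setItem -- in a browser, window.localStorage.
class ProgressTracker {
  constructor(storage, key = "quizProgress") {
    this.storage = storage;
    this.key = key;
  }

  load() {
    // Parse saved progress, falling back to an empty record.
    const raw = this.storage.getItem(this.key);
    return raw ? JSON.parse(raw) : { completed: {}, scores: {} };
  }

  recordResult(moduleId, score) {
    // Persist the module's score and mark it complete.
    const progress = this.load();
    progress.scores[moduleId] = score;
    progress.completed[moduleId] = true;
    this.storage.setItem(this.key, JSON.stringify(progress));
    return progress;
  }

  completionRate(totalModules) {
    // Fraction of modules completed, e.g. for a stats dashboard.
    const done = Object.keys(this.load().completed).length;
    return done / totalModules;
  }
}
```

Because everything serializes to a single JSON blob under one key, progress survives page reloads with no server or framework involved, which is what makes the single-file, no-dependency design possible.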
The content quality was surprisingly good. Because the research phase was thorough and I could validate the accuracy from my own experience, the quiz questions were relevant and the scenarios felt realistic. It wasn't perfect (some questions needed tweaking for nuance that only comes from real sales conversations), but as a starting point it was far ahead of what I'd expected. There are likely some minor bugs in the output, but nothing serious.
What this proved
The whole package took a couple of days to put together. Doing this manually (hiring a subject matter expert, a content writer, a designer, a video producer, and a developer) would have taken weeks and a significant budget. AI didn't replace any of those roles entirely, but it got each deliverable maybe 80% of the way there in a fraction of the time.
The bigger takeaway for me was about workflow design. No single AI tool could have produced all of this. The value came from understanding what each platform does well and chaining them together: Claude for research and code, Gemini for validation and alternative perspectives, Gamma for visual design, NotebookLM for audio, Synthesia for video. The orchestration was the human contribution.
The full training package is live at pentest.jamescarty.co.uk and the source files are on GitHub.