
Ai-Agents

2026


Mapping the code-as-video landscape --- deterministic to generative, with a learning-content lens

The first two posts in this series put Editframe through its paces on a hello-world clip and a 71-second product walkthrough. This third post zooms out. The question is no longer ‘does code-as-video work for L&D’ — the first two experiments answered that. It is which other frameworks belong in the picture, where they sit on a deterministic-to-generative spectrum, and what each tier can credibly contribute to a learning content development workflow.

Code-as-video with Editframe --- video production as a dev workflow

The first experiment closed with a question: can an agent take structured product information and turn it into a short, repeatable video composition? This second experiment is the answer. The interesting part is not the 71 seconds of finished video. It is what this workflow does to the way Learning Program Owners, Learning Designers, and Learning Developers traditionally split the work.

Hello world with Editframe --- video production as a dev workflow

Can video production start to look more like a dev workflow, where the composition is code, the output is reproducible, and the agent can help work through the rough edges? Experiment 01 – a hello-world Editframe project, scaffolded, iterated, and rendered locally, with the failure points captured for the next pass.

In-app Live Assistant: Part 2 --- Building the Screen-Sharing Version

The ADK Dev UI supports camera but not screen sharing. In this post I build a custom client that swaps getUserMedia for getDisplayMedia, writes a 16kHz AudioWorklet from scratch, and sends 1 FPS JPEG screen snapshots to the same WebSocket endpoint. The agent now sees the application instead of the user’s face.
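The post's worklet code isn't reproduced here, but the kind of work a 16kHz capture worklet does before handing audio to a WebSocket can be sketched as a pure function — downsample the browser's Float32 samples and convert them to 16-bit PCM. The function name and the 48kHz input rate are illustrative assumptions, not taken from the post.

```javascript
// Hypothetical sketch: reduce 48kHz Float32 microphone samples to 16kHz
// signed 16-bit PCM, the usual wire format for speech endpoints.
function downsampleTo16k(float32Samples, inputRate = 48000) {
  const ratio = inputRate / 16000; // 3 when the browser captures at 48kHz
  const out = new Int16Array(Math.floor(float32Samples.length / ratio));
  for (let i = 0; i < out.length; i++) {
    // Clamp to [-1, 1], then scale into the signed 16-bit range.
    const s = Math.max(-1, Math.min(1, float32Samples[i * ratio]));
    out[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return out;
}
```

Naive decimation like this is fine as a sketch; a production worklet would low-pass filter first to avoid aliasing.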

In-app Live Assistant: Part 1 --- Walking Through Google's ADK Bidirectional Streaming Demo

I want to build a live AI assistant that can hear a user and see their screen at the same time, then talk back in real time. This post walks through getting Google’s ADK bidirectional streaming demo running locally with mic, camera, and voice out — and documents the things that tripped me up.