By dan • February 23, 2026 • 6 min read

# The Architecture Behind AskRobots: How Nine Execution Models Work Together

Most web applications are simple request/response systems. A user clicks a button, the server does something, and sends back a page. AskRobots started there too — as a Django application. But as real needs emerged, the architecture grew organically into something far more interesting: a distributed system with nine distinct ways data moves through it.

None of this was designed upfront. Each layer was added because we actually needed it. Here's how they all fit together.

## 1. Django: The Front Door

Django handles the fundamentals — HTTP requests, authentication, database models, templates, admin. It's the reliable foundation everything else builds on. User logs in, views their tasks, uploads a file — that's Django doing what Django does best.

But Django's request/response cycle is synchronous. User waits, server works, server responds. That's fine for reading data. It's not fine for processing a PDF or calling an AI API that takes 10 seconds.

## 2. Celery: Async Task Queue

When something takes too long for a web request, it goes to Celery. Upload an image? Django saves it, then fires a Celery task to generate thumbnails, describe it with AI, extract text if it's a PDF — all in the background.

Celery workers run as separate processes, pulling jobs from a Redis queue. The user doesn't wait. They see "uploaded successfully" immediately while three different background tasks process their file.

This is also how AI template execution works. User clicks "Execute" on a template, a Celery task picks it up, calls the AI provider, saves the work output, logs the cost. The user can close their browser — the work still happens.
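
The pattern is easy to see without Celery itself: a worker pulls jobs from a queue and the caller returns immediately. A minimal sketch with a thread and an in-memory queue standing in for Redis (the task name and handler are illustrative, not AskRobots code):

```python
import queue
import threading

# Stand-in for the Redis-backed Celery queue
job_queue: "queue.Queue[dict]" = queue.Queue()
results = {}

def enqueue(task_name, **kwargs):
    """What .delay() does conceptually: push a job and return at once."""
    job_queue.put({"task": task_name, "kwargs": kwargs})

def worker():
    """A Celery worker loop, reduced to its essence."""
    while True:
        job = job_queue.get()
        if job is None:  # shutdown sentinel
            break
        if job["task"] == "generate_thumbnail":
            results[job["kwargs"]["file_id"]] = "thumbnail.png"
        job_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

# The web request returns here instantly; the work happens in the background.
enqueue("generate_thumbnail", file_id=42)
job_queue.join()  # only the sketch waits; a real web request never would
```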

## 3. Celery Beat: Scheduled Jobs

Some work needs to happen on a schedule, not in response to a user action. Celery Beat handles this — periodic tasks that run automatically. Cleanup jobs, usage calculations, maintenance tasks.

This is the "things happen while you sleep" layer. The system is productive even when nobody is using it.
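
A Beat schedule is plain configuration. A hypothetical fragment (task names and times are illustrative, not AskRobots' actual schedule):

```python
from celery.schedules import crontab

app.conf.beat_schedule = {
    "nightly-cleanup": {
        "task": "core.tasks.cleanup_expired_files",
        "schedule": crontab(hour=3, minute=0),   # every night at 03:00
    },
    "usage-rollup": {
        "task": "billing.tasks.calculate_usage",
        "schedule": crontab(minute=0),           # top of every hour
    },
}
```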

## 4. Django Signals: Event-Driven Reactions

When a model instance is created or saved, Django signals fire automatically. Save a new File? A signal triggers thumbnail generation. Save with auto-describe preference enabled? Another signal queues the AI description task.

Signals are the glue between "something changed in the database" and "other things should happen because of that." They keep the code decoupled — the File model doesn't need to know about thumbnail generation. The signal handles the connection.
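
Django's `post_save` does the wiring in practice, but the mechanism fits in a few lines: a registry of callbacks fired after every save. A framework-free sketch of the idea (not Django's actual implementation):

```python
# Minimal signal dispatcher: a sketch of the idea, not Django's internals.
post_save_receivers = []

def receiver(fn):
    """Register a callback, like @receiver(post_save, sender=File)."""
    post_save_receivers.append(fn)
    return fn

def save(instance):
    """Persist the instance (elided here), then fire the signal."""
    instance["saved"] = True
    for fn in post_save_receivers:
        fn(instance)  # each receiver decides for itself what to do

queued = []

@receiver
def queue_thumbnail(instance):
    # In AskRobots this would enqueue a Celery task; here we just record it.
    queued.append(("thumbnail", instance["name"]))

@receiver
def maybe_describe(instance):
    if instance.get("auto_describe"):
        queued.append(("describe", instance["name"]))

save({"name": "report.pdf", "auto_describe": True})
```

The model being saved never mentions thumbnails or AI descriptions; the receivers attach themselves.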

## 5. Redis: Cache, Broker, and Pub/Sub

Redis plays three roles simultaneously:
- **Message broker** for Celery — task queues live here
- **Cache** for frequently accessed data — session data, rate limiting, temporary state
- **Pub/Sub** for real-time features — WebSocket message routing

One service, three critical functions. It's the nervous system connecting everything else.
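
The cache role is the easiest to make concrete. Rate limiting, for instance, is just a counter with a TTL; Redis does this server-side with `INCR` and `EXPIRE`. A plain-Python sketch of the same idea:

```python
import time

class ExpiringCounter:
    """Sketch of Redis-style rate limiting: INCR a key, let it EXPIRE."""

    def __init__(self, window_seconds, clock=time.monotonic):
        self.window = window_seconds
        self.clock = clock        # injectable for testing
        self.store = {}           # key -> (count, expires_at)

    def incr(self, key):
        now = self.clock()
        count, expires_at = self.store.get(key, (0, now + self.window))
        if now >= expires_at:     # window elapsed: start fresh
            count, expires_at = 0, now + self.window
        self.store[key] = (count + 1, expires_at)
        return count + 1

limiter = ExpiringCounter(window_seconds=60)
allowed = limiter.incr("rate:user42") <= 100  # e.g. 100 requests per minute
```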

## 6. WebSockets via Django Channels

Traditional HTTP is one-way: client asks, server answers. WebSockets are bidirectional — the server can push updates to the browser in real time.

When an AI agent completes a task, the browser updates without a page refresh. The shell interface streams output character by character. No polling, no "refresh to see updates" — data flows the moment it's available.
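
The difference from polling can be sketched with asyncio: the client simply awaits on a channel, and the server pushes into it the moment work finishes. No browser, Channels, or Redis needed to see the shape of it (event names are illustrative):

```python
import asyncio

async def main():
    inbox: asyncio.Queue = asyncio.Queue()  # stands in for one WebSocket connection
    received = []

    async def browser():
        # The client just awaits; no polling loop, no refresh button.
        while True:
            msg = await inbox.get()
            if msg is None:  # connection closed
                break
            received.append(msg)

    async def server():
        # When the background task finishes, push straight to the client.
        await asyncio.sleep(0.01)  # pretend an AI task just completed
        await inbox.put({"event": "work_ready", "task_id": 7})
        await inbox.put(None)

    await asyncio.gather(browser(), server())
    return received

messages = asyncio.run(main())
```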

## 7. HTMX: Partial Page Updates

Not everything needs a full page reload or a WebSocket connection. HTMX handles the middle ground — click a button, fetch a small piece of HTML, swap it into the page.

Comments load inline. File search results appear as you type. Task status updates without navigating away. It's the right tool for interactions that are too dynamic for static pages but don't need real-time streaming.
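
On the server side, an HTMX endpoint is just a view that returns a fragment instead of a full page. A framework-free sketch of the search case (the data and markup are illustrative):

```python
from html import escape

FILES = ["report.pdf", "invoice.pdf", "photo.png"]  # stand-in for the database

def search_fragment(query: str) -> str:
    """Return only the <li> items HTMX will swap into the results list."""
    matches = [f for f in FILES if query.lower() in f.lower()]
    if not matches:
        return "<li>No matches</li>"
    return "".join(f"<li>{escape(name)}</li>" for name in matches)

# The page wires this up with attributes like hx-get="/files/search",
# hx-trigger="keyup changed delay:300ms", hx-target="#results": each
# keystroke fetches and swaps just this fragment, not the whole page.
fragment = search_fragment("pdf")
```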

## 8. Webhooks: External Events Flowing In

External services don't wait for us to ask them what happened. They tell us.

Stripe sends a webhook when a payment succeeds — billing updates, wallet gets credited. Twilio sends a webhook when a message arrives. These are inbound events from the outside world, triggering the same Celery → Signal → Database → WebSocket pipeline as internal actions.
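
The non-negotiable first step in any webhook handler is verifying the signature before trusting the event. A simplified HMAC check sketches the core of it (Stripe's real `Stripe-Signature` header also includes a timestamp to block replays, and the secret here is a placeholder):

```python
import hashlib
import hmac
import json

WEBHOOK_SECRET = b"whsec_example"  # shared secret from the provider's dashboard

def verify_and_parse(payload: bytes, signature_header: str):
    """Reject the event unless the HMAC matches; only then parse it."""
    expected = hmac.new(WEBHOOK_SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signature_header):
        raise ValueError("invalid webhook signature")
    return json.loads(payload)

body = json.dumps({"type": "payment_intent.succeeded", "amount": 500}).encode()
sig = hmac.new(WEBHOOK_SECRET, body, hashlib.sha256).hexdigest()
event = verify_and_parse(body, sig)
# From here the flow is the same as any internal action: queue a Celery task
# to credit the wallet, let signals and WebSockets fan the update out.
```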

The system doesn't just respond to its own users. It responds to the entire ecosystem of services it's connected to.

## 9. MCP + Objects: The Programmable Runtime

This is where it gets interesting.

**MCP (Model Context Protocol)** lets AI agents interact with AskRobots directly. Claude Code, or any MCP-compatible client, can create tasks, search content, attach files, manage projects — all through a structured JSON-RPC interface. No browser needed. The AI talks to the platform as a first-class participant.
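
MCP rides on JSON-RPC, so a request/response pair is easy to show. A toy dispatcher with one illustrative method (the method name and handler are hypothetical, not AskRobots' actual tool surface):

```python
import json

def create_task(params):
    # Stand-in for the real handler, which would write to the database.
    return {"task_id": 1, "title": params["title"]}

METHODS = {"tasks.create": create_task}  # hypothetical method name

def handle(raw: str) -> str:
    """Dispatch one JSON-RPC 2.0 request and build the response envelope."""
    req = json.loads(raw)
    result = METHODS[req["method"]](req.get("params", {}))
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

request = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tasks.create",
    "params": {"title": "Summarize the quarterly report"},
})
response = json.loads(handle(request))
```

An AI client sends exactly this kind of structured call; no HTML, no session cookies, just method names and parameters.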

**The Objects system** goes further. Users (or their AI agents) write Python code that runs on the platform. No deployments. No migrations. No PR reviews. The AI rewrites a running object's source code, and it's live instantly. State is stored as TSV — add a field, remove a field, no schema changes needed.

This is the layer that turns AskRobots from a SaaS application into a programmable platform. The `tools_screenshot` object captures web pages. A counter object tracks state. A custom calculator processes domain-specific logic. Users build what they need without waiting for us to ship a feature.
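
The schema-free state store can be sketched in a few lines: each record is a TSV row, the header row is the schema, and adding a field just means writing a new header. A sketch of the idea, not the platform's actual format:

```python
import io

def dump_tsv(rows):
    """Serialize dicts to TSV; the union of keys becomes the header."""
    fields = sorted({k for row in rows for k in row})
    out = io.StringIO()
    out.write("\t".join(fields) + "\n")
    for row in rows:
        out.write("\t".join(str(row.get(f, "")) for f in fields) + "\n")
    return out.getvalue()

def load_tsv(text):
    """Rebuild dicts from TSV, whatever fields the header declares."""
    header, *lines = text.splitlines()
    fields = header.split("\t")
    return [dict(zip(fields, line.split("\t"))) for line in lines]

state = [{"count": 3}]
state.append({"count": 4, "label": "demo"})  # "add a field": no migration
round_tripped = load_tsv(dump_tsv(state))
```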

## How They All Connect

Here's a real flow that touches almost every layer:

1. **AI agent** (via MCP) creates a task from a template
2. **Django** saves the task to PostgreSQL
3. **Signal** fires, noticing it's a template with full AI assistance
4. **Celery task** is queued to execute the template
5. **Celery worker** calls the AI provider, generates content
6. **Django ORM** saves the Work output, logs the API cost
7. **Billing webhook** from Stripe confirms a wallet top-up
8. **WebSocket** pushes a notification to the browser: "Work output ready"
9. **HTMX** loads the result inline when the user views the task
10. **Objects** could process the output further — custom logic, external API calls, anything

Ten steps, nine execution models, zero page refreshes. That's not architecture astronautics — every piece exists because a real workflow needed it.

## Why This Matters

Most platforms force you into their model. Fill out our form. Use our workflow. Wait for our feature roadmap.

AskRobots is different because it's multi-modal by nature. Humans interact through the web UI. AI agents interact through MCP. Custom code runs in Objects. External services push events through webhooks. Background jobs handle the heavy lifting. Everything feeds into the same task, project, and file system.

The result is a platform that gets better every time we add a new connection point — and users can add their own without touching our codebase.

---

*Built by a team of one human and AI, running on a single VM. Sometimes the simplest infrastructure produces the most interesting architecture.*