What I Built
Visit saltwaterbrc.com/playground.html and you’ll see a code editor. Write Python. Write JavaScript. Write shell commands. Hit run. Your code executes in an isolated container at the edge and you get the output back in milliseconds.
No backend. No local docker daemon. No SSH into a server. Just code, edge compute, isolation, and results.
This is what Sandbox SDK does: it lets you execute arbitrary code inside a Worker in a secure, isolated environment. And I built this to understand what salespeople need to know when they’re talking to companies in gaming, automotive, healthcare, financial services, and AI.
Try it now: Code Playground
The Architecture: Three Layers
Here’s how this actually works under the hood:
Layer 1: The Worker (Request Routing). When you hit “Run,” your code gets sent to a Cloudflare Worker. This Worker is the entry point. It validates your code, checks rate limits, and decides if this request should proceed. It’s the API layer that sits between you and the execution environment.
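The validation step in Layer 1 can be sketched as a plain function. The `ExecRequest` shape, the supported-language list, and the 64KB limit are my illustrative assumptions, not the playground's actual rules:

```typescript
// Shape of the payload the Worker might accept (illustrative).
type ExecRequest = {
  language: "python" | "javascript" | "shell";
  code: string;
};

const MAX_CODE_BYTES = 64 * 1024; // hypothetical size cap

// Returns null if the payload is acceptable, or a human-readable error.
function validateExecRequest(body: unknown): string | null {
  if (typeof body !== "object" || body === null) {
    return "body must be a JSON object";
  }
  const { language, code } = body as Partial<ExecRequest>;
  if (language !== "python" && language !== "javascript" && language !== "shell") {
    return "unsupported language";
  }
  if (typeof code !== "string" || code.length === 0) {
    return "code must be a non-empty string";
  }
  if (new TextEncoder().encode(code).length > MAX_CODE_BYTES) {
    return "code too large";
  }
  return null;
}
```

Rejecting bad payloads at this layer means a malformed request never reaches a Durable Object or spins up a container.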
Layer 2: The Durable Object (State & Orchestration). The Worker routes your request to a Durable Object — a stateful, single-instance compute unit at the edge that serializes access to its state. The Durable Object manages the lifecycle of your execution: it queues requests, tracks running containers, and makes sure resources aren’t exhausted. If someone tries to spin up a hundred containers at once, the Durable Object says no.
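The "says no" part of Layer 2 is admission control. A minimal sketch of the idea — the class name, fields, and limit are illustrative, not the actual Durable Object code:

```typescript
// Cap the number of concurrently running containers. Because a Durable Object
// processes requests one at a time, this in-memory check is race-free.
class ContainerSlots {
  private running = new Set<string>();
  constructor(private readonly maxConcurrent: number) {}

  // Try to claim a slot for an execution; false means "come back later".
  tryAcquire(id: string): boolean {
    if (this.running.size >= this.maxConcurrent) return false;
    this.running.add(id);
    return true;
  }

  // Free the slot when the container finishes (or times out).
  release(id: string): void {
    this.running.delete(id);
  }
}
```

The single-threaded execution model of Durable Objects is what makes a simple counter like this safe; on a traditional multi-instance backend you'd need a distributed lock for the same guarantee.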
Layer 3: The Sandbox Container (Execution). This is where Sandbox SDK comes in. The Durable Object spawns a container using the SDK. Your code runs inside that container in complete isolation — it has its own filesystem, its own processes, its own networking rules. One user’s Python script can’t access another user’s memory or files. The container has built-in timeouts, so infinite loops don’t crash the entire system.
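The timeout behavior in Layer 3 can be modeled as racing an execution against a deadline. This is a sketch of the concept, not the SDK's implementation — the real enforcement happens inside the container:

```typescript
// Race a unit of work against a deadline so a runaway script can't hold a
// container forever. Illustrative only.
async function withTimeout<T>(work: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error("execution timed out")), ms);
  });
  try {
    return await Promise.race([work, deadline]);
  } finally {
    // Clear the timer so a fast result doesn't leave a pending rejection.
    if (timer !== undefined) clearTimeout(timer);
  }
}
```

Either the code finishes and the timer is cancelled, or the deadline fires and the caller gets a clean error instead of a hung request.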
The magic of this architecture is that it combines Worker request handling, Durable Object state management, and container isolation into a single coherent system. Most platforms choose one. Cloudflare lets you use all three together.
Where Docker Fits In (And Where It Doesn’t)
Here’s the question everyone asks: “What does Docker have to do with this?”
Docker is a tool you use locally to build your environment. On my machine, I have a Dockerfile that installs Node.js, Python, git, curl, and a few system utilities. I run docker build and it creates an image. That image defines what gets installed in the container that runs at the edge.
So the flow looks like this:
Your local machine: npm run dev uses Docker on your machine to build and run a local image that matches what will run at the edge. You test it. You iterate. When you deploy, the image gets pushed to Cloudflare’s registry.
Cloudflare’s edge: When you deploy or when a request comes in, Cloudflare pulls that image and runs it in a container. The image is the specification. Docker built it. Cloudflare runs it.
This is important: Cloudflare doesn’t run Docker. Cloudflare runs your image. Docker is the build tool. The edge is the runtime.
Why does this matter? Because in development, you want to test something that matches production as closely as possible. That’s why we use Docker locally — so the container you build locally behaves the same way as the container that runs at the edge.
The Development Experience: Testing Without Deployment
The workflow I settled on was straightforward:
Step 1: Write your Worker code. Create a wrangler.toml file that configures your Worker and points to your Dockerfile. Define your environment variables, your routes, your bindings to Durable Objects.
Step 2: Start local development. Run npm run dev. This spins up a local version of your Worker using Docker. Your Worker runs in a local container that matches what’s on the edge. You can test the full request/response cycle without deploying anything.
Step 3: Test with curl. Open a new terminal and test your endpoints with curl. Send a request to localhost:8787 with your code payload. See the output. Debug the response. Iterate.
Step 4: Deploy. When it works locally, run npm run deploy. This pushes your Worker code to Cloudflare. It pushes your container image to Cloudflare’s registry. Everything goes live.
The point is: you don’t deploy to test. You test locally with Docker, then deploy when you’re confident.
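The steps above hinge on the wrangler.toml from step 1. Here's a sketch of what that file looks like for this setup — the field names follow the Sandbox SDK examples as I remember them, so treat this as a starting point and check the current docs rather than copying it verbatim:

```toml
# Illustrative wrangler.toml for a sandbox-backed Worker (key names approximate).
name = "code-playground"
main = "src/index.ts"
compatibility_date = "2024-09-01"

# Container image built from the local Dockerfile
[[containers]]
class_name = "Sandbox"
image = "./Dockerfile"

# Durable Object binding that orchestrates sandbox containers
[[durable_objects.bindings]]
name = "Sandbox"
class_name = "Sandbox"

[[migrations]]
tag = "v1"
new_sqlite_classes = ["Sandbox"]
```

With this in place, npm run dev gives you the full stack locally, and step 3's curl against localhost:8787 exercises the same request path as production.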
What Broke (And Why It’s Important)
This build wasn’t clean. I hit most of the errors a developer new to edge compute will hit. Let me walk through them:
Issue 1: Workers Paid Plan Required. I started with the free plan. Sandbox SDK requires the Paid plan — around $5/month. Nothing wrong with this, but it’s not something you discover until deployment fails. Now I know. Future builders will know.
Issue 2: Docker Image Push Timeout. The first time I tried to deploy, the image push to Cloudflare’s registry timed out after 5 minutes. The image was 500MB. I hadn’t optimized the Dockerfile at all — I just installed everything and threw it at the edge. The fix: build leaner images. Use alpine base images. Only install what you actually need. Multi-stage builds. That 500MB image is now 200MB.
Issue 3: Platform Mismatch Error. I built the Docker image on my Mac (ARM64 architecture) and tried to run it on Cloudflare’s edge (x86-64, i.e. linux/amd64). Docker complained. The fix: build explicitly for the target platform. Add --platform=linux/amd64 to your build command. Or use buildx to build for multiple platforms at once.
Issue 4: Base Image Had Nothing Installed. I used a minimal base image to save space. Then I tried to run a Python script and there was no Python. I tried to use curl and there was no curl. The container was so minimal that it couldn’t do anything. The fix: you need to pick a balance. Use python:3.12-slim if you need Python. Use node:20-alpine if you need Node. Don’t start from scratch unless you really know what you’re doing.
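Putting the fixes for issues 2 through 4 together, a middle-ground Dockerfile might look like the sketch below. The package list is illustrative — install whatever your sandbox actually shells out to, and nothing more:

```dockerfile
# Slim Python base: interpreter included, far smaller than the full image,
# but not so bare that basic tooling is missing.
FROM python:3.12-slim

# Install only the system utilities the sandbox actually needs,
# and clean the apt cache so it doesn't bloat the layer.
RUN apt-get update \
    && apt-get install -y --no-install-recommends curl git \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
```

And per issue 3, build it for the edge's architecture: docker build --platform linux/amd64 -t playground-sandbox . — otherwise an ARM Mac will happily produce an image the edge can't run.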
Every error I hit is an error a customer will hit. Every error I had to solve is an error you’ll need to help them solve. That’s why I documented them.
Why This Matters for Sales Conversations
When you’re selling compute platforms, the first conversation is always: “Can it run my code?” The second conversation is: “Is it secure?”
Sandbox SDK lets me say yes to both. But more importantly, it lets me demonstrate both.
For gaming studios: You can now show them a live demo where they upload a game server script and it runs at the edge with minimal latency. That’s not theory. They can see it execute in real time. Ask them: “Your players are in five countries. You’re getting lag because your game state syncs with a centralized server. What if the game server ran at their location?” That’s a different conversation than a slide deck.
For financial services: You can show them code executing in isolation, with inputs validated, outputs logged, and everything auditable. You can run their compliance script right there on the call and say: “Every request gets logged. Every interaction is tracked. Your data never leaves your perimeter unless you explicitly send it there.” That’s the difference between “possible” and “proven.”
For healthcare: Same story. Isolation, observability, compliance-friendly logging. You can run a HIPAA-relevant scenario — process patient data, return an encrypted result — and show them the audit trail.
For automotive: Vehicle telemetry, edge AI inference, real-time decision making. You can load a model, feed it sensor data, and show the inference result in milliseconds. Not on a server a thousand miles away. At the edge, at the location of the car.
For AI agents: Function calling, tool execution, sandboxed code generation. If you’re building an AI product that generates code, you need a place to run that generated code safely. Sandbox SDK is it. You can show a live demo of an AI agent writing Python, executing it, and getting a result — all without any risk that the code will escape its sandbox.
In every single case, the difference between “we have this feature” and “I just ran your scenario live” is enormous. You can’t fake that.
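The AI-agent pattern above boils down to one rule: generated code never runs in the agent's own process, it crosses an executor boundary. Here's a sketch of that boundary — the `SandboxExecutor` interface and the fake executor are hypothetical stand-ins for what would, in production, be backed by a Sandbox SDK container:

```typescript
// Hypothetical boundary between an AI agent and the sandbox. The agent never
// evals generated code itself — it hands the code to an executor it can't escape.
interface SandboxExecutor {
  run(language: string, code: string): Promise<{ stdout: string; exitCode: number }>;
}

// Agent-side tool call: ship the generated code across the boundary and
// feed the result back into the conversation.
async function runGeneratedCode(
  executor: SandboxExecutor,
  code: string
): Promise<string> {
  const result = await executor.run("python", code);
  if (result.exitCode !== 0) return `execution failed: ${result.stdout}`;
  return result.stdout;
}

// Fake executor standing in for the real edge container, for illustration.
const fakeExecutor: SandboxExecutor = {
  async run(_language, code) {
    return { stdout: `ran ${code.length} bytes of code`, exitCode: 0 };
  },
};
```

Swapping the fake executor for a real sandbox-backed one changes nothing on the agent side, which is exactly the isolation guarantee you're demoing.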
Phase 4 Complete: Four Core Products, One Domain
Here’s what saltwaterbrc.com now runs:
Workers AI — LLM inference. The “Ask” feature uses Llama 3.1 to answer questions about blog content.
Vectorize — Vector database. Blog chunks get embedded and stored. Similarity search finds relevant content.
AI Gateway — Observability layer. Every model call, every token, every cache hit gets logged and visualized.
Sandbox SDK — Edge containers. Code execution in isolation. Shell, Python, JavaScript running safely at the edge.
All four are running on the same domain. All four are integrated into the same product. And beyond the base Workers Paid plan, all four cost $0 to run, because every call lands in the free tier or the cache.
And that’s the point of Phase 4. It wasn’t about building four separate features. It was about showing that these four products work together. That they solve a class of problems that you can’t solve with just one of them. That you can build enterprise-grade applications on top of them.
That’s what I can now say to a prospect: “We didn’t build this to impress you. We built this because we needed it. We needed to run AI inference, store vectors, observe what’s happening, and execute code safely. These four Cloudflare products are how we did it.”
What’s Next: Phase 5 Astro Migration
Phase 5 is a site-wide migration to Astro. The current site is a mix of hand-written HTML and Pages routing. Astro will give me templating, component reusability, better markdown support, and a sane way to manage the growing number of blog posts.
Phase 4 was about depth — integrating four advanced products. Phase 5 is about breadth — scaling the content, the templates, the site structure as the number of posts grows.
But Phase 4 is done. I built it. I broke it. I fixed it. I learned where the errors are and how to solve them. Now I can tell you where you’ll hit them too.
Try the Code Playground. Run some Python. Run some JavaScript. Break it if you want. See how the isolation works. See what edge compute actually feels like. And if you’re building a product that needs to run untrusted code safely, or needs compute at low latency, or needs to demo something that can’t be shown on a slide, this is what you’re actually building toward.