AI at the Edge

A Visual Guide to Choosing Your Cloudflare Stack

Choosing between Containers, simple Workers with KV, and an external AI model can be tricky. This guide provides a side-by-side visual comparison to help you decide which pattern is right for your project, such as an advanced SEO analyzer or a bespoke article-writing bot.

1. Workers Containers

The Powerhouse: Full Runtimes & Heavy Compute

How It Works:

Best For:

  • Resource-intensive tasks (CPU, RAM, Disk).
  • Using any language or runtime (Python, Go, Rust).
  • Reusing existing Docker images and tools.
  • Requiring a full Linux filesystem or native libraries.

Use Case: SEO & Article Bot

Run a Python-based Scrapy crawler and Lighthouse analyzer inside a container. The Worker routes the target URL to the container, which performs a deep analysis and returns a full JSON report.
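
A minimal sketch of the routing Worker for this pattern, assuming a container binding named SEO_ANALYZER, the getContainer() helper from the @cloudflare/containers package (the same one used in the multi-tenant example later in this guide), and a hypothetical /analyze endpoint exposed by the container:

// worker.js -- hand the target URL to the container and return its JSON report
import { getContainer } from "@cloudflare/containers";

export default {
  async fetch(request, env) {
    const targetUrl = new URL(request.url).searchParams.get("url");
    if (!targetUrl) {
      return new Response("Missing ?url= parameter", { status: 400 });
    }

    // SEO_ANALYZER is an assumed binding to the Scrapy + Lighthouse container image
    const container = getContainer(env.SEO_ANALYZER);

    // The container's /analyze endpoint (illustrative) runs the deep crawl and audit
    return container.fetch(new Request("http://container/analyze", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ url: targetUrl })
    }));
  }
};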

Pros

  • Total flexibility
  • Full resource access
  • Process isolation

Cons

  • Container cold-starts
  • Requires Docker knowledge
  • Higher complexity

2. Workers + KV

The Sprinter: Instant & Lightweight Logic

How It Works:

Best For:

  • Lightning-fast lookups and data retrieval.
  • Simple, rule-based logic and templating.
  • Lightweight state management (e.g., user preferences).
  • When lowest possible latency is critical.

Use Case: SEO & Article Bot

A Worker fetches a webpage's metadata, performs simple keyword density checks using JavaScript, and pulls pre-computed SEO advice or article templates from KV based on the keywords found.
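
A minimal sketch of that flow, assuming a KV namespace bound as SEO_TIPS that stores pre-computed advice as JSON keyed by keyword; the crude title-based lookup stands in for the real keyword-density logic:

// worker.js -- fetch a page, do a simple keyword check, and look up canned advice in KV
export default {
  async fetch(request, env) {
    const pageUrl = new URL(request.url).searchParams.get("url");

    // Fetch the page and grab its <title> as a simple keyword signal
    const html = await (await fetch(pageUrl)).text();
    const title = (html.match(/<title>([^<]*)<\/title>/i)?.[1] ?? "").toLowerCase();
    const keyword = title.split(/\s+/).find(Boolean) ?? "general";

    // SEO_TIPS is an assumed KV binding holding pre-computed advice per keyword
    const advice = (await env.SEO_TIPS.get(keyword, "json"))
      ?? (await env.SEO_TIPS.get("default", "json"));

    return Response.json({ pageUrl, keyword, advice });
  }
};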

Pros

  • Fastest cold-start (<5ms)
  • Simplest to develop
  • Lowest operational cost

Cons

  • JS/Wasm only
  • Limited compute/memory
  • No native libraries

3. Custom AI + KV

The Specialist: Advanced Conversational AI

How It Works:

Best For:

  • Leveraging large language models (LLMs) like GPT.
  • Complex, dynamic Q&A and conversations.
  • When you need state-of-the-art generation quality.
  • Outsourcing the complexity of hosting a large model.

Use Case: SEO & Article Bot

A Worker takes a user's topic, retrieves the chat history from KV, then calls the OpenAI API with a detailed prompt to generate a high-quality, unique article. The new article is then saved back to KV.
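
A minimal sketch of that round trip, assuming a KV namespace bound as HISTORY, an OPENAI_API_KEY secret, and a JSON request body with topic and sessionId fields; the model name and prompts are illustrative:

// worker.js -- load history from KV, call OpenAI, save the result back to KV
export default {
  async fetch(request, env) {
    const { topic, sessionId } = await request.json();

    // Prior conversation turns for this session (empty array on first use)
    const key = `chat:${sessionId}`;
    const history = (await env.HISTORY.get(key, "json")) ?? [];

    // Ask the OpenAI Chat Completions API for an article, with the history as context
    const res = await fetch("https://api.openai.com/v1/chat/completions", {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${env.OPENAI_API_KEY}`,
        "Content-Type": "application/json"
      },
      body: JSON.stringify({
        model: "gpt-4o-mini",
        messages: [
          { role: "system", content: "You write detailed, original SEO articles." },
          ...history,
          { role: "user", content: `Write a unique article about: ${topic}` }
        ]
      })
    });
    const article = (await res.json()).choices[0].message.content;

    // Persist the new turns so the next request sees the updated history
    history.push({ role: "user", content: topic }, { role: "assistant", content: article });
    await env.HISTORY.put(key, JSON.stringify(history));

    return Response.json({ article });
  }
};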

Pros

  • Access to powerful LLMs
  • Rich conversational features
  • No model hosting overhead

Cons

  • External API costs/limits
  • Network latency to AI provider
  • Dependent on a third party

Bespoke AI for Every Client

Workers Containers allows you to create and deploy dedicated, containerized AI bots for each client, providing true production-level isolation and customization. Here are two ways to achieve this.

Approach A: One Image Per Client

Build a completely separate Docker image for each client containing their specific code, libraries, and model weights. This offers maximum isolation.

# wrangler.toml

# For Client A
[[containers]]
class_name = "ClientABot"
image      = "my-registry/client-a-bot:latest"

[[durable_objects.bindings]]
name       = "CLIENT_A_BOT"
class_name = "ClientABot"

# For Client B
[[containers]]
class_name = "ClientBBot"
image      = "my-registry/client-b-bot:latest"

[[durable_objects.bindings]]
name       = "CLIENT_B_BOT"
class_name = "ClientBBot"

Approach B: Multi-Tenant Image

Build a single, flexible image with a loader system. The container fetches the client's unique logic, configs, or model weights from KV or Durable Objects at runtime.

// worker.js
import { getContainer } from "@cloudflare/containers";

export default {
  async fetch(request, env) {
    // Get client ID from request (getClientId is the app's own lookup, e.g. by subdomain or API key)
    const clientId = getClientId(request);

    // Pass client ID to the shared container so it can load that client's config at runtime
    const container = getContainer(env.MULTI_TENANT_BOT);
    return container.fetch(request, {
      headers: { 'X-Client-ID': clientId }
    });
  }
};
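
To complete the picture, here is a minimal sketch of the loader inside the multi-tenant image, assuming a plain Node.js HTTP server; the per-client config map is illustrative, and in practice the settings could be fetched from the Worker (backed by KV or Durable Objects) when the container starts:

// container/server.js -- Node.js loader inside the multi-tenant image (sketch)
const http = require("http");

// Illustrative per-client settings; swap for a runtime fetch from the Worker if preferred
const clientConfigs = {
  "client-a": { tone: "formal", template: "long-form-seo" },
  "client-b": { tone: "casual", template: "listicle" }
};

http.createServer((req, res) => {
  // The Worker forwards the tenant in the X-Client-ID header (see worker.js above)
  const clientId = req.headers["x-client-id"];
  const config = clientConfigs[clientId];

  if (!config) {
    res.writeHead(404, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ error: "unknown client" }));
    return;
  }

  // ...run the bot with this client's config and return its output...
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ clientId, config }));
}).listen(8080);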