Self-Hosting

AI Supreme Council is a single, self-contained HTML file. You can host it on any static file server, run it from a USB drive, or even open it directly from your filesystem. No backend, no database, no runtime -- just a file.

The Simplest Method

Download index.html and open it in a browser:

# That's it. Open the file.
open index.html # macOS
xdg-open index.html # Linux
start index.html # Windows

The app works immediately over the file:// protocol. All features that do not require HTTPS (everything except PWA installation and the clipboard API) function normally.

tip

For the full experience including PWA installation and clipboard access, serve the file over HTTPS using any static file server.

Static File Server

Upload to any web server. No server-side configuration needed.

Python (quick local testing)

python3 -m http.server 8000
# Open http://localhost:8000

Caddy (HTTPS with automatic certificates)

caddy file-server --root /path/to/aiscouncil --listen :8080
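
The one-liner above serves plain HTTP on port 8080, which is fine for local testing. For the automatic HTTPS mentioned in the heading, point Caddy at a real domain instead; a minimal Caddyfile sketch, assuming DNS for aiscouncil.example.com resolves to this server and the files live in /var/www/aiscouncil:

aiscouncil.example.com {
    root * /var/www/aiscouncil
    file_server
}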

Nginx

server {
    listen 443 ssl;
    server_name aiscouncil.example.com;

    ssl_certificate     /etc/ssl/certs/example.com.pem;
    ssl_certificate_key /etc/ssl/private/example.com.key;

    root /var/www/aiscouncil;
    index index.html;

    location / {
        try_files $uri $uri/ /index.html;
    }
}

Cloud Hosting

The app works on any static hosting platform with zero configuration:

Platform                Deploy Method
Cloudflare Pages        Connect Git repo or drag-and-drop
Netlify                 Connect Git repo or drag-and-drop
Vercel                  Connect Git repo or vercel deploy
GitHub Pages            Push to gh-pages branch
AWS S3 + CloudFront     Upload files to S3, serve via CloudFront
Firebase Hosting        firebase deploy
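
For example, a command-line deploy of the current directory to Cloudflare Pages looks roughly like this (the project name is illustrative):

# Upload the current directory to Cloudflare Pages; Wrangler prints the *.pages.dev URL
npx wrangler pages deploy . --project-name=aiscouncil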

While only index.html is strictly required, deploying these additional files enables PWA features:

your-server/
  index.html            # The complete application (required)
  sw.js                 # Service worker for offline support
  manifest.webmanifest  # PWA manifest (name, icons, theme)
  icon.svg              # Vector icon
  icon-192.png          # Android home screen icon
  icon-512.png          # Splash screen / maskable icon
  favicon.ico           # Browser tab icon

Optional additions:

  s/
    index.html          # Web editor (if you want the editor too)
  registry/
    models.json         # Local model registry (if you want offline model data)
  ads.json              # Ad configuration (only if using the free tier ad system)

Docker Deployment

A minimal Docker deployment using Nginx:

FROM nginx:alpine
COPY index.html /usr/share/nginx/html/
COPY sw.js /usr/share/nginx/html/
COPY manifest.webmanifest /usr/share/nginx/html/
COPY icon.svg /usr/share/nginx/html/
COPY icon-192.png /usr/share/nginx/html/
COPY icon-512.png /usr/share/nginx/html/
COPY favicon.ico /usr/share/nginx/html/
EXPOSE 80

Build and run:

docker build -t aiscouncil .
docker run -d -p 8080:80 aiscouncil
# Open http://localhost:8080

For HTTPS with Docker, use a reverse proxy like Traefik or Caddy in front of the Nginx container.
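
A minimal Docker Compose sketch of that pairing, assuming the aiscouncil image built above and a DNS record for aiscouncil.example.com pointing at the host:

# docker-compose.yml (illustrative; adjust the domain and image name)
services:
  app:
    image: aiscouncil
  caddy:
    image: caddy:alpine
    ports:
      - "80:80"
      - "443:443"
    # Caddy obtains certificates automatically and proxies to the Nginx container
    command: caddy reverse-proxy --from aiscouncil.example.com --to app:80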

What Runs Locally

Everything. The HTML file contains all CSS, JavaScript, and module code inline. When a user opens the page:

  • The UI renders entirely client-side
  • API keys are stored in the user's browser (localStorage)
  • Chat history and bot configs are stored in the user's browser (IndexedDB)
  • LLM API calls go directly from the browser to the provider (Anthropic, OpenAI, xAI, etc.)
  • No data passes through your server

Your server serves a static file. That is its only job.
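
You can confirm this from the browser devtools console; the exact storage keys and database names are internal details and may vary between versions:

// Everything is stored client-side; inspect it in the devtools console
Object.keys(localStorage);                            // settings and API keys
indexedDB.databases().then(dbs => console.log(dbs));  // chat history and bot configs (in browsers that support databases())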

Guest Mode

When hosted on a custom domain (not aiscouncil.com), the app operates in guest mode:

  • No Google Sign-In required -- the login gate is skipped
  • All features work without authentication
  • Users go directly to the chat interface
  • Settings, storage, and all features are fully functional
info

Guest mode activates automatically when the app detects it is running on localhost, a .pages.dev domain, or any domain other than aiscouncil.com. No configuration needed.
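
To see which hostname a given deployment reports (and therefore which mode it will run in), check it in the devtools console:

// Any value other than "aiscouncil.com" means guest mode
console.log(location.hostname);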

Custom API Base

By default, the app sends auth, billing, and usage tracking requests to https://api.aiscouncil.com/v1. For self-hosted deployments, you can override this:

localStorage.setItem('ais-api-base', 'https://your-api.example.com/v1');

This is optional. If you are not using the managed billing/auth features, the app works without any API base -- all LLM calls go directly from the browser to the provider.
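
To go back to the default endpoint later, remove the override:

// Falls back to the default https://api.aiscouncil.com/v1
localStorage.removeItem('ais-api-base');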

Local Proxy (aiscouncil-serve)

If you want to avoid exposing API keys in the browser (e.g., in a shared office environment), you can run the aiscouncil-serve local proxy:

ANTHROPIC_API_KEY=sk-ant-... \
OPENAI_API_KEY=sk-... \
XAI_API_KEY=xai-... \
./aiscouncil-serve --port 8741

The proxy routes requests by model prefix to the correct provider:

Model Prefix                                          Provider             Auth
claude-*                                              Anthropic            x-api-key header
gpt-*, o1*, o3*                                       OpenAI               Bearer token
grok-*                                                xAI                  Bearer token
meta-llama/, deepseek/, qwen/, mistralai/, google/    OpenRouter           Bearer token
Everything else                                       Ollama (localhost)   None

The proxy also accepts an X-Provider header to force a specific provider regardless of model prefix.
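
As an illustration only (the request path is a placeholder, not a documented route), forcing a request through OpenRouter despite a claude- model prefix might look like:

# Placeholder path -- check the aiscouncil-serve source for the actual route
curl http://localhost:8741/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -H 'X-Provider: openrouter' \
  -d '{"model": "claude-sonnet-4", "messages": [{"role": "user", "content": "Hello"}]}'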

Build from source:

cd serve
../tools/zig/zig build -Doptimize=ReleaseSmall
# Binary at: zig-out/bin/aiscouncil-serve (~2.3 MB)

Offline Deployment

For air-gapped environments:

  1. Deploy the files to an internal web server
  2. On first visit, the service worker caches everything
  3. After caching, the app works without any network access
  4. Users can chat with Ollama or any local LLM running on the same network

Preload the model registry by placing registry/models.json alongside index.html. The app checks for a local copy before trying to fetch from GitHub.
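
For example, following the Nginx layout from earlier, copy a registry file you obtained on a connected machine into place:

# Paths follow the Nginx example above; adjust to your document root
mkdir -p /var/www/aiscouncil/registry
cp models.json /var/www/aiscouncil/registry/models.json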

CORS Considerations

When using local inference servers (Ollama, vLLM, etc.), the browser enforces CORS (Cross-Origin Resource Sharing). Your inference server must allow requests from your hosting origin.

Ollama:

# Set before starting Ollama
OLLAMA_ORIGINS=* ollama serve

vLLM:

vllm serve model-name --allowed-origins '*'

General (any server behind Nginx):

add_header 'Access-Control-Allow-Origin' '*' always;
add_header 'Access-Control-Allow-Headers' 'Content-Type, Authorization' always;
add_header 'Access-Control-Allow-Methods' 'POST, OPTIONS' always;
warning

Using * for Access-Control-Allow-Origin is fine for local development but should be restricted to your specific origin in production deployments.
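
In the Nginx snippet above, that means replacing the wildcard with the origin that actually serves index.html, for example:

add_header 'Access-Control-Allow-Origin' 'https://aiscouncil.example.com' always;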

Building from Source

To modify the app and rebuild:

# Clone the repository
git clone https://github.com/nicholasgasior/aiscouncil.com.git
cd aiscouncil.com

# Edit source files in src/ (NOT index.html directly)
# The app is split into ~35 source files that are concatenated

# Build the single-file output
./build.sh

# Verify the build matches
./build.sh --check

The build process concatenates all src/ files in order into a single index.html. No bundler, no transpiler, no npm -- just shell concatenation.

Web Editor

To also host the web editor, include the s/ directory:

your-server/
  index.html        # Bot platform
  s/
    index.html      # Web editor (aiscouncil.com/s/)

The editor uses the same compression codec but with version prefix A instead of B. It is a separate self-contained HTML file with its own feature set (WYSIWYG editing, multi-format support, compression algorithms).

Data Portability

Users can export all their data via Settings > Privacy > Export All Data. The export is a JSON file containing all bot profiles, chat histories, and settings. API keys are excluded from exports. The export can be imported into any other instance of AI Supreme Council.

This means users are never locked into a specific deployment. They can move between self-hosted instances, the official site, or any other host at any time.