docs: Tidy

This commit is contained in:
Triex 2025-06-05 05:17:21 +10:00
parent 90627c6d54
commit 5ff856c018


@@ -59,10 +59,11 @@ Current LLM inference is dominated by Python/PyTorch, which introduces:
- `WebSocket /ws` - Streaming inference
### Deployment Vision
- - **Docker containers** for cloud deployment
- - **Static binaries** for edge devices
- - **WebAssembly** for browser inference
- - **Serverless functions** for auto-scaling
+ - **Static binaries** - Single file deployment, no dependencies
+ - **Direct VPS deployment** - Copy binary and run with systemd
+ - **Edge devices** - ARM/RISC-V cross-compilation
+ - **Serverless functions** - Minimal cold start with static linking
+ - **WebAssembly** - Browser inference without additional runtime
## Implementation Plan
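The "Direct VPS deployment" bullet in the diff above can be sketched as a minimal systemd unit. This is an illustrative fragment, not part of the commit: the binary name `inference-server`, its install path, and the `--port` flag are all assumptions.

```ini
# /etc/systemd/system/inference-server.service
# Hypothetical unit file: binary name, path, and flags are assumptions.
[Unit]
Description=LLM inference server (static binary)
After=network.target

[Service]
# Single static binary copied to the host; no runtime dependencies needed.
ExecStart=/usr/local/bin/inference-server --port 8080
Restart=on-failure
# Run under a transient, unprivileged user allocated by systemd.
DynamicUser=yes

[Install]
WantedBy=multi-user.target
```

With the unit in place, `systemctl enable --now inference-server` starts the service and keeps it running across reboots, which matches the "copy binary and run" deployment model the list describes.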