Full-featured Next.js web interface for Ollama Code
The Web UI package (@ollama-code/web-app) provides a modern web interface for interacting with Ollama models. It includes three main components:
| Component | Description |
|---|---|
| Chat | Real-time streaming chat with model selection |
| Files | File browser with Monaco editor |
| Terminal | Full PTY terminal via WebSocket |
```bash
# Navigate to web-app package
cd packages/web-app

# Install dependencies (if not already installed)
npm install

# Start development server
npm run dev
```
Open http://localhost:3000 in your browser.
For full terminal functionality, use the custom server:
```bash
# Start with WebSocket server for terminal
npm run dev:server
```
```typescript
// Chat messages are streamed in real-time
const response = await fetch('/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3.2',
    messages: [{ role: 'user', content: 'Hello!' }],
    stream: true,
  }),
});

// Stream the response
const reader = response.body!.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  // Process chunk...
}
```
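Each decoded chunk may contain several NDJSON lines, or end in the middle of one. A minimal accumulator for splitting chunks into complete JSON messages might look like this (the helper name is illustrative, not part of the package API):

```typescript
// Hypothetical helper: combine leftover text with a new chunk, emit every
// complete NDJSON line as a parsed object, and return the trailing partial
// line so the caller can carry it into the next read.
function parseNDJSONChunk(
  buffer: string,
  chunk: string,
): { messages: unknown[]; rest: string } {
  const combined = buffer + chunk;
  const lines = combined.split('\n');
  const rest = lines.pop() ?? ''; // last element may be a partial line
  const messages = lines
    .filter((line) => line.trim() !== '')
    .map((line) => JSON.parse(line));
  return { messages, rest };
}
```

In the loop above, `value` would be decoded with a `TextDecoder` and passed as `chunk`, threading `rest` between reads.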
Supported Languages:
| Category | Languages |
|---|---|
| Web | TypeScript, JavaScript, CSS, HTML |
| Backend | Python, Go, Java, PHP |
| Systems | C, C++, Rust |
| Config | JSON, YAML, TOML, Markdown |
```typescript
// Connect to terminal WebSocket
const socket = new WebSocket('ws://localhost:3000/terminal');

// Send input
socket.send(JSON.stringify({ type: 'input', data: 'ls -la\n' }));

// Resize terminal
socket.send(JSON.stringify({ type: 'resize', cols: 120, rows: 40 }));

// Receive output
socket.onmessage = (event) => {
  const message = JSON.parse(event.data);
  if (message.type === 'output') {
    console.log(message.data);
  }
};
```
### `/api/models`

List available Ollama models.

Method: `GET`

Response:
```json
{
  "models": [
    {
      "name": "llama3.2:latest",
      "modified_at": "2025-01-15T12:00:00Z",
      "size": 4869431328
    }
  ]
}
```
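The `size` field is reported in bytes. A small display helper (not part of the package API) could render it in human-readable units:

```typescript
// Illustrative helper: convert a byte count to a short human-readable string.
function formatModelSize(bytes: number): string {
  const units = ['B', 'KB', 'MB', 'GB', 'TB'];
  let value = bytes;
  let i = 0;
  while (value >= 1024 && i < units.length - 1) {
    value /= 1024;
    i++;
  }
  return `${value.toFixed(1)} ${units[i]}`;
}
```

For the example response above, `formatModelSize(4869431328)` yields `"4.5 GB"`.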
### `/api/chat`

Chat with an Ollama model, with streaming support.

Method: `POST`

Request:
```json
{
  "model": "llama3.2",
  "messages": [
    { "role": "user", "content": "Hello!" }
  ],
  "stream": true
}
```
Response: NDJSON stream
### `/api/generate`

Generate text with an Ollama model.

Method: `POST`

Request:
```json
{
  "model": "llama3.2",
  "prompt": "Write a hello world program",
  "stream": true
}
```
### `/api/fs`

Filesystem operations.

| Method | Description |
|---|---|
| `GET` | List directory or read file |
| `POST` | Create file or directory |
| `PUT` | Write file content |
| `DELETE` | Delete file or directory |
Example - List directory:

```
GET /api/fs?path=/
```
Response:
```json
{
  "path": "/",
  "type": "directory",
  "items": [
    { "name": "src", "type": "directory", "size": 0 },
    { "name": "package.json", "type": "file", "size": 1234 }
  ]
}
```
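As a sketch, a client wrapper for these operations might build requests like the following; the `{ content }` body shape for writes is an assumption, not a documented contract:

```typescript
// Hypothetical request builder for the /api/fs endpoint. The path goes in the
// query string (as in the GET example above); write-style methods attach a
// JSON body. The { content } payload shape is assumed for illustration.
function fsRequest(
  path: string,
  method: 'GET' | 'POST' | 'PUT' | 'DELETE',
  content?: string,
): { url: string; init: { method: string; headers?: Record<string, string>; body?: string } } {
  const url = `/api/fs?path=${encodeURIComponent(path)}`;
  const init: { method: string; headers?: Record<string, string>; body?: string } = { method };
  if (content !== undefined) {
    init.headers = { 'Content-Type': 'application/json' };
    init.body = JSON.stringify({ content });
  }
  return { url, init };
}
```

The result can be passed straight to `fetch(url, init)`.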
### `/api/ollama/[...path]`

Proxy for all Ollama API requests.

Examples:
```
GET  /api/ollama/tags  → Ollama /api/tags
POST /api/ollama/show  → Ollama /api/show
POST /api/ollama/embed → Ollama /api/embed
```
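The mapping above suggests the proxy simply swaps the `/api/ollama` prefix for the Ollama server's `/api` prefix. A sketch of that rewrite (assumed behavior, not the actual route implementation):

```typescript
// Illustrative path rewrite: /api/ollama/<rest> → <OLLAMA_URL>/api/<rest>.
function proxyTarget(
  requestPath: string,
  ollamaUrl: string = 'http://localhost:11434',
): string {
  const suffix = requestPath.replace(/^\/api\/ollama/, '/api');
  return `${ollamaUrl}${suffix}`;
}
```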
The terminal server provides WebSocket-based PTY access.
```typescript
interface TerminalServerConfig {
  server: HttpServer;            // HTTP server to attach to
  path?: string;                 // WebSocket path (default: '/terminal')
  shell?: string;                // Shell to use (default: $SHELL or 'bash')
  cols?: number;                 // Initial columns (default: 80)
  rows?: number;                 // Initial rows (default: 24)
  env?: Record<string, string>;  // Environment variables
  cwd?: string;                  // Working directory
  maxSessionsPerIp?: number;     // Max sessions per IP (default: 5)
  sessionTimeout?: number;       // Timeout in ms (default: 30 minutes)
}
```
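The commented defaults can be resolved with a simple merge. This is an illustrative sketch, not the actual `TerminalServer` internals:

```typescript
// Hypothetical defaults resolution mirroring the commented defaults above.
// The env map stands in for process.env so the shell fallback is explicit.
interface TerminalOptions {
  path?: string;
  shell?: string;
  cols?: number;
  rows?: number;
  maxSessionsPerIp?: number;
  sessionTimeout?: number;
}

function resolveTerminalOptions(
  opts: TerminalOptions = {},
  env: Record<string, string | undefined> = {},
) {
  return {
    path: opts.path ?? '/terminal',
    shell: opts.shell ?? env.SHELL ?? 'bash',
    cols: opts.cols ?? 80,
    rows: opts.rows ?? 24,
    maxSessionsPerIp: opts.maxSessionsPerIp ?? 5,
    sessionTimeout: opts.sessionTimeout ?? 30 * 60 * 1000, // 30 minutes in ms
  };
}
```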
Client → Server:

```typescript
// Input
{ type: 'input', data: 'ls -la\n' }

// Resize
{ type: 'resize', cols: 120, rows: 40 }

// Ping
{ type: 'ping' }
```
Server → Client:

```typescript
// Output
{ type: 'output', data: 'file1.txt\nfile2.txt\n' }

// Process exited
{ type: 'exit', code: 0 }

// Error
{ type: 'error', data: 'Maximum sessions reached' }
```
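A client-side handler for these message types can be sketched as a small dispatcher (hypothetical helper, not part of the package):

```typescript
// Discriminated union matching the server → client messages above.
type ServerMessage =
  | { type: 'output'; data: string }
  | { type: 'exit'; code: number }
  | { type: 'error'; data: string };

// Parse a raw WebSocket payload and turn it into display text.
function describeMessage(raw: string): string {
  const msg = JSON.parse(raw) as ServerMessage;
  switch (msg.type) {
    case 'output':
      return msg.data; // would be written to the terminal emulator
    case 'exit':
      return `process exited with code ${msg.code}`;
    case 'error':
      return `terminal error: ${msg.data}`;
  }
}
```

In practice this would run inside `socket.onmessage`, writing output to the terminal component.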
| Variable | Description | Default |
|---|---|---|
| `OLLAMA_URL` | Ollama server URL | `http://localhost:11434` |
| `PROJECT_DIR` | Base directory for file operations | Current directory |
| `PORT` | Server port | `3000` |
| `HOST` | Server host | `localhost` |
```
┌─────────────────────────────────────────────────────────────┐
│                       Browser (React)                       │
├─────────────────────────────────────────────────────────────┤
│  ChatInterface     │    FileExplorer    │ TerminalEmulator  │
│        │           │          │         │        │          │
│        ▼           │          ▼         │        ▼          │
│    /api/chat       │      /api/fs       │ WebSocket         │
│                    │                    │ (/terminal)       │
└─────────────────────────────────────────────────────────────┘
                              │
                              ▼
┌─────────────────────────────────────────────────────────────┐
│                       Next.js Server                        │
├─────────────────────────────────────────────────────────────┤
│        API Routes          │      TerminalServer (PTY)      │
│             │              │               │                │
│             ▼              │               ▼                │
│        Ollama API          │         Shell Process          │
└─────────────────────────────────────────────────────────────┘
```
```bash
# Run tests
npm run test

# Type-check the package
npm run typecheck

# Production build
npm run build

# Start the production server (custom server, with terminal support)
npm run start:server
```
Key source directories:

- `src/components/`
- `src/app/api/`
- `src/stores/`
- `src/types/`

Notes:

- Terminal support requires the custom server: run `npm run dev:server` (not just `npm run dev`) to enable the `/terminal` WebSocket endpoint.
- File operations are rooted at the directory given by the `PROJECT_DIR` environment variable.
- Chat requires an Ollama server reachable at `OLLAMA_URL`; check which models are installed with `ollama list`.

License: Apache License 2.0