# Chat & Prompts
Send prompts, chat interactively, and include file context
## Single Prompt
Send a one-off prompt and get a streamed response:
```shell
melious run "Explain the difference between TCP and UDP"
```

## Interactive Chat
Start a conversation with message history:
```shell
melious run -i
```

Or simply run without a prompt on a terminal; the CLI enters interactive mode automatically:
```shell
melious run
```

Interactive mode keeps your conversation history in memory. Use `--save-on-exit session.json` to persist it, and `--load session.json` to resume later.
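The session flags combine naturally with interactive mode. As a sketch, a shell wrapper that always resumes and persists the same session file (the `chat` name and the `~/.melious-session.json` path are arbitrary choices here, not part of melious):

```shell
# Hypothetical wrapper: keep one running session file across chats.
# The function name and session path are arbitrary, not melious features.
chat() {
  local session="$HOME/.melious-session.json"
  if [ -f "$session" ]; then
    # Resume the existing conversation and save it again on exit.
    melious run -i --load "$session" --save-on-exit "$session"
  else
    # First run: no file to load yet, just save on exit.
    melious run -i --save-on-exit "$session"
  fi
}
```

Dropping this in your shell profile gives you a single ongoing conversation you can return to at any time.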
## File Context
Include files alongside your prompt using `-f`:

```shell
melious run -f src/main.go "Explain what this code does"
```

Include multiple files:

```shell
melious run -f src/main.go -f src/handler.go "How do these files interact?"
```

Glob patterns are supported:

```shell
melious run -f "src/**/*.ts" "Find potential bugs in this codebase"
```

Files are injected into the message before your prompt text.
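Since `-f` is repeatable and accepts globs, a small helper can bundle a file pattern with a question. A sketch (the `askcode` name is an arbitrary choice, not a melious feature):

```shell
# Hypothetical helper: ask a question about files matching a glob.
# Quote the pattern when calling so melious receives the glob itself.
askcode() {
  local pattern=$1
  shift
  melious run -f "$pattern" "$*"
}
```

Usage would look like `askcode "src/**/*.go" "Where is the HTTP router configured?"`.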
## System Prompts
Set a system prompt inline:

```shell
melious run --system "You are a senior Go developer. Be concise." "Review this function"
```

Or load one from a file:

```shell
melious run --system-file rules.md "Refactor the auth module"
```

Set a default system prompt for all runs:

```shell
melious config set run.system_prompt "Always respond in bullet points"
```

## Piping
Pipe input from other commands:

```shell
cat error.log | melious run "What's causing these errors?"
git diff | melious run "Summarize these changes"
```

Use `--raw` for clean output without formatting (ideal for scripts and pipes):

```shell
cat data.json | melious run --raw "Extract all email addresses" > emails.txt
```

## Session Persistence
Save your conversation on exit:

```shell
melious run -i --save-on-exit chat.json
```

Resume a previous conversation:

```shell
melious run -i --load chat.json
```

Combine both to maintain a running session file:

```shell
melious run -i --load chat.json --save-on-exit chat.json
```

## Parameters
| Flag | Type | Default | Description |
|---|---|---|---|
| `--model` | string | from config | Model to use |
| `--preset` | string | from config | Routing preset (`balanced`, `speed`, `price`, `eco`) |
| `--temperature` | float | 0.7 | Sampling temperature (0-2) |
| `--max-tokens` | int | 4096 | Maximum tokens in response |
| `--top-p` | float | — | Top-p (nucleus) sampling |
| `-i, --interactive` | bool | false | Interactive chat mode |
| `-f, --file` | string[] | — | Include file(s) as context (repeatable) |
| `--system` | string | — | System prompt |
| `--system-file` | string | — | System prompt from file |
| `--raw` | bool | false | Raw output, no formatting |
| `--no-stream` | bool | false | Wait for the full response (no streaming) |
| `--show-env` | bool | false | Show environmental impact |
| `--load` | string | — | Load conversation from file |
| `--save-on-exit` | string | — | Auto-save conversation on exit |
`--show-env` forces non-streaming mode because environmental impact data is only available in the complete response.
## Examples
Chat about a codebase:

```shell
melious run -f "src/**/*.py" "What does this project do?"
```

Quick translation:

```shell
echo "Hello, how are you?" | melious run --raw "Translate to German"
```

Use a specific model with a preset:

```shell
melious run --model deepseek-r1-0528 --preset speed "Solve this: 2^10 + 3^7"
```

Use a low temperature for more deterministic output:

```shell
melious run --temperature 0.1 "List the planets in our solar system"
```
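The piping and `--raw` flags also lend themselves to scripting. As one more sketch, a helper that drafts a commit message from staged changes (the `aicommit` name and the prompt wording are assumptions, not melious features):

```shell
# Hypothetical helper: draft a commit message from staged changes.
# Relies only on documented flags: --raw for unformatted, pipe-friendly output.
aicommit() {
  git diff --staged | melious run --raw "Write a one-line commit message for this diff"
}
```

You could then review the suggestion with `aicommit` before passing it to `git commit -m`.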