Melious CLI

Chat & Prompts

Send prompts, chat interactively, and include file context

Single Prompt

Send a one-off prompt and get a streamed response:

melious run "Explain the difference between TCP and UDP"

Interactive Chat

Start a conversation with message history:

melious run -i

Or run without a prompt in a terminal; the CLI enters interactive mode automatically:

melious run

Interactive mode keeps your conversation history in memory. Use --save-on-exit session.json to persist it, and --load session.json to resume later.


File Context

Include files alongside your prompt using -f:

melious run -f src/main.go "Explain what this code does"

Include multiple files:

melious run -f src/main.go -f src/handler.go "How do these files interact?"

Glob patterns are supported:

melious run -f "src/**/*.ts" "Find potential bugs in this codebase"

Files are injected into the message before your prompt text.
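As a rough illustration, with -f src/main.go the message is assembled along these lines (the exact delimiters are an internal detail and may differ):

```
File: src/main.go
<contents of src/main.go>

Explain what this code does
```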


System Prompts

Set a system prompt inline:

melious run --system "You are a senior Go developer. Be concise." "Review this function"

Or load from a file:

melious run --system-file rules.md "Refactor the auth module"

Set a default system prompt for all runs:

melious config set run.system_prompt "Always respond in bullet points"

Piping

Pipe input from other commands:

cat error.log | melious run "What's causing these errors?"
git diff | melious run "Summarize these changes"

Use --raw for clean output without formatting (ideal for scripts and pipes):

cat data.json | melious run --raw "Extract all email addresses" > emails.txt
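Because --raw emits plain text, melious composes well inside shell scripts. A sketch, assuming melious is installed and configured; the loop and one-line summaries are illustrative, not a built-in feature:

```shell
#!/bin/sh
# Print a one-line summary for every staged file.
for f in $(git diff --cached --name-only); do
  printf '%s: ' "$f"
  # ":$f" reads the staged version of the file from the index.
  git show ":$f" | melious run --raw "Summarize this file in one sentence"
done
```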

Session Persistence

Save your conversation on exit:

melious run -i --save-on-exit chat.json

Resume a previous conversation:

melious run -i --load chat.json

Combine both to maintain a running session file:

melious run -i --load chat.json --save-on-exit chat.json
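A small wrapper script is one way to keep a named, long-lived session. A sketch, assuming a POSIX shell and that the session file was created by an earlier run with --save-on-exit:

```shell
#!/bin/sh
# Hypothetical wrapper: resume a named session and re-save it on exit.
SESSION="$HOME/.melious/work-session.json"
melious run -i --load "$SESSION" --save-on-exit "$SESSION"
```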

Parameters

Flag | Type | Default | Description
--- | --- | --- | ---
--model | string | from config | Model to use
--preset | string | from config | Routing preset (balanced, speed, price, eco)
--temperature | float | 0.7 | Sampling temperature (0-2)
--max-tokens | int | 4096 | Maximum tokens in response
--top-p | float | (none) | Top-p nucleus sampling
-i, --interactive | bool | false | Interactive chat mode
-f, --file | string[] | (none) | Include file(s) as context (repeatable)
--system | string | (none) | System prompt
--system-file | string | (none) | System prompt from file
--raw | bool | false | Raw output, no formatting
--no-stream | bool | false | Wait for full response (no streaming)
--show-env | bool | false | Show environmental impact
--load | string | (none) | Load conversation from file
--save-on-exit | string | (none) | Auto-save conversation on exit

--show-env forces non-streaming mode because environmental impact data is only available in the complete response.


Examples

Chat about a codebase:

melious run -f "src/**/*.py" "What does this project do?"

Quick translation:

echo "Hello, how are you?" | melious run --raw "Translate to German"

Use a specific model with a preset:

melious run --model deepseek-r1-0528 --preset speed "Solve this: 2^10 + 3^7"

Low-temperature for deterministic output:

melious run --temperature 0.1 "List the planets in our solar system"
