

Today we’re releasing Calliope CLI as open source. It’s a multi-model AI agent that lives in your terminal, works with 12+ AI providers, and remembers your projects across sessions.
No more switching between ChatGPT, Claude, and Gemini in browser tabs. No more copy-pasting code back and forth. Just one command:
npm install -g @calliopelabs/cli
calliope
And you’re talking to the world’s best AI models from your terminal.
We kept watching developers do the same dance: copy code from editor, paste into ChatGPT, copy response, paste back, realize it needs context, go back to ChatGPT, explain the context… repeat forever.
The terminal is where developers live. AI should meet them there—with full access to files, git history, and project context.
That’s Calliope CLI.
Switch between providers instantly. No config changes. No new tools.
/provider anthropic # Claude Sonnet 4, Opus, Haiku
/provider openai # GPT-4o, o1, o3-mini
/provider google # Gemini 2.0 Flash, Pro
/provider mistral # Mistral Large, Codestral
/provider groq # Llama 3.3 at lightning speed
/provider deepseek # DeepSeek Coder, Reasoner
/provider ollama # Run anything locally
Already paying for API keys? Use them. 100% BYOK (Bring Your Own Keys)—no middleman, no markup.
Not sure which model to use? Let Calliope decide:
/route on
Now Calliope automatically selects the right model for each task, optimizing for cost and quality without you having to think about it.
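Under the hood, task-based routing is just a mapping from task type to provider and model. Here's a rough sketch of the idea; the categories and model choices below are illustrative assumptions, not Calliope's actual routing rules:

```typescript
// Illustrative sketch of task-based routing.
// The task categories and model names are assumptions for explanation only.
type Task = "quick-question" | "code-generation" | "deep-reasoning";

const ROUTES: Record<Task, { provider: string; model: string }> = {
  "quick-question": { provider: "groq", model: "llama-3.3-70b" },
  "code-generation": { provider: "anthropic", model: "claude-sonnet" },
  "deep-reasoning": { provider: "openai", model: "o1" },
};

function routeTask(task: Task) {
  // Pick the cheapest model that is still strong enough for the task.
  return ROUTES[task];
}

console.log(routeTask("code-generation")); // { provider: "anthropic", model: "claude-sonnet" }
```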
Calliope doesn’t just chat. It does things.
| Tool | What It Does |
|---|---|
| shell | Run any command |
| read_file / write_file | Read and modify your code |
| execute_code | Run Python, Node, Bash in a sandbox |
| git | Commit, push, and diff safely |
| web_search | Look up docs and solutions |
| mermaid | Generate diagrams |
Ask it to “fix the failing tests” and watch it run the tests, read the errors, modify the code, and verify the fix—all autonomously.
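Agent tools like these generally reduce to a name, a description the model reads, and an execute function. Here's a generic sketch of that shape; it illustrates the pattern, not Calliope's actual tool API:

```typescript
// Generic sketch of an agent tool shape; not Calliope's actual API.
import { exec } from "node:child_process";
import { promisify } from "node:util";

const run = promisify(exec);

interface AgentTool {
  name: string;                        // e.g. "shell", "read_file"
  description: string;                 // read by the model to decide when to call the tool
  execute(args: Record<string, string>): Promise<string>;
}

// Hypothetical example: a shell tool that runs a command and returns stdout.
const shellTool: AgentTool = {
  name: "shell",
  description: "Run a shell command and return its output.",
  async execute(args) {
    const { stdout } = await run(args.command);
    return stdout;
  },
};
```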
Independent operations run in parallel automatically:
[Executing 3 tools in parallel]
├── [git] status
├── [shell] npm test
└── [read_file] package.json
2-5x faster than sequential execution.
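The speedup is the usual concurrency math: independent, I/O-bound tool calls overlap instead of queueing. A minimal sketch of the difference, not Calliope's internals:

```typescript
// Why independent tool calls are faster in parallel: each call is I/O-bound,
// so awaiting them together overlaps the waiting instead of stacking it.
async function runSequential(calls: Array<() => Promise<string>>) {
  const results: string[] = [];
  for (const call of calls) results.push(await call()); // total ≈ sum of call times
  return results;
}

async function runParallel(calls: Array<() => Promise<string>>) {
  return Promise.all(calls.map((call) => call())); // total ≈ the slowest single call
}
```

With a few independent calls of similar latency, that overlap is where the 2-5x comes from.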
This is where it gets interesting. Ask Calliope to do something complex:
/loop "Refactor all TypeScript files to use strict mode. Run tsc after each change. Output DONE when there are no errors." --max-iterations 30
Calliope will iterate autonomously until the task completes or it hits the iteration limit.
You can walk away and come back to a completed task.
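Conceptually, /loop is a bounded iterate-until-done pattern: run a step, check for the completion marker, stop at the iteration cap. A sketch of that shape, where runAgentStep is a hypothetical stand-in for one agent turn:

```typescript
// Conceptual sketch of a bounded agent loop; runAgentStep is hypothetical.
async function loop(
  instruction: string,
  runAgentStep: (prompt: string) => Promise<string>,
  maxIterations = 30,
) {
  for (let i = 0; i < maxIterations; i++) {
    const output = await runAgentStep(instruction);
    if (output.includes("DONE")) return { done: true, iterations: i + 1 };
  }
  return { done: false, iterations: maxIterations };
}
```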
Create a CALLIOPE.md file in your project:
/memory init
/memory add context "This is a TypeScript monorepo using pnpm"
/memory add preference "Use functional components with hooks"
Now every session starts with that context. No more re-explaining your project structure.
Calliope also automatically loads context from:
- README.md and ARCHITECTURE.md
- .cursorrules
- .github/copilot-instructions.md
- CLAUDE.md (if you’re already using Claude)

Your AI finally understands your codebase.
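Project memory like this usually comes down to reading known context files and prepending them to the system prompt at session start. A rough sketch of the idea; the loader below is illustrative, not Calliope's implementation:

```typescript
import { readFile } from "node:fs/promises";
import { join } from "node:path";

// Illustrative only: gather known context files and stitch them into one prompt prefix.
const CONTEXT_FILES = [
  "CALLIOPE.md",
  "README.md",
  "ARCHITECTURE.md",
  ".cursorrules",
  ".github/copilot-instructions.md",
  "CLAUDE.md",
];

async function loadProjectContext(projectRoot: string): Promise<string> {
  const parts: string[] = [];
  for (const file of CONTEXT_FILES) {
    try {
      parts.push(`# ${file}\n${await readFile(join(projectRoot, file), "utf8")}`);
    } catch {
      // File not present in this project; skip it.
    }
  }
  return parts.join("\n\n");
}
```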
We didn’t build this to let AI run wild on your system.
Scope Management:
/scope add ~/projects/myapp # Allow access
/scope remove /tmp # Restrict access
Risk Assessment: Every operation is classified by risk level. High-risk commands (rm, git push, chmod) require confirmation unless you explicitly enable god mode.
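A gate like that can be as simple as pattern matching on the command before it runs. An illustrative sketch; the patterns and levels are assumptions, not Calliope's actual rules:

```typescript
// Illustrative risk gate; patterns and levels are assumptions, not Calliope's rules.
type Risk = "low" | "high";

const HIGH_RISK = [/\brm\b/, /\bgit\s+push\b/, /\bchmod\b/];

function assessRisk(command: string): Risk {
  return HIGH_RISK.some((pattern) => pattern.test(command)) ? "high" : "low";
}

function needsConfirmation(command: string, godMode: boolean): boolean {
  return !godMode && assessRisk(command) === "high";
}

console.log(needsConfirmation("git push origin main", false)); // true
console.log(needsConfirmation("ls -la", false));               // false
```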
Sandboxed Code Execution:
The execute_code tool runs in Docker containers with resource limits, timeouts, and network isolation.
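To make "resource limits, timeouts, and network isolation" concrete, here's a hedged sketch using standard Docker flags; the image, limits, and timeout are illustrative, not Calliope's exact configuration:

```typescript
import { execFile } from "node:child_process";

// Sketch: run untrusted code in a throwaway container with standard Docker flags.
// Image, limits, and timeout values are illustrative choices.
function runSandboxed(code: string, timeoutMs = 30_000): Promise<string> {
  return new Promise((resolve, reject) => {
    execFile(
      "docker",
      [
        "run", "--rm",
        "--network", "none",   // no network access
        "--memory", "256m",    // cap memory
        "--cpus", "0.5",       // cap CPU
        "python:3.12-slim",
        "python", "-c", code,
      ],
      { timeout: timeoutMs },  // kill the docker process if it runs too long
      (err, stdout) => (err ? reject(err) : resolve(stdout)),
    );
  });
}
```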
| Mode | What It Does |
|---|---|
| Plan | Chat only, no execution. Design before you build. |
| Hybrid | Smart planning before complex operations. (Default) |
| Work | Direct execution. For when you know what you’re doing. |
/mode plan # Discuss architecture
/mode work # Ship the feature
Configuration Profiles:
/profile save work # Save current settings
/profile save experiments # Different settings
/profile work # Switch instantly
Conversation Branching:
/branch new "try-redis" # Fork the conversation
/branch switch main # Go back to original
Explore different approaches without losing context.
Know what you’re spending:
/cost
See token usage and costs by provider, session, and cumulative totals. No surprises.
The code is on GitHub. Read it, fork it, contribute to it.
github.com/calliopeai/calliope-cli
git clone https://github.com/calliopeai/calliope-cli
We believe AI development tools should be open. The community makes them better.
Install:
npm install -g @calliopelabs/cli
Run:
calliope
Configure: The setup wizard guides you through selecting a provider and entering your API key. Or just set environment variables:
export ANTHROPIC_API_KEY=sk-ant-...
export OPENAI_API_KEY=sk-...
calliope --skip-setup
Start building:
> Help me write tests for the auth module
> Fix the TypeScript errors in src/
> Explain what this regex does
> Refactor this function to use async/await
We’re actively developing Calliope. Star the repo. Open issues. Send PRs. Let’s build this together.
