
Calliope AI Lab v1.3: Notebook Generation, Dynamic Models, and 21+ LLM Providers


Your AI Lab Just Got a Lot Smarter

Calliope AI Lab v1.3 is out — and it’s the biggest update since launch. New notebook generation capabilities, dynamic model discovery, and support for 21+ LLM providers make this the most capable version of the platform yet.

Here’s what’s new.

Notebook Generation Mode

The sidebar chat agent can now write directly to your notebooks. Ask it to generate a function, build a visualization, or scaffold a complete analysis pipeline — and it inserts the cells right where you need them.

This isn’t copy-paste. The agent understands your notebook context — what cells exist, what variables are defined, what errors you’re hitting — and generates code that fits. Use the /add command to create new cells from natural language, or /add -r to replace the selected cell with an improved version.

Behind the scenes, this works through WebSocket metadata on agent messages. The agent sends structured notebook actions (insert, replace) with the generated code, and the frontend renders an “Insert into Notebook” button you can click to apply it. Or set it to auto-insert for a fully agentic workflow.
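The exact wire format is internal to Calliope, but the idea can be sketched as follows. The field names below are illustrative assumptions, not the actual protocol:

```python
import json

# Hypothetical shape of an agent message carrying a notebook action.
# Field names are illustrative; the real Calliope protocol may differ.
def make_notebook_action(action, cell_index, code):
    """Build a WebSocket payload asking the frontend to apply a notebook edit."""
    assert action in ("insert", "replace")
    return json.dumps({
        "type": "agent_message",
        "content": "Here is the generated cell.",
        "metadata": {
            "notebook_action": {
                "action": action,        # "insert" or "replace"
                "cell_index": cell_index,
                "cell_type": "code",
                "source": code,
            }
        },
    })

msg = make_notebook_action("insert", 3, "df.describe()")
```

Because the action travels as metadata rather than inline text, the frontend can decide how to surface it — as a confirmation button or as an automatic insert.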

Dynamic Model Discovery

Previously, switching AI models meant editing configuration. Now Calliope AI Lab automatically discovers available models from every connected provider.

Connect your OpenAI key and it pulls the full model list — GPT-4o, o1, o3-mini, everything available on your account. Same for Anthropic, Google, Groq, and every other provider. Local models through Ollama and LM Studio are auto-detected too.

This means you can switch between Claude, GPT-4, Gemini, and your local Llama model mid-conversation without touching a config file.
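A minimal sketch of how discovery like this can work, assuming each provider exposes an OpenAI-style `GET /models` endpoint (true for OpenAI, Groq, and Ollama’s OpenAI-compatible API; Calliope’s actual discovery code is not shown here):

```python
import json
import urllib.request

def list_models(base_url, api_key=None):
    """Fetch model IDs from an OpenAI-compatible /models endpoint."""
    req = urllib.request.Request(f"{base_url}/models")
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return [m["id"] for m in json.load(resp)["data"]]

def merge_catalogs(catalogs):
    """Combine per-provider model lists into one flat, sorted picker list."""
    return sorted(f"{provider}/{model}"
                  for provider, models in catalogs.items()
                  for model in models)
```

Prefixing each model with its provider keeps names like `llama3` unambiguous when the same model is reachable through several backends.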

21+ LLM Providers

The provider ecosystem has expanded significantly:

| Category | Providers |
| --- | --- |
| Major Cloud | Anthropic, OpenAI, Google Gemini, Vertex AI, AWS Bedrock |
| Performance | Groq, Cerebras, SambaNova, Fireworks AI |
| Specialized | DeepSeek, xAI, Perplexity (Sonar), AI21, Cohere |
| Aggregators | Together AI, OpenRouter, NVIDIA, LiteLLM |
| Local | Ollama, LM Studio, GPT4All |

Every provider supports bring-your-own-key (BYOK). No Calliope account required. Your keys stay on your machine, stored in the local secrets manager.
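The post doesn’t describe the secrets manager’s internals, but a minimal file-based sketch of the idea — keys in a user-only-readable local file — might look like this (the path and format are assumptions, not Calliope’s actual storage):

```python
import json
import os
from pathlib import Path

# Hypothetical location; the real secrets manager may store keys elsewhere.
SECRETS_PATH = Path.home() / ".calliope" / "secrets.json"

def save_key(provider, key, path=SECRETS_PATH):
    """Persist an API key locally with owner-only file permissions."""
    path.parent.mkdir(parents=True, exist_ok=True)
    secrets = json.loads(path.read_text()) if path.exists() else {}
    secrets[provider] = key
    path.write_text(json.dumps(secrets))
    os.chmod(path, 0o600)  # owner read/write only; keys never leave the machine

def load_key(provider, path=SECRETS_PATH):
    """Return the stored key for a provider, or None if absent."""
    return json.loads(path.read_text()).get(provider) if path.exists() else None
```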

Notebook Context Awareness

Chat Studio now automatically syncs with your active notebook. When you ask the agent a question, it sees:

  • All cell contents (code and markdown)
  • Recent outputs and execution results
  • Any errors in the current session
  • Variable definitions and data structures

This context-aware analysis means the agent’s suggestions are grounded in what you’re actually working on — not generic responses. Ask it to “fix this error” and it knows exactly which error you mean.
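The context bundle described above can be sketched as a simple serialization step. The structure below is an illustration of the idea, not Calliope’s actual internal schema:

```python
# Hypothetical sketch: summarize a notebook session into a prompt-ready dict
# the agent can ground its answers in.
def build_context(cells, namespace, last_error=None):
    return {
        "cells": [
            {"index": i, "type": c["cell_type"], "source": c["source"],
             "output": c.get("output", "")}
            for i, c in enumerate(cells)
        ],
        # Variable names mapped to their runtime types, not their values.
        "variables": {name: type(value).__name__ for name, value in namespace.items()},
        "last_error": last_error,
    }

ctx = build_context(
    cells=[{"cell_type": "code", "source": "df = load_data()", "output": ""}],
    namespace={"df": [1, 2, 3]},
    last_error="NameError: name 'load_data' is not defined",
)
```

Sending types rather than full variable values keeps the payload small and avoids shipping raw data to a remote model.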

Improved SQL Execution via SSH Tunnels

For teams connecting to databases behind firewalls, the Data Agent now properly executes SQL queries through SSH tunnels. The tunnel manager also supports all common key types — Ed25519, ECDSA, and RSA — instead of hardcoding RSA only.
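Supporting multiple key types typically means dispatching on the private key file’s header instead of assuming one format. The header strings below are standard; the dispatch itself is a sketch, not Calliope’s tunnel-manager code:

```python
# Map standard PEM headers to key families. Ed25519 keys (and any key
# generated by modern OpenSSH) use the unified OpenSSH container format.
KEY_HEADERS = {
    "-----BEGIN OPENSSH PRIVATE KEY-----": "openssh",
    "-----BEGIN RSA PRIVATE KEY-----": "rsa",
    "-----BEGIN EC PRIVATE KEY-----": "ecdsa",
}

def detect_key_type(key_text):
    """Classify an SSH private key by its first line."""
    first_line = key_text.strip().splitlines()[0]
    return KEY_HEADERS.get(first_line, "unknown")
```

With a library like paramiko, the detected family then selects the matching loader class rather than unconditionally using the RSA one.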

macOS Signed and Notarized by Apple

Starting with v1.3, macOS builds are code-signed and notarized through Apple’s developer program. No more Gatekeeper warnings. No right-click workarounds. Double-click the DMG and it just works.

Available Now

Calliope AI Lab v1.3.1 is available for download:

| Platform | Architecture | Format |
| --- | --- | --- |
| macOS | Apple Silicon (M1–M4) | .dmg |
| Windows | x64 | .exe installer |
| Linux | x64 | .deb, .AppImage, .tar.gz |
| Linux | arm64 | .deb, .AppImage, .tar.gz |

Download from the releases page.

Requirements

  • 8 GB RAM minimum, 16 GB recommended
  • 2 GB available disk space
  • macOS 11+ (Apple Silicon), Windows 10+, or Ubuntu 20.04+

What’s Next

We’re working on improved data agent output rendering in notebooks, deeper integration between the chat agent and notebook execution kernels, and expanded magic command capabilities. Stay tuned.
