AI Assistants
You can control Visibox using natural language with an AI assistant like Claude, Cursor, Codex, Windsurf, Gemini CLI, VS Code, Zed, or Cline. Once connected, you talk to your show:
- “Add a Ken Burns effect to the first clip in Song 3.”
- “Create a new Song called Encore and add a Visualizer Clip.”
- “Set the volume to 50% and go fullscreen.”
- “Show me the setlist.”
- “Duplicate ‘Verse’ and rename the copy ‘Bridge’.”
Connecting is a one-time setup. After that, you don’t need to know anything about the technology underneath — you just describe what you want.
How to Connect
The simplest way is to let Visibox auto-install itself into your AI tool of choice.
1. In Visibox, choose Settings > Connect to AI Tools… (on Mac, from the Visibox menu; on Windows, from the File menu).
2. Visibox scans for installed AI tools on your machine and offers to register itself with each one.
3. Restart your AI tool. Visibox is now available.
Once connected, just ask the AI assistant about your project. It will discover Visibox automatically and start responding.
Supported Tools
Visibox can auto-install into any of these:
- Claude Desktop
- Claude Code
- Cursor
- Windsurf
- Codex
- Gemini CLI
- VS Code
- Zed
- Cline
For any other tool that supports the Model Context Protocol, open the Remote Pairing window and click Copy MCP URL to get a URL you can paste into the tool’s config.
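As a rough illustration, a manual entry in an MCP-aware tool's configuration file often looks something like the following. The exact file location and key names vary by tool, and the URL shown is a placeholder; use the one that Copy MCP URL puts on your clipboard:

```json
{
  "mcpServers": {
    "visibox": {
      "url": "http://127.0.0.1:PORT/mcp"
    }
  }
}
```

Check your tool's documentation for where its MCP configuration lives; some tools use a different top-level key or a dedicated settings UI instead of a JSON file.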
What AI Assistants Can Do
An AI assistant connected to Visibox can read and change almost anything about your show:
| Area | Examples |
|---|---|
| Playback | Play, stop, pause, resume, trigger a Clip, jump to a Song, seek to a position, set volume, go fullscreen. |
| Project state | Show the list of Songs and Clips, tell you what’s currently playing, describe any Song or Clip. |
| Editing | Add, duplicate, move, rename, or delete Songs and Clips. Attach audio. Change fade, speed, or filters. |
| Effects | Apply an Effect to a Clip, remove one, or write a brand-new Effect in CSS from your description. |
| Visualizers | Add a Visualizer Clip, swap presets, or create a new ISF or MilkDrop preset from a prompt. |
Because it’s all natural language, the AI will fill in small details for you — you don’t have to remember exact names or IDs. It’s also fine to ask conversational questions like “which Song has the most Clips?” or “is there anything in this project that looks broken?”
Creating Effects and Visualizers with AI
Effects in Visibox are written in CSS, and Visualizers are written in GLSL/ISF or MilkDrop. An AI assistant that understands those languages can:
- Write a new Effect from a description — “give me a warm VHS-like effect with subtle chromatic shift on the beat” and you’ll get a real, working CSS Effect added to your library.
- Modify an existing Effect — “make the Bass Pulse effect more aggressive on the peaks and shorter in duration.”
- Write a new Visualizer — describe the aesthetic and let the AI generate ISF shader code that you can drop into a Song.
When an AI edits an Effect or Visualizer, the changes show up in real time in the Effects Editor or Visualizers Editor if you have them open. You can watch code and the live preview update as the AI iterates, which makes creative back-and-forth feel natural.
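To make the CSS side concrete, here is a hand-written sketch of the kind of Effect an AI might produce for the VHS prompt above. The selector and structure are illustrative assumptions, not Visibox's actual Effect schema:

```css
/* Hypothetical "Warm VHS" Effect (illustrative only; Visibox's real
   Effect format may differ). A warm tint plus a subtle horizontal
   jitter that an AI could retime to match the beat. */
.clip {
  filter: sepia(0.35) saturate(1.15) contrast(1.05);
  animation: vhs-jitter 0.5s steps(2, jump-none) infinite;
}

@keyframes vhs-jitter {
  from { transform: translateX(0); }
  to   { transform: translateX(1.5px); }
}
```

In practice you would describe the look in plain language and let the AI write and refine code like this, watching the live preview as it iterates.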
Tips for Better Results
- Open the Project first. The AI works with the active Project, so make sure the one you want to control is open and frontmost.
- Start simple. Ask “what’s in this project?” to confirm the connection is working before issuing commands.
- Be specific about names. “Play the first Clip in Song 3” is clearer than “play the first clip.” Song and Clip titles help too.
- Undo works. If an AI-made edit isn’t what you wanted, use Edit > Undo (⌘Z on Mac, Ctrl+Z on Windows) just like any other edit.
- You can refuse. The AI doesn’t have secret powers — it uses the same operations you do through the menus. If something would be destructive, you can decline before it runs.
Local vs. Remote
By default, AI assistants running on the same computer as Visibox connect automatically — no configuration needed. To connect an AI tool running on a different machine (a tablet, another laptop, a remote workstation), pair it first via the Remote Pairing window.
Under the Hood
Visibox implements the Model Context Protocol (MCP), an open standard that lets AI assistants talk to external applications. The AI tool loads Visibox’s MCP server at startup and the two communicate over a local connection — your data never leaves your machine unless your AI tool itself sends it out.
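MCP uses JSON-RPC 2.0 framing under the hood. When an AI tool starts up, it sends an initialize request to the server, along the lines of the sketch below; the protocol version string and client name here are examples, not values specific to Visibox:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```

After this handshake, the tool can list and call the operations the server exposes, which is how your assistant discovers what Visibox can do.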
For developers integrating with Visibox at a lower level, see the Visibox API reference.