GIVERNY

An orchestration layer for AI coding assistants that prevents context blowout on complex tasks.

Claude Code GitHub Copilot Cursor AI Orchestration Bash

An orchestration framework that keeps AI coding assistants productive when tasks grow beyond a single prompt.

The problem

AI-powered coding assistants excel at small, localized edits, but they struggle with non-trivial, multi-file tasks. Context windows fill up, the model loses track of earlier decisions, and output quality degrades. The bigger the task, the worse the problem.

The solution

GIVERNY sits above your AI assistant as a lightweight orchestration layer. It enforces a phased workflow — RESEARCH, PLAN, IMPLEMENT, COMMIT — where each phase produces artifacts persisted to disk. The next phase reads those artifacts instead of relying on raw conversational context, so the context window never bloats.
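The artifact handoff between phases can be sketched roughly like this. The directory and file names below are illustrative assumptions, not GIVERNY's actual layout:

```shell
# Hypothetical artifact directory -- the real paths may differ.
mkdir -p .giverny

# Each phase writes its output to disk; the next phase reads the file,
# not the chat history, so conversational context stays small.
echo "findings: auth flow lives in src/auth/" > .giverny/research.md
echo "step 1: extract token refresh into a helper" > .giverny/plan.md

# The IMPLEMENT phase starts from the persisted plan, not the transcript:
cat .giverny/plan.md
```

Because every phase boundary is a file on disk, a run can also be resumed or inspected after the fact without replaying the conversation.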

The orchestrator itself never writes code. It decomposes work into atomic, sandboxed subagent calls — each scoped to specific files with clear success criteria. Multiple subagents can run in parallel when tasks are independent, or sequentially when outputs feed into the next step.
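In shell terms, the parallel/sequential split looks something like the sketch below, where `subagent` is a hypothetical stand-in for one scoped assistant call:

```shell
# Hypothetical stand-in for a sandboxed subagent call, scoped to given files.
subagent() { echo "[$1] done: $2"; }

# Independent tasks fan out in parallel...
subagent fix-tests "tests/test_api.py" &
subagent update-docs "docs/usage.md" &
wait

# ...while dependent tasks run sequentially: the second call
# consumes the first call's persisted output.
subagent refactor "src/core.py" > step1.log
subagent review "$(cat step1.log)"
```

The narrow scoping is the point: each call sees only the files and success criteria it needs, so no single agent's context has to hold the whole task.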

Multi-tool support

GIVERNY works with Claude Code, GitHub Copilot, and Cursor. An interactive installer (install.sh) handles setup across all three, injecting the orchestration config into each tool’s native instruction format.
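The core idea of the installer can be sketched as follows. The target file names reflect each tool's common instruction-file convention, and the config string is a placeholder — the real install.sh is interactive and may behave differently:

```shell
# Sketch of the installer's core move: append the shared orchestration
# config to each tool's native instruction file. Paths follow common
# conventions (CLAUDE.md for Claude Code, .cursorrules for Cursor,
# .github/copilot-instructions.md for Copilot); contents are placeholders.
CONFIG="Follow the GIVERNY phases: RESEARCH, PLAN, IMPLEMENT, COMMIT."

for target in CLAUDE.md .cursorrules .github/copilot-instructions.md; do
  mkdir -p "$(dirname "$target")"
  printf '%s\n' "$CONFIG" >> "$target"
  echo "injected orchestration config into $target"
done
```

Writing to each tool's own instruction format means the same workflow rules apply regardless of which assistant picks up the task.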

Why the name

Named after Claude Monet’s garden at Giverny — a carefully orchestrated system where individual elements combine into something greater than the sum of their parts.

Acknowledgments

The original GIVERNY was created by Felix Söderström. If you are one of Felix’s students, say hello from Otto.