Google Gemini CLI

Gemini CLI brings Google's Gemini models into your GROOVE orchestration. It's a strong option for high-throughput tasks where Gemini's speed and cost profile make sense.

Installation

```bash
npm install -g @google/gemini-cli
```

Verify it's available:

```bash
gemini --version
```

Authentication

Gemini CLI requires a Google AI API key. Set it through the GUI's provider panel or via the CLI:

```bash
groove set-key gemini AIza-your-api-key
```

GROOVE encrypts the key with AES-256-GCM and stores it locally. The key is passed to the gemini process at spawn time.
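As a concrete illustration of spawn-time key passing, the sketch below scopes a key to a child process only. The `GEMINI_API_KEY` variable name and the inline-assignment pattern are assumptions for illustration, not GROOVE's verified internals:

```shell
# Sketch (assumed variable name): the inline assignment sets GEMINI_API_KEY
# for the spawned child process only, not for the parent shell.
GEMINI_API_KEY="AIza-example-key" sh -c 'echo "child sees: ${GEMINI_API_KEY}"'

# The parent shell never exported it, so it remains unset here:
echo "parent sees: ${GEMINI_API_KEY:-<unset>}"
```

This keeps the decrypted key out of shell history and out of the environment of unrelated processes.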

Models

| Model | Tier | Best For |
| --- | --- | --- |
| gemini-2.5-pro | Heavy | Complex reasoning, large-context tasks |
| gemini-2.5-flash | Medium | Fast iteration, lightweight coding, high-volume tasks |

When adaptive routing is set to Auto, GROOVE classifies tasks and selects the appropriate Gemini model.
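A minimal sketch of what tier-based selection might look like. The tier names follow the table above, but the heuristic itself is illustrative, not GROOVE's actual classifier:

```shell
# Illustrative routing sketch -- not GROOVE's real classifier.
pick_model() {
  case "$1" in
    heavy)  echo "gemini-2.5-pro" ;;    # complex reasoning, large context
    *)      echo "gemini-2.5-flash" ;;  # fast, high-volume default
  esac
}

pick_model heavy   # prints gemini-2.5-pro
pick_model medium  # prints gemini-2.5-flash
```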

GROOVE Integration

Gemini agents are fully integrated into the coordination layer. They receive agent introductions, respect file scope locks, participate in the approval queue, and have their token usage tracked in the dashboard.

Limitations

  • No hot-swap -- model changes require a kill-and-respawn cycle. Context rotation handles this seamlessly.
  • API key billing -- usage is billed to your Google AI account.

When to Use Gemini

Gemini Flash is one of the fastest models available, making it ideal for bulk tasks like generating tests, writing documentation, or processing many small files. Pair it with Claude Code on the same project -- let Claude handle the complex architecture while Gemini handles volume.
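As a sketch of the bulk pattern, the dry-run loop below builds one Gemini invocation per file and prints it instead of executing it. The `-m` and `-p` flags are standard Gemini CLI options; the file names and prompt wording are illustrative:

```shell
# Dry run: build the per-file command rather than spawning gemini.
# Remove the echo indirection to execute for real.
for f in module_a.py module_b.py; do
  cmd="gemini -m gemini-2.5-flash -p \"Write unit tests for $f\""
  echo "$cmd"
done
```

Because each invocation is independent, this kind of fan-out is exactly where Flash's speed and low cost pay off.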

FSL-1.1-Apache-2.0