
LLM tools

16th November 2025

I believe the key to using LLMs effectively is providing exactly the right context. I typically do not use MCP servers or include bloated context. My preferred system prompt is "Answer concisely", which limits both token output and spend. These are the tools I have tried.

TUIs

Crush. Alternatives I looked at:

  • Codex CLI
  • Claude Code
  • Gemini CLI
  • Opencode

I avoid these tools because they grant excessive access to the terminal. I ran them in Docker containers to limit that access, but their context is still bloated with available terminal commands and file listings (even in empty directories), and they are specifically tuned for software development. I cannot use them for general-purpose prompts.
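The Docker sandboxing mentioned above can be sketched as a small wrapper script. This is a minimal sketch under assumptions: the image name `codex-cli` is a placeholder, and the flags shown are one plausible lockdown, not a vetted security boundary.

```shell
#!/bin/sh
# Hypothetical sandbox wrapper for a coding TUI.
# --network none  : no outbound access except what the tool truly needs
# --cap-drop ALL  : drop all Linux capabilities
# --read-only     : read-only root filesystem; only the project dir is writable
SANDBOX_FLAGS="--rm -it --network none --cap-drop ALL --read-only"
PROJECT_MOUNT="-v $(pwd):/work -w /work"

# Print the command instead of running it, so it can be inspected first.
echo docker run $SANDBOX_FLAGS $PROJECT_MOUNT codex-cli
```

Even with this lockdown, the point above still holds: the tool's own context remains bloated with terminal and filesystem detail.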

Inline autocomplete

GitHub Copilot. Alternatives I looked at:

  • Supermaven

There aren't really any competitors in this space. GitHub Copilot is the one I always come back to because it interrupts my flow only minimally.

IDEs

Zed editor. Alternatives I looked at:

  • VSCode GitHub Copilot
  • Cursor, and other VSCode forks
  • JetBrains AI

Zed and VSCode are my choices. I can provide exactly the context I want by highlighting code: specific tokens, lines, files, or folders. LLM-generated Git commit messages are a nice bonus. I have found great success with this category.

General chat

A personal LLM CLI tool. Alternatives I looked at:

  • ChatGPT website, and other provider websites
  • ChatGPT app, and other provider apps

I use my personal LLM CLI tool because I am certain my API keys are secure, and because I am unwilling to pay a costly monthly subscription; I only want to be billed for usage. On mobile, the free apps are a fine alternative to Google.
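A personal CLI like this can be very small. Below is a minimal sketch, assuming an OpenAI-compatible chat-completions endpoint; `API_URL`, `MODEL`, and the `OPENAI_API_KEY` environment variable name are placeholders, not details from the tool described above.

```python
#!/usr/bin/env python3
"""Minimal sketch of a personal LLM CLI, billed purely on usage."""
import json
import os
import sys
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"  # assumed endpoint
MODEL = "gpt-4o-mini"  # placeholder model name


def build_payload(prompt: str) -> dict:
    # "Answer concisely" keeps output tokens, and therefore spend, down.
    return {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": "Answer concisely"},
            {"role": "user", "content": prompt},
        ],
    }


def ask(prompt: str) -> str:
    # The key lives only in the local environment, never in any repo.
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__" and len(sys.argv) > 1:
    print(ask(" ".join(sys.argv[1:])))
```

Because the key is read from the environment at call time and the request is built by hand, there is no intermediate service holding credentials, and every invocation maps directly to metered API usage.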