Hello World: Introducing See Your AI (`seeyourai`)
Today, we are officially introducing See Your AI (or seeyourai for short).
As we move into a world where AI agents are doing more of our heavy lifting—writing code, managing deployments, and conducting research—we’ve noticed a growing problem: Agents are black boxes.
The Problem: The “Black Box” of Token Usage
If you’ve used tools like Claude Code or Gemini CLI, you know the feeling. You run a command, the agent’s thinking spinner whirls, and eventually, it finishes the task. At the end, you might see a summary: “15,000 tokens used.”
But what does that actually mean?
- How much did that session actually cost in dollars and cents?
- Which files were “hot spots” that kept getting read into context but never actually used?
- Could a smaller, cheaper model have handled that specific task?
Currently, developers are left with abstract metrics and invisible waste. You can’t optimize what you can’t see.
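To make the first of those questions concrete, here is a minimal sketch of how a raw token summary turns into dollars and cents. The per-token prices and the input/output split below are illustrative assumptions, not any provider’s real rates:

```python
# Hypothetical per-token pricing (illustrative only; check your
# provider's actual rate card before relying on these numbers).
PRICE_PER_1M = {"input": 3.00, "output": 15.00}  # USD per million tokens

def session_cost(input_tokens: int, output_tokens: int) -> float:
    """Convert raw token counts into a dollar figure."""
    return (input_tokens * PRICE_PER_1M["input"]
            + output_tokens * PRICE_PER_1M["output"]) / 1_000_000

# A "15,000 tokens used" summary might break down as 12,000 in / 3,000 out:
print(f"${session_cost(12_000, 3_000):.4f}")  # prints $0.0810
```

The point is not the arithmetic; it is that without the input/output breakdown per call, even this trivial calculation is impossible after the fact.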
Enter seeyourai
seeyourai is the essential observability and optimization layer for AI agents. We bridge the gap between static context (the files you provide) and dynamic behavior (what the agent actually does).
Our mission is to make AI agents:
- Observable: See every tool call, every thought, and every cent spent in real-time.
- Debuggable: Trace execution flows to understand exactly where an agent went off the rails.
- Optimizable: Identify “dead context” and get actionable suggestions to reduce your token bill.
Building in Public
We believe the best way to build a developer tool is in the open. That’s why we’re building seeyourai as an open-core platform.
The seeyourai CLI (`seeyourai`) is your local-first workshop. It captures OpenTelemetry traces from your agents and provides a powerful terminal UI (TUI) and web dashboard to explore them. No cloud required: your data stays on your machine.
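As an illustration of what trace capture enables, here is a stdlib-only sketch that aggregates token counts from a batch of exported spans. It assumes span attributes follow the OpenTelemetry GenAI semantic conventions (`gen_ai.usage.input_tokens` / `gen_ai.usage.output_tokens`); real exporters nest span data differently, and this is not seeyourai’s actual data model:

```python
import json

# Toy export: a trace with two LLM calls and one tool call.
spans = json.loads("""[
  {"name": "llm.call", "attributes": {"gen_ai.usage.input_tokens": 9000, "gen_ai.usage.output_tokens": 2000}},
  {"name": "tool.read_file", "attributes": {}},
  {"name": "llm.call", "attributes": {"gen_ai.usage.input_tokens": 3000, "gen_ai.usage.output_tokens": 1000}}
]""")

# Sum token usage across all spans; non-LLM spans contribute zero.
totals = {"input": 0, "output": 0}
for span in spans:
    attrs = span["attributes"]
    totals["input"] += attrs.get("gen_ai.usage.input_tokens", 0)
    totals["output"] += attrs.get("gen_ai.usage.output_tokens", 0)

print(totals)  # {'input': 12000, 'output': 3000}
```

Once usage lives in structured spans rather than a closing summary line, per-file, per-tool, and per-model breakdowns are just different group-bys over the same data.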
What’s Next?
We are just getting started. In the coming weeks, we’ll be sharing more about:
- `seeyourai optimize`: Our engine for pruning dead context and saving you money.
- The seeyourai Gateway: A proxy layer to enforce budgets and provide semantic caching.
- Green Agentics: Our commitment to reducing the environmental impact of redundant LLM compute.
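To give a flavor of budget enforcement, here is a hypothetical sketch of the check a gateway proxy might run before forwarding a request upstream. Every name here is illustrative, not seeyourai’s API:

```python
# Hypothetical budget gate a proxy could consult per request.
class BudgetExceeded(Exception):
    pass

class BudgetGate:
    def __init__(self, limit_usd: float):
        self.limit_usd = limit_usd
        self.spent_usd = 0.0

    def charge(self, cost_usd: float) -> None:
        """Reject the request if it would push spend past the budget."""
        if self.spent_usd + cost_usd > self.limit_usd:
            raise BudgetExceeded(f"would exceed ${self.limit_usd:.2f} budget")
        self.spent_usd += cost_usd

gate = BudgetGate(limit_usd=1.00)
gate.charge(0.40)   # forwarded, total $0.40
gate.charge(0.55)   # forwarded, total $0.95
try:
    gate.charge(0.10)  # would push total to $1.05 -- blocked
except BudgetExceeded as e:
    print(e)
```

The interesting design question, which we’ll cover in a future post, is what to return to the agent when a request is blocked so it can degrade gracefully instead of crashing mid-task.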
We’re excited to have you on this journey. Bring visibility to the invisible.
— Sam