Edgee AI Gateway is a unified, OpenAI-compatible API that sits between your application and LLM providers. It’s designed to help teams ship faster while keeping routing, reliability, observability, and privacy controls in one place.
Edgee AI Gateway is still under active development. We’re building fast, shipping incrementally, and writing docs in parallel. If you need something more mature today, jump to the Edgee Proxy documentation.
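Because the gateway exposes an OpenAI-compatible API, requests keep the same shape your application already uses with `api.openai.com`. The sketch below builds a standard chat-completions payload; the base URL and model name are placeholders, not confirmed Edgee values — check the gateway’s own configuration docs for the real endpoint.

```python
import json

# Hypothetical gateway endpoint -- substitute your actual Edgee AI Gateway URL.
BASE_URL = "https://ai-gateway.example.com/v1"

def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-style chat completions payload.

    Since the gateway is OpenAI-compatible, the payload your app already
    sends to an OpenAI endpoint should work unchanged; only the base URL
    (and credentials) point at the gateway instead of the provider.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_chat_request("gpt-4o-mini", "Hello!")
print(json.dumps(payload))
```

In practice you would send this payload as a `POST` to `{BASE_URL}/chat/completions`, or point an existing OpenAI client library’s base URL at the gateway.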

What the AI Gateway focuses on

These are the core capabilities we’re designing the gateway around:

What to expect (right now)

  • A product in progress: features and APIs may evolve as we learn from real production use cases.
  • Clear defaults, configurable controls: the goal is to reduce “LLM glue code” while keeping you in charge.
  • Docs expanding quickly: each feature page will get deeper guides, examples, and best practices as we ship.

Looking for the mature platform?

Edgee Proxy has a large set of production-ready capabilities and much deeper documentation today: