
Show HN: Translate LLM API Calls Across OpenAI, Anthropic, and Gemini


I built this after getting tired of maintaining one-off adapters between OpenAI, Anthropic, and Gemini APIs in the same project.

The idea is to translate through a shared intermediate representation instead of writing every provider pair separately. So instead of OpenAI->Anthropic, OpenAI->Gemini, Anthropic->Gemini, etc., each provider just maps to/from the IR.

This is not a unified client like LiteLLM. It's a translator/proxy: give it an OpenAI-style request and it can produce the Anthropic/Gemini equivalent, including streaming, tool calls, and multimodal inputs.

It's open source, and there's also a gateway mode if you want to run it as a proxy in front of multiple providers.

Would especially love feedback on where translation breaks down or gets too lossy across providers.

Docs: https://llm-rosetta.readthedocs.io
Paper/design notes: https://arxiv.org/abs/2604.09360
Comments URL: https://news.ycombinator.com/item?id=47831189
Points: 1
# Comments: 0
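To make the IR idea concrete: with N providers, pairwise adapters need N*(N-1) translations, while a shared IR needs only 2N mappings (one in, one out per provider). Here is a minimal sketch of that shape in Python. All function and field names are illustrative assumptions, not the project's actual API; real translation also has to handle streaming chunks, tool calls, and multimodal content.

```python
# Hypothetical sketch of IR-based request translation. Field names are
# assumptions for illustration, not the llm-rosetta API.

def openai_to_ir(req):
    """Map an OpenAI-style chat request onto a shared intermediate form."""
    system = next(
        (m["content"] for m in req["messages"] if m["role"] == "system"), None
    )
    return {
        "model": req["model"],
        "system": system,
        "turns": [
            {"role": m["role"], "text": m["content"]}
            for m in req["messages"]
            if m["role"] != "system"
        ],
        "max_tokens": req.get("max_tokens"),
    }

def ir_to_anthropic(ir):
    """Map the intermediate form onto an Anthropic-style request.

    Two representative provider differences: Anthropic takes the system
    prompt as a top-level field rather than a message, and requires
    max_tokens to be set explicitly.
    """
    out = {
        "model": ir["model"],
        "messages": [
            {"role": t["role"], "content": t["text"]} for t in ir["turns"]
        ],
        "max_tokens": ir["max_tokens"] or 1024,  # assumed fallback default
    }
    if ir["system"] is not None:
        out["system"] = ir["system"]
    return out

openai_req = {
    "model": "some-model",
    "messages": [
        {"role": "system", "content": "Be terse."},
        {"role": "user", "content": "Hello"},
    ],
    "max_tokens": 256,
}
anthropic_req = ir_to_anthropic(openai_to_ir(openai_req))
```

Adding a fourth provider under this design means writing one `x_to_ir` and one `ir_to_x` pair, rather than six new pairwise adapters.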

HackerNews AI Launches, about 6 hours ago