
Show HN: Open-source Perplexity clone with a one-file back end and streaming answers


I built an open-source research agent. You ask a question, it searches the web via Tavily, synthesizes an answer with an LLM, and shows the sources it used. Answers stream in real time.

The interesting part is the back end. It's a single JS file (~100 lines) that handles web search, LLM streaming, and per-user conversation history. No vector database, no Redis, no separate storage service.

It runs inside a cell: an isolated environment with a built-in database, search index, and filesystem. The cell handles persistence and streaming natively, so the agent code only has to deal with the actual logic.

Tech: Next.js frontend, Tavily for search, OpenRouter for LLM (Gemini 2.5 Flash by default).

Demo: https://youtu.be/jvTVA7J925Y
Comments URL: https://news.ycombinator.com/item?id=47797393
Points: 6
Comments: 0
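The flow described above (search results in, per-user history kept, a prompt assembled for the LLM) can be sketched in plain JavaScript. This is a minimal illustration under stated assumptions, not the project's actual code: the in-memory `Map` store, the `remember`/`buildMessages` names, and the citation-prompt format are all hypothetical, and the real Tavily and OpenRouter calls are left out.

```javascript
// Hypothetical sketch of the one-file back end's core logic:
// keep per-user conversation history and build the LLM message
// array from prior turns plus numbered search snippets.

const histories = new Map(); // userId -> [{ role, content }, ...]

// Append a turn to a user's history, creating it on first use.
function remember(userId, role, content) {
  if (!histories.has(userId)) histories.set(userId, []);
  histories.get(userId).push({ role, content });
  return histories.get(userId);
}

// Build the messages array for the LLM: a system instruction,
// the user's prior turns, then the new question with numbered
// search snippets prepended so the model can cite sources.
function buildMessages(userId, question, searchResults) {
  const context = searchResults
    .map((r, i) => `[${i + 1}] ${r.title}: ${r.snippet}`)
    .join("\n");
  const history = histories.get(userId) || [];
  return [
    {
      role: "system",
      content: "Answer using the numbered sources and cite them like [1].",
    },
    ...history,
    {
      role: "user",
      content: `Sources:\n${context}\n\nQuestion: ${question}`,
    },
  ];
}
```

In a setup like this, the messages array would be sent to a streaming chat-completions endpoint and the chunks forwarded to the client as they arrive; the point of the design is that everything else (storage, search, streaming transport) is provided by the environment rather than the agent code.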

HackerNews AI Launches · about 8 hours ago