Hasan Toor @hasantoxr
Claude Code users are going to lose their minds over this.
A dev just open-sourced the fastest production-ready multi-agent framework on GitHub. It beats LangGraph by 1,209x in agent instantiation speed and runs on 100+ models with a single pip install.
It's called PraisonAI.
Here's what's inside:
→ 3.77 microseconds average agent startup time, making it the fastest AI agent framework benchmarked against OpenAI Agents SDK, Agno, PydanticAI, and LangGraph
→ Single agent, multi-agent, parallel execution, routing, loops, and evaluator-optimizer patterns all built in with clean Python code
→ Deep Research Agent that connects to OpenAI and Gemini deep research APIs, streams results in real time, and returns structured citations automatically
→ Persistent memory across sessions with zero extra dependencies: short-term, long-term, entity, and episodic memory all working out of the box with a single parameter
→ MCP Protocol support across stdio, WebSocket, SSE, and Streamable HTTP so your agents can talk to any external tool or expose themselves as MCP servers for Claude, Cursor, or any other client
→ 24/7 scheduler so agents can run on their own without you manually triggering anything
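To make the workflow patterns above concrete, here is a minimal plain-Python sketch of two of them, routing and evaluator-optimizer. The `Agent` class, routing rule, and evaluator below are hypothetical illustrations of the patterns, not PraisonAI's actual API.

```python
# Conceptual sketch of the routing and evaluator-optimizer patterns.
# The Agent class and the routing rule are hypothetical, for illustration only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    name: str
    run: Callable[[str], str]  # maps a task string to a result string

def route(task: str, agents: dict[str, Agent]) -> str:
    """Routing pattern: dispatch the task to a specialist agent."""
    # Toy rule: tasks containing digits go to the "math" specialist.
    key = "math" if any(c.isdigit() for c in task) else "text"
    return agents[key].run(task)

def evaluate_optimize(task: str, worker: Agent,
                      accept: Callable[[str], bool],
                      max_rounds: int = 3) -> str:
    """Evaluator-optimizer pattern: loop until the evaluator accepts."""
    result = worker.run(task)
    for _ in range(max_rounds - 1):
        if accept(result):
            break
        result = worker.run(task + " (revise)")
    return result

agents = {
    "math": Agent("math", lambda t: f"math:{t}"),
    "text": Agent("text", lambda t: f"text:{t}"),
}
print(route("compute 2+2", agents))    # → math:compute 2+2
print(route("summarise this", agents)) # → text:summarise this
```

In the real framework these agents would wrap LLM calls; the point here is only the control flow each pattern adds around them.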
It supports every major provider in one framework: OpenAI, Anthropic, Gemini, Groq, DeepSeek, Mistral, Ollama, xAI, Perplexity, AWS Bedrock, Azure, and 90 more. You switch models by changing one line. The framework handles everything else.
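One common way frameworks achieve one-line model switching is a provider-prefixed model string that a dispatcher parses before picking a backend. The sketch below illustrates that pattern; the function name and the default-provider fallback are assumptions, not PraisonAI's internal code.

```python
# Sketch of provider-prefixed model routing, e.g. "anthropic/claude-3-5-sonnet".
# parse_model and the "openai" default are illustrative assumptions.
def parse_model(model: str) -> tuple[str, str]:
    """Split 'provider/model-name' into (provider, model-name)."""
    provider, _, name = model.partition("/")
    # No prefix: assume a default provider for bare model names.
    return (provider, name) if name else ("openai", provider)

print(parse_model("anthropic/claude-3-5-sonnet"))  # → ('anthropic', 'claude-3-5-sonnet')
print(parse_model("gpt-4o"))                       # → ('openai', 'gpt-4o')
```

With a scheme like this, swapping providers really is a one-string change, since every backend sits behind the same dispatch step.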
And if you want zero code at all, the CLI does everything the Python SDK does: auto mode, interactive terminal, deep research, workflow execution, memory management, tool discovery, and session handling, all from your terminal.
5.6K GitHub stars. 100% Open Source.
Link in comments.
#ai #api #devtools #infra #linux #open-source #python #startup