Abso

Abso is an open-source LLM proxy that automatically routes requests between fast and slow models based on prompt complexity. It uses a set of heuristics to choose the appropriate model, and the proxy itself adds very little latency.
Installation and setup
pip install langchain-abso
Chat Model
See usage details here.
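A minimal sketch of what usage might look like, assuming the langchain-abso package exposes a ChatAbso chat model that takes a fast and a slow model name (the model identifiers below are illustrative placeholders, and an OpenAI API key is assumed to be set in the environment):

```python
# Hypothetical sketch: route requests through Abso's complexity-based proxy.
from langchain_abso import ChatAbso

# Abso picks between the two models per request based on prompt complexity.
model = ChatAbso(fast_model="gpt-4o", slow_model="o3-mini")

# A simple prompt like this would typically be served by the fast model.
response = model.invoke("What is the capital of France?")
print(response.content)
```

Because ChatAbso is a standard LangChain chat model, it can be dropped into chains and agents wherever a chat model is expected.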