Alpha Notice: These docs cover the v1-alpha release. Content is incomplete and subject to change. For the latest stable version, see the v0 LangChain Python or LangChain JavaScript docs.
Super quick start
Let’s begin with the absolute basics: creating a simple agent that can answer questions and use tools. It combines four pieces, sketched in the code below:
- A language model (Claude 3.7 Sonnet)
- A simple tool (weather function)
- A basic prompt
- The ability to invoke it with messages
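Here is a minimal sketch of that setup. It assumes the v1-alpha create_agent helper and an Anthropic API key in your environment; the tool body and prompt text are placeholders:

```python
from langchain.agents import create_agent  # assumed v1-alpha import path


def get_weather(city: str) -> str:
    """Get the weather for a given city."""
    # Placeholder: a real tool would call a weather API.
    return f"It's always sunny in {city}!"


agent = create_agent(
    model="anthropic:claude-3-7-sonnet-latest",  # Claude 3.7 Sonnet
    tools=[get_weather],
    system_prompt="You are a helpful assistant.",  # a basic prompt
)

# Invoke the agent with a list of messages
agent.invoke({"messages": [{"role": "user", "content": "What's the weather in SF?"}]})
```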
Building a real-world agent
Now let’s create something more practical. We’ll build a weather forecasting agent that demonstrates the key concepts you’ll use in production:
- Detailed system prompts for better agent behavior
- Real-world tools that integrate with external data
- Model configuration for consistent responses
- Structured output for predictable results
- Conversational memory for chat-like interactions
1
Define the system prompt
The system prompt is your agent’s personality and instructions. Make it
specific and actionable:
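For example, the forecasting agent’s instructions might look like this (the wording and tool names are illustrative):

```python
SYSTEM_PROMPT = """You are an expert weather forecaster who speaks in puns.

You have access to two tools:
- get_weather_for_location: use this to get the weather for a specific location
- get_user_location: use this to find out where the user is when they ask about
  the weather "where I am"

If the user asks about the weather without naming a place, look up their
location first, then fetch the forecast for it."""
```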
2
Create tools
Tools are functions your agent can call. They should be well documented. Tools often need to connect to external systems, and they rely on runtime configuration to do so. Notice how the get_user_location tool below does exactly that:
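A sketch of both tools follows. Here get_user_location reads a user_id from the injected RunnableConfig; the user lookup and the canned weather string are placeholders for real integrations:

```python
from langchain_core.runnables import RunnableConfig
from langchain_core.tools import tool


@tool
def get_weather_for_location(city: str) -> str:
    """Get the current weather for a given city."""
    # Placeholder: call a real weather API here.
    return f"It's always sunny in {city}!"


@tool
def get_user_location(config: RunnableConfig) -> str:
    """Retrieve the user's location from runtime configuration."""
    # RunnableConfig-typed parameters are injected at call time and hidden
    # from the model; "user_id" is an assumed key set by the caller.
    user_id = config["configurable"].get("user_id")
    return "Florida" if user_id == "1" else "SF"
```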
3
Configure your model
Set up your language model with the right parameters for your use case:
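For instance, with init_chat_model (the model name and temperature here are just one reasonable choice):

```python
from langchain.chat_models import init_chat_model

model = init_chat_model(
    "anthropic:claude-3-7-sonnet-latest",
    temperature=0,  # lower temperature for more consistent responses
)
```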
4
Define response format
Structured outputs ensure your agent returns data in a predictable format. Here, we use a Python dataclass to define the schema:
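A sketch of such a schema; the field names are illustrative, and a Pydantic model or TypedDict would work just as well:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ResponseFormat:
    """Response schema for the agent."""

    # A punny response (always required)
    punny_response: str
    # Any interesting weather details, if available
    weather_conditions: Optional[str] = None
```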
5
Add memory
Enable your agent to remember conversation history:
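Under the hood, conversation history is persisted by a checkpointer. An in-memory saver, assuming langgraph’s InMemorySaver (named MemorySaver in older releases), is enough for local experiments:

```python
from langgraph.checkpoint.memory import InMemorySaver

# Keeps conversation state in process memory; swap in a persistent,
# database-backed checkpointer for production.
checkpointer = InMemorySaver()
```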
6
Bring it all together
Now assemble your agent with all the components:
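A sketch of the final assembly, reusing the pieces defined above and again assuming the v1-alpha create_agent keyword names; the thread_id and user_id values are arbitrary:

```python
from langchain.agents import create_agent  # assumed v1-alpha import path

agent = create_agent(
    model=model,
    tools=[get_weather_for_location, get_user_location],
    system_prompt=SYSTEM_PROMPT,
    response_format=ResponseFormat,
    checkpointer=checkpointer,
)

# thread_id groups turns into one conversation; user_id is read by get_user_location.
config = {"configurable": {"thread_id": "1", "user_id": "1"}}

response = agent.invoke(
    {"messages": [{"role": "user", "content": "What is the weather outside?"}]},
    config=config,
)
# Assuming the structured output is surfaced under "structured_response".
print(response["structured_response"])

# A follow-up on the same thread_id: the agent remembers the earlier exchange.
agent.invoke(
    {"messages": [{"role": "user", "content": "Thanks!"}]},
    config=config,
)
```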
What you’ve built
Congratulations! You now have a sophisticated AI agent that can:
- Understand context and remember conversations
- Use multiple tools intelligently
- Provide structured responses in a consistent format
- Handle user-specific information through context
- Maintain conversation state across interactions