Integrations
LangSmith provides support for LangChain and LangGraph, as well as integrations with a growing set of popular LLM providers and agent frameworks. For setup and usage, refer to the guide pages in the navigation bar.
Native open source frameworks
LangChain
LangGraph
LLM providers
OpenAI
Anthropic
Google Gemini
Amazon Bedrock
DeepSeek
Mistral
Agent frameworks
AutoGen
CrewAI
Google ADK
OpenAI Agents
OpenTelemetry
Semantic Kernel
Vercel AI SDK
Other
Instructor
Claude Code
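Whichever integration you use, enabling tracing typically starts with a small amount of environment configuration before your application runs. The sketch below shows one common way to do this from Python; the variable names follow typical LangSmith setups, and the API key and project name are placeholders, not real values.

```python
import os

# Hedged sketch, assuming the commonly documented LangSmith variable names.
# Set these before importing or running your traced application code.
os.environ["LANGSMITH_TRACING"] = "true"          # turn tracing on
os.environ["LANGSMITH_API_KEY"] = "<your-api-key>"  # placeholder; use your own key
os.environ["LANGSMITH_PROJECT"] = "my-project"    # optional: group traces by project

# With these set, supported SDKs (LangChain, LangGraph, and the listed
# provider/framework integrations) pick up the configuration automatically.
```

In practice you would usually export these variables in your shell or deployment environment rather than setting them in code, so the key never appears in source files.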