LangChain integrations
Welcome! These pages include reference documentation for all langchain-* Python integration packages.
To learn more about integrations in LangChain, visit the Integrations overview.
Model Context Protocol (MCP)
LangChain supports the Model Context Protocol (MCP). This lets external tools work with LangChain and LangGraph applications through a standard interface.
To begin using MCP tools in your project, see the langchain-mcp-adapters documentation.
Why MCP matters
MCP gives LangChain applications a standard way to connect to tools and workflows outside of LangChain, improving both interoperability and reliability.
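As a minimal sketch of the pattern, the snippet below describes one MCP server and loads its tools as LangChain tools via `langchain-mcp-adapters`. The server name `math` and the `./math_server.py` script are placeholders; the import is deferred so the configuration itself needs no extra packages.

```python
from typing import Any

# Connection settings for one or more MCP servers. The "stdio" transport
# launches the server as a local subprocess; "streamable_http" instead
# connects to a running server over HTTP. The script path is a placeholder.
server_config: dict[str, dict[str, Any]] = {
    "math": {
        "command": "python",
        "args": ["./math_server.py"],
        "transport": "stdio",
    },
}


async def load_mcp_tools() -> list:
    # Deferred import: langchain-mcp-adapters is only required when the
    # tools are actually loaded.
    from langchain_mcp_adapters.client import MultiServerMCPClient

    client = MultiServerMCPClient(server_config)
    # Each MCP tool is exposed as a standard LangChain tool, ready to be
    # passed to an agent or bound to a chat model.
    return await client.get_tools()
```

With a server script in place, `asyncio.run(load_mcp_tools())` returns tools you can hand to an agent or to `model.bind_tools(...)`; see the langchain-mcp-adapters documentation for the full set of connection options.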
Popular providers
- langchain-openai: Interact with OpenAI (completions, responses) and OpenAI-compatible APIs.
- langchain-anthropic: Interact with Claude (Anthropic) APIs.
- langchain-google-genai: Access Google Gemini models via the Google Gen AI SDK.
- langchain-google-vertexai: Use Google's Vertex AI model platform.
- langchain-aws: Use integrations for the AWS platform, such as Bedrock, S3, and more.
- langchain-huggingface: Access Hugging Face-hosted models in LangChain.
- langchain-groq: Interface to Groq Cloud.
- langchain-ollama: Use locally hosted models via Ollama.
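Because these packages expose chat model classes with a shared interface, swapping providers is usually a one-line change. The sketch below illustrates the pattern with three of the packages above; the model names are illustrative, and each branch assumes the corresponding package is installed (and, for hosted providers, that API keys are configured).

```python
def make_model(provider: str):
    """Return a chat model for the given provider (illustrative sketch)."""
    # Imports are deferred so only the selected provider's package
    # needs to be installed.
    if provider == "openai":
        from langchain_openai import ChatOpenAI

        return ChatOpenAI(model="gpt-4o-mini")
    if provider == "anthropic":
        from langchain_anthropic import ChatAnthropic

        return ChatAnthropic(model="claude-3-5-sonnet-latest")
    if provider == "ollama":
        from langchain_ollama import ChatOllama

        # Ollama runs locally, so no API key is needed.
        return ChatOllama(model="llama3")
    raise ValueError(f"unknown provider: {provider}")
```

Whichever branch runs, the returned model supports the same core methods (for example `invoke`, `stream`, and `bind_tools`), which is what makes providers interchangeable in LangChain code.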
Other providers, including langchain-community, are listed in the section navigation (left sidebar).
"I don't see the integration I'm looking for"
LangChain has hundreds of integrations, but not all are documented on this site. If you don't see the integration you're looking for, refer to its provider page in the LangChain docs. Additionally, many community-maintained integrations are available in the langchain-community package.
Create new integrations
For information on contributing new integrations, see the guide.