Welcome! These pages include reference documentation for all langchain-* Python integration packages.
To learn more about integrations in LangChain, visit the Integrations overview.
LangChain supports the Model Context Protocol (MCP), which lets LangChain and LangGraph applications use externally defined tools through a standard interface.
To use MCP tools in your project, see langchain-mcp-adapters.
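As a minimal sketch of what this typically looks like (assuming the MultiServerMCPClient API from langchain-mcp-adapters and a hypothetical local MCP server script, ./math_server.py):

```python
# Sketch: load tools from an MCP server and expose them as LangChain tools.
# The server path and server name below are placeholders, not real resources.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient


async def main():
    client = MultiServerMCPClient(
        {
            "math": {
                "command": "python",
                "args": ["./math_server.py"],  # hypothetical local MCP server
                "transport": "stdio",
            }
        }
    )
    tools = await client.get_tools()  # MCP tools converted to LangChain tools
    print([t.name for t in tools])


asyncio.run(main())
```

The resulting tools can then be passed to an agent or bound to a chat model like any other LangChain tools.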
langchain-openai: Interact with OpenAI (completions, responses) and OpenAI-compatible APIs.
langchain-anthropic: Interact with Claude (Anthropic) APIs.
langchain-google-genai: Access Google Gemini models via the Google Gen AI SDK.
langchain-aws: Use integrations for the AWS platform, such as Amazon Bedrock, S3, and more.
langchain-huggingface: Access Hugging Face-hosted models in LangChain.
langchain-groq: Interface to Groq Cloud.
langchain-ollama: Use locally hosted models via Ollama.
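These packages expose chat model classes that share a common interface, so switching providers is largely a matter of changing the import and model name. A minimal sketch, assuming the relevant API key (OPENAI_API_KEY or ANTHROPIC_API_KEY) is set and the model names shown are available to your account:

```python
# Sketch: provider packages expose chat models with a shared interface.
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic

llm = ChatOpenAI(model="gpt-4o-mini")  # requires OPENAI_API_KEY
# llm = ChatAnthropic(model="claude-3-5-sonnet-latest")  # requires ANTHROPIC_API_KEY

response = llm.invoke("Summarize what LangChain integrations are in one sentence.")
print(response.content)
```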
Other provider packages, including langchain-community, are listed in the section navigation (left sidebar).
LangChain has hundreds of integrations, but not all are documented on this site. If you don't see the integration you're looking for, refer to its provider page in the LangChain docs. In addition, many community-maintained integrations are available in the langchain-community package.
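As an illustration, community integrations are imported from the langchain_community namespace like any other package; this sketch uses one of the community document loaders and assumes its optional beautifulsoup4 dependency is installed:

```python
# Sketch: community-maintained integrations live in the langchain_community package.
# WebBaseLoader fetches a web page and returns it as LangChain Document objects.
from langchain_community.document_loaders import WebBaseLoader

docs = WebBaseLoader("https://example.com").load()
print(docs[0].metadata)
```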
For information on contributing new integrations, see the guide.