## Initialize a model

Use model provider classes to initialize models:

- OpenAI
- Anthropic
- Google
- Groq
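For example, a minimal sketch using the official integration packages (each must be installed separately; the specific model identifiers below are illustrative and may change over time):

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { ChatAnthropic } from "@langchain/anthropic";
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";
import { ChatGroq } from "@langchain/groq";

// Each provider class follows the same configuration pattern.
const openai = new ChatOpenAI({ model: "gpt-4o", temperature: 0 });
const anthropic = new ChatAnthropic({ model: "claude-3-5-sonnet-latest" });
const google = new ChatGoogleGenerativeAI({ model: "gemini-1.5-pro" });
const groq = new ChatGroq({ model: "llama-3.1-8b-instant" });
```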
### Tool calling support

If you are building an agent or workflow that requires the model to call external tools, ensure that the underlying language model supports tool calling. Compatible models can be found in the LangChain integrations directory.
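As a quick sketch of what this looks like in practice, tools can be bound to a compatible model with `bindTools`; the `get_weather` tool below is made up for illustration:

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// A made-up tool used only for illustration.
const getWeather = tool(
  async ({ city }) => `It is always sunny in ${city}.`,
  {
    name: "get_weather",
    description: "Return the current weather for a city.",
    schema: z.object({ city: z.string() }),
  }
);

// Binding tools only works if the underlying model supports tool calling.
const modelWithTools = new ChatOpenAI({ model: "gpt-4o" }).bindTools([
  getWeather,
]);
```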
## Use in an agent

When using `createReactAgent`, you can pass the model instance directly:
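A minimal sketch, reusing the illustrative `get_weather` tool from above (the message shape in the final call is also illustrative):

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Same made-up tool as in the earlier sketch.
const getWeather = tool(async ({ city }) => `It is always sunny in ${city}.`, {
  name: "get_weather",
  description: "Return the current weather for a city.",
  schema: z.object({ city: z.string() }),
});

const model = new ChatOpenAI({ model: "gpt-4o" });

// The model instance is passed directly as `llm`.
const agent = createReactAgent({ llm: model, tools: [getWeather] });

const result = await agent.invoke({
  messages: [{ role: "user", content: "What is the weather in Berlin?" }],
});
```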
## Advanced model configuration

### Disable streaming

To disable streaming of the individual LLM tokens, set `streaming: false` when initializing the model:
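For example, a minimal sketch using the OpenAI integration (the model identifier is illustrative):

```typescript
import { ChatOpenAI } from "@langchain/openai";

// With streaming disabled, the model produces a single complete
// response rather than emitting token-by-token chunks.
const model = new ChatOpenAI({
  model: "gpt-4o",
  streaming: false,
});
```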
### Add model fallbacks

You can add a fallback to a different model or a different LLM provider using `model.withFallbacks([...])`:
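A minimal sketch, assuming OpenAI as the primary provider and Anthropic as the fallback (model identifiers are illustrative):

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { ChatAnthropic } from "@langchain/anthropic";

const primary = new ChatOpenAI({ model: "gpt-4o" });
const fallback = new ChatAnthropic({ model: "claude-3-5-sonnet-latest" });

// If the primary model throws an error, the fallbacks are tried in order.
const modelWithFallbacks = primary.withFallbacks([fallback]);
```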
## Bring your own model

If your desired LLM isn’t officially supported by LangChain, consider these options:

- Implement a custom LangChain chat model: Create a model conforming to the LangChain chat model interface. This enables full compatibility with LangGraph’s agents and workflows but requires understanding of the LangChain framework (a rough sketch follows this list).
- Direct invocation with custom streaming: Use your model directly by adding custom streaming logic with `StreamWriter`. Refer to the custom streaming documentation for guidance. This approach suits custom workflows where prebuilt agent integration is not necessary.
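As a rough sketch of the first option, you can subclass `SimpleChatModel` from `@langchain/core`; `callMyProvider` below is a hypothetical stand-in for your provider's own SDK or HTTP API:

```typescript
import { SimpleChatModel } from "@langchain/core/language_models/chat_models";
import type { BaseMessage } from "@langchain/core/messages";

// Hypothetical stand-in for your provider's SDK.
async function callMyProvider(prompt: string): Promise<string> {
  return `echo: ${prompt}`;
}

class MyChatModel extends SimpleChatModel {
  _llmType(): string {
    return "my-chat-model";
  }

  // SimpleChatModel only requires mapping the messages to a string reply;
  // richer features (tool calling, streaming) require extending
  // BaseChatModel instead.
  async _call(messages: BaseMessage[]): Promise<string> {
    const prompt = messages
      .map((m) => `${m._getType()}: ${m.content}`)
      .join("\n");
    return callMyProvider(prompt);
  }
}

const model = new MyChatModel({});
```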