For detailed documentation of all ChatMistralAI features and configurations, head to the API reference. The ChatMistralAI class is built on top of the Mistral API. For a list of all the models supported by Mistral, check out this page.
## Overview

### Integration details
| Class | Package | Local | Serializable | JS support |
|---|---|---|---|---|
| ChatMistralAI | langchain-mistralai | ❌ | beta | ✅ |
### Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs | 
|---|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ | 
## Setup
To access ChatMistralAI models you'll need to create a Mistral account, get an API key, and install the langchain-mistralai integration package.
### Credentials
A valid API key is needed to communicate with the API. Once you've obtained one, set the MISTRAL_API_KEY environment variable:
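One common way to do this is to prompt for the key at runtime and export it into the process environment — a minimal sketch (the prompt text is illustrative):

```python
import os
from getpass import getpass

# Prompt for the key only if it isn't already set in the environment.
if "MISTRAL_API_KEY" not in os.environ:
    os.environ["MISTRAL_API_KEY"] = getpass("Enter your Mistral API key: ")
```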
### Installation

The LangChain Mistral integration lives in the langchain-mistralai package:
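For example, with pip (a standard way to install LangChain integration packages):

```shell
pip install -qU langchain-mistralai
```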
## Instantiation
Now we can instantiate our model object and generate chat completions:

## Invocation
## API reference
Head to the API reference for detailed documentation of all ChatMistralAI attributes and methods.