This page will help you get started with AI/ML API chat models. For detailed documentation of all ChatAimlapi features and configurations, head to the API reference.
AI/ML API provides unified access to hundreds of hosted foundation models with high availability and throughput.
Overview
Integration details
| Class | Package | Local | Serializable | JS support |
|---|---|---|---|---|
| ChatAimlapi | langchain-aimlapi | ❌ | beta | ❌ |
Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs | 
|---|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | 
Setup
To access AI/ML API models you'll need to create an account, get an API key, and install the `langchain-aimlapi` integration package.
Credentials
Head to aimlapi.com to sign up and generate an API key. Once you've done this, set the `AIMLAPI_API_KEY` environment variable:
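For example, in a shell session (replace the placeholder with your actual key):

```shell
export AIMLAPI_API_KEY="your-api-key"
```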
Installation
The LangChain AI/ML API integration lives in the `langchain-aimlapi` package:
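Install it with pip:

```shell
pip install -U langchain-aimlapi
```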
Instantiation
Now we can instantiate our model object and generate chat completions:
Invocation
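A minimal sketch combining instantiation and a first call. It assumes the `langchain-aimlapi` package is installed and `AIMLAPI_API_KEY` is set; the model id below is only an example, and any model hosted on AI/ML API can be substituted.

```python
from langchain_aimlapi import ChatAimlapi

# Example model id — substitute any model available through AI/ML API.
llm = ChatAimlapi(
    model="meta-llama/Llama-3-70b-chat-hf",
    temperature=0,
)

messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)
```

`invoke` accepts the same message formats as other LangChain chat models, including `(role, content)` tuples as shown here.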
Streaming invocation
You can also stream responses token-by-token:
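A streaming sketch under the same assumptions as above (package installed, `AIMLAPI_API_KEY` set, example model id):

```python
from langchain_aimlapi import ChatAimlapi

llm = ChatAimlapi(model="meta-llama/Llama-3-70b-chat-hf")  # example model id

# Each chunk is a partial message; printing without a newline
# reproduces the response as it arrives.
for chunk in llm.stream("Write a haiku about the sea."):
    print(chunk.content, end="", flush=True)
```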
API reference
For detailed documentation of all ChatAimlapi features and configurations, head to the API reference.