Llama.Chat
Sends a chat message to the model.
| Component | Version | macOS | Windows | Linux | Server | iOS SDK |
|---|---|---|---|---|---|---|
| Llama | 16.1 | ✅ Yes | ✅ Yes | ✅ Yes | ✅ Yes | ✅ Yes |
MBS( "Llama.Chat"; LlamaSession; Prompt { ; maxToken; TemplateName } )
Parameters
| Parameter | Description | Example | Flags |
|---|---|---|---|
| LlamaSession | The session identifier. | $$session | |
| Prompt | The prompt text for the chat. | "Hello" | |
| maxToken | The maximum number of tokens allowed before the function cancels. Default is 999. | 1234 | Optional |
| TemplateName | The chat template name. Default is "" to pick the default template. | "chatml" | Optional |
Result
Returns text or error.
Description
Sends a chat message to the model and returns the response.
The Llama.Chat function applies the chat template to structure the chat and logs the responses.
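A minimal FileMaker script sketch of calling this function. It assumes $$session already holds a session identifier from an earlier Llama setup call (not covered in this section); the prompt text and the maxToken value of 500 are arbitrary choices for illustration:

```
# Send a prompt using the default chat template and a 500 token limit
Set Variable [ $result ; Value: MBS( "Llama.Chat" ; $$session ; "Why is the sky blue?" ; 500 ) ]
# MBS functions return error text on failure; check with IsError
If [ MBS( "IsError" ) ]
    Show Custom Dialog [ "Llama error" ; $result ]
Else
    Set Field [ Chat::Response ; $result ]
End If
```

Passing a template name such as "chatml" as the fourth parameter overrides the default template; omitting it (or passing "") lets the plugin pick the model's default.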
Release notes
- Version 16.2
- Fixed an issue where Llama.Chat raised an exception when the chat template couldn't be applied.
This function checks for a license.
Created 10th February 2026, last changed 13th February 2026