Model Settings
Configure model parameters using standard settings.
Model settings allow you to configure how the model behaves and what capabilities it has access to. These settings are defined in the `metadata.model.settings` section of your prompt's frontmatter.
You can control basic parameters like temperature and max tokens, enable streaming, configure tools for agent capabilities, and define schemas for structured output. All settings are optional, allowing you to use only what you need for your specific use case.
Example Configuration
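A minimal sketch of a prompt's frontmatter using a few of the settings documented below. The model name and the surrounding frontmatter keys other than `metadata.model.settings` are illustrative assumptions, not prescribed values:

```yaml
metadata:
  model:
    name: gpt-4o          # hypothetical model identifier, for illustration only
    settings:
      stream: true        # stream tokens as they are generated
      temperature: 0.7    # moderate randomness
      max_tokens: 1024    # cap the response length
      stop_sequences:
        - "###"           # stop generating when this string appears
      seed: 42            # make sampling reproducible
```

All of these keys are optional; include only the ones your use case needs.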
Available Settings
| Property | Type | Description | Required |
|---|---|---|---|
| `stream` | boolean | Whether to stream the response. | Optional |
| `max_tokens` | number | Maximum number of tokens to generate. | Optional |
| `temperature` | number | Controls the randomness of the output; higher values produce more random outputs. | Optional |
| `top_p` | number | Controls the cumulative probability for nucleus sampling. | Optional |
| `top_k` | number | Limits the next-token selection to the top k tokens. | Optional |
| `presence_penalty` | number | Penalizes new tokens based on their presence in the text so far, encouraging the model to discuss new topics. | Optional |
| `frequency_penalty` | number | Penalizes new tokens based on their frequency in the text so far, reducing the likelihood of repeating the same line verbatim. | Optional |
| `stop_sequences` | string[] | Array of strings; generation stops if any of them is encountered. | Optional |
| `seed` | number | Seed value for random number generation, ensuring reproducibility. | Optional |
| `max_retries` | number | Maximum number of retries for the request in case of failures. | Optional |
| `headers` | Record<string, string> | Additional headers to include in the request. | Optional |
| `max_llm_calls` | number | Maximum number of LLM calls allowed in agent mode. | Optional |
| `tools` | Record<string, { description: string; parameters: JSONSchema; }> | A record of tools available to the model, where each tool includes a description and JSON Schema parameters. | Optional |
| `schema` | JSONSchema | A schema defining the expected structure of the model's output. | Optional |
Use at most one of `tools` or `schema`; they are mutually exclusive, and you may also omit both.
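As a sketch, the two mutually exclusive options might look like the following. The tool name, descriptions, and schema fields are hypothetical; only the `tools`, `schema`, and `max_llm_calls` keys come from the settings table above:

```yaml
# Agent mode: expose a hypothetical weather tool to the model.
settings:
  max_llm_calls: 5
  tools:
    get_weather:
      description: Look up the current weather for a city.
      parameters:
        type: object
        properties:
          city:
            type: string
        required: [city]
```

Or, for structured output, constrain the response shape instead:

```yaml
# Structured output: the model must return JSON matching this schema.
settings:
  schema:
    type: object
    properties:
      summary:
        type: string
      sentiment:
        type: string
        enum: [positive, neutral, negative]
    required: [summary, sentiment]
```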
Have Questions?
We're here to help! Choose the best way to reach us:

- Join our Discord community for quick answers and discussions
- Email us at hello@puzzlet.ai for support
- Schedule an Enterprise Demo to learn about our business solutions