AI LLM endpoint params AWS object.

type
enum<string>
required

The type of the AI LLM endpoint params object for AWS.

Available options:
aws_params
Example:

"aws_params"

temperature
number | null

The sampling temperature to use, between 0 and 1. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic. We generally recommend altering this or top_p, but not both.

Required range: 0 <= x <= 1
Example:

0.5

top_p
number | null

An alternative to sampling with temperature, called nucleus sampling, where the model considers only the tokens comprising the top_p probability mass. For example, 0.1 means only the tokens comprising the top 10% of probability mass are considered. We generally recommend altering this or temperature, but not both.

Required range: 0 <= x <= 1
Example:

0.5
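Putting the fields together, a complete AI LLM endpoint params AWS object might look like the sketch below. This assumes the object is serialized as JSON in a request body; temperature is set to the example value above, and top_p is left null, following the recommendation to adjust only one of the two.

{
  "type": "aws_params",
  "temperature": 0.5,
  "top_p": null
}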