
AI LLM endpoint params Google

This resource is used by endpoints in version 2024.0. For more details, see Box API versioning.

AI LLM endpoint params Google object

type
string
Example: google_params

The type of the AI LLM endpoint params object for Google. This parameter is required.

Value is always google_params

temperature
number
Example: 0. Minimum: 0. Maximum: 2.

The temperature is used for sampling during response generation, which occurs when top-P and top-K are applied. Temperature controls the degree of randomness in the token selection.
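
To make the temperature behavior concrete, here is a small illustrative Python sketch (not Box's or Google's actual implementation) that rescales hypothetical token scores before sampling: lower values sharpen the distribution toward the top token, while higher values flatten it.

import math

# Illustrative sketch only; hypothetical token scores, not a real model.
def softmax_with_temperature(logits, temperature):
    if temperature == 0:
        # A temperature of 0 degenerates to greedy decoding: the top-scoring
        # token receives all of the probability mass.
        probs = [0.0] * len(logits)
        probs[logits.index(max(logits))] = 1.0
        return probs
    scaled = [x / temperature for x in logits]
    peak = max(scaled)
    exps = [math.exp(x - peak) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 1.0, 0.5]                      # hypothetical scores for three tokens
print(softmax_with_temperature(scores, 0.2))  # sharply peaked, near-greedy
print(softmax_with_temperature(scores, 2.0))  # flatter, more random selection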

top_k
number
Example: 1. Minimum: 0.1. Maximum: 2.

Top-K changes how the model selects tokens for output. A top-K of 1 means the next selected token is the most probable among all tokens in the model's vocabulary (also called greedy decoding), while a top-K of 3 means that the next token is selected from among the three most probable tokens by using temperature.
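
As an illustration of that selection rule, the following sketch (hypothetical token probabilities, not a real vocabulary) keeps only the K most probable tokens and renormalizes them before sampling.

# Illustrative sketch only; hypothetical probabilities.
def top_k_filter(token_probs, k):
    # Keep the k most probable tokens and renormalize so they sum to 1.
    kept = sorted(token_probs.items(), key=lambda kv: kv[1], reverse=True)[:k]
    total = sum(p for _, p in kept)
    return {token: p / total for token, p in kept}

probs = {"cat": 0.5, "dog": 0.3, "fish": 0.15, "bird": 0.05}
print(top_k_filter(probs, 1))  # greedy decoding: only "cat" remains
print(top_k_filter(probs, 3))  # next token is sampled from "cat", "dog", "fish"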

top_p
number
Example: 1. Minimum: 0.1. Maximum: 2.

Top-P changes how the model selects tokens for output. Tokens are selected from the most (see top-K) to least probable until the sum of their probabilities equals the top-P value.
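
The sketch below illustrates that cutoff with the same hypothetical probabilities: tokens are accumulated in descending order of probability until the running total reaches the top-P value, and sampling then happens within that set.

# Illustrative sketch only; hypothetical probabilities.
def top_p_filter(token_probs, p):
    ranked = sorted(token_probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, cumulative = [], 0.0
    for token, prob in ranked:
        kept.append((token, prob))
        cumulative += prob
        if cumulative >= p:
            break
    total = sum(prob for _, prob in kept)
    return {token: prob / total for token, prob in kept}

probs = {"cat": 0.5, "dog": 0.3, "fish": 0.15, "bird": 0.05}
print(top_p_filter(probs, 0.8))  # keeps "cat" and "dog" (0.5 + 0.3 reaches 0.8)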

Response Example

{
  "type": "google_params",
  "temperature": 0,
  "top_k": 1,
  "top_p": 1
}
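
For context, this object is typically supplied as an LLM endpoint override inside an AI agent configuration. The Python sketch below sends such an override with a Box AI ask request; the /2.0/ai/ask path, the ai_agent_ask field layout, and the placeholder token and file ID are assumptions for illustration, so check the current Box AI reference before relying on them.

import requests

ACCESS_TOKEN = "YOUR_DEVELOPER_TOKEN"  # placeholder credential

# Assumed request shape: google_params passed as an llm_endpoint_params
# override inside an ai_agent_ask configuration.
payload = {
    "mode": "single_item_qa",
    "prompt": "Summarize this document.",
    "items": [{"id": "1234567890", "type": "file"}],  # hypothetical file ID
    "ai_agent": {
        "type": "ai_agent_ask",
        "basic_text": {
            "llm_endpoint_params": {
                "type": "google_params",
                "temperature": 0,
                "top_k": 1,
                "top_p": 1,
            }
        },
    },
}

response = requests.post(
    "https://api.box.com/2.0/ai/ask",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
)
print(response.json())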